Why We Are Where We Are: A Short History of Federal IT

Tuesday, February 8, 2011
The Federal Government has a split personality when it comes to IT. In some areas it is a bleeding-edge leader in the development and use of IT. In many others, it lags behind the private sector in broadly implementing commercial IT best practices. This post provides a short history of Federal IT and insight into why the Federal Government has the IT infrastructure it does today.

The Federal Government is the largest data processing and IT organization in the world. The sheer number of computer centers, unique applications and support staff dwarfs even the largest private sector IT organizations. With this scale comes a significant opportunity to enhance service performance and lower the operating costs associated with physical assets such as computer equipment, data center space, staffing and overall investment.

To address the needed improvements and efficiencies, it is important to understand the government's information technology and organizational environment, and the history of how it evolved. The first step in understanding the history of Federal IT, and how IT consolidation cost savings can be realized, is to look at how the government itself is organized and at the structural impediments to consolidated IT operations. The number of departments, agencies, quasi-independent agencies and regulatory bodies, each with its own specific mission and core business functions, makes governance more difficult. Separate contracting, procurement, budgeting and PMO offices make selecting vendors and implementing projects more challenging. Last, but not least, the difficulty of creating a central leadership team with the authority and power to drive overall and agency-specific consolidations and re-engineering means that transforming and consolidating Federal IT will be a significant long-term endeavor. But the potential payoff, in reduced costs and improved performance, is enormous.

The current state of Federal IT, with thousands of servers, hundreds of independent systems and applications across dozens of agencies, is NOT the result of poor planning. The Federal Government was one of the very first major users of computer systems. It spent significant dollars to acquire the very best in data processing equipment and geographically co-located that equipment with its end use. This meant placing equipment around the country and around the world, along with the necessary support staff. Commercial industry followed the same path. Co-location minimized risk and increased the utility of the IT infrastructure by decreasing the distance from computing power to end user, an important consideration in the era before the internet and high-speed communications.

Along with hardware and software, technical support staff were placed at each center, so that each agency operated as a quasi-independent company. This was the application of the best practices of the time.

And once the basic structure was set, organizational inertia took hold. The result is the buildup of systems, networks, processes and IT support organizations that we see today. As we assess the Federal Government's IT business model, we find duplicated systems, duplicated and inconsistent data, longer and more complicated implementation and system change times, and operating costs at unacceptable levels. Things are changing, but this business model remains in place at most DOD and Civilian agencies. In short, the Federal Government's IT business model needs to be updated to meet today's challenges and opportunities.

Conceptually, these challenges parallel those that commercial industry, including IBM, faced from the 1980s through the 2000s. But because the quantity of systems is so large, and the size and nature of government business differ so much across agencies such as DOD, DOJ, SSA, Treasury and HHS, the magnitude of the government's challenge is larger and more complicated than any commercial endeavor. The Federal Government has been trying to follow commercial best practices and reap the attendant benefits. It didn't get into its current situation overnight, and it won't revamp its IT infrastructure overnight. The size, complexity and differences in business operations and leadership across the many agencies, along with the lack of a single IT lead, make the challenges that much greater. But there are a number of steps that can be taken, at all levels, within and across organizations.

Major projects have been initiated and successes realized at major agencies such as the IRS. But there remains the potential for saving hundreds of billions of dollars, improving the performance of Federal IT, and easing the business processes that IT supports. The challenge is how to accelerate the pace and scale up the efforts to realize the $150B–$200B in savings achievable by consolidating and upgrading the federal IT infrastructure. Keep your sleeves rolled up: there is a lot of work to be done!

**********************************************************************************************

Mr. Danto has more than thirty years' experience in identifying underlying information technology and business operational problems, designing and delivering workable solutions, and optimizing operational performance.

Mr. Danto is a recognized speaker on the effective use of information technology to achieve business process reengineering goals and migration to shared services. He has been a guest speaker at re-engineering conferences including the National Reinventing Government Conference, the National DoD Logistics Conference, and GSA's IRMCO and IPIC Conferences. He has also been referenced in three national publications.