Monday, March 28th, 2011 - 13:16
By virtualizing their development and test environments, agencies can take a first step into cloud computing, one that offers low risk and significant rewards.
One of the first steps in every IT development program is to assemble the development and test, or “dev-test,” environment: the computer equipment the development team will use while designing and building the system. Hardware is ordered, or backordered as the case may be. It is then loaded into a rack; power is connected, hard drives are formatted, and network cables and routers are added. Next the operating system is loaded and configured, user accounts are defined, and security controls are set. Then the application software is loaded and configured, and perhaps more security controls are set. After as long as six months, the development team can finally get to work.
This same lumbering process is repeated every time a new development project kicks off. Worse yet, because each project buys and configures its own equipment, every dev-test environment is different. The dev-test hardware is usually not located in a data center but sitting on a table in the office space used by the development team, so the team is saddled with maintaining its own system. Because the developers are focused on development, they are probably specialists in accounting software, databases, or websites, not operating systems. So nobody is tracking which security patches have been applied or whether the operating system is up to date.
Virtualization technology offers an alternative to this traditional way of constructing a dev-test environment. Virtualization can be thought of as a computer simulator: a program that emulates hardware. In a virtualized computer, programs do not run directly on hardware; they run on top of another program that is simulating hardware. With virtualization, we don’t wait months for hardware to arrive; we simply tell the program to simulate another computer, so new “virtual machines” can be added in minutes, not months. And we can configure a virtual machine once, store it, and then replicate dev-test environments from that copy as needed.
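To make this concrete, here is a minimal sketch of what “telling the program to simulate another computer” looks like in practice, using a tool such as Vagrant as one example. The box name, memory size, and provisioning command are illustrative assumptions, not a prescribed configuration; the point is that the whole machine is described in a few lines of text rather than a hardware purchase order.

```ruby
# Hypothetical Vagrantfile: the entire dev-test machine is a short,
# version-controllable description instead of physical equipment.
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"    # assumed base image name
  config.vm.provider "virtualbox" do |vb|
    vb.memory = 2048                   # resources are settings, not purchases
  end
  # Provisioning recreates the same software setup every time the
  # machine is built, so every copy starts identical.
  config.vm.provision "shell", inline: "apt-get update -y"
end
```

Because the description is just a file, a team can store it, share it, and bring up an identical machine with a single command instead of a six-month procurement.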
The dev-test environment typically places a very light load on a computer, so one real machine can support many virtual machines, allowing the cost to be shared among several development programs. This obviously represents cost savings at the outset of a program. Less obvious are the continuing savings to the development team. Since a virtualized dev-test environment can be centrally managed, developers can focus on their development task; they are not burdened with maintaining the operating system or underlying software. And because it is centrally managed, we can be sure that the dev-test environment has not drifted from the stored “gold master” version. Some industry studies have attributed as much as 30% of software defects to incorrectly configured test systems that are “out of sync” with production environments.
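The gold-master idea above can be sketched with nothing more than ordinary file copies. This is deliberately simplified, the directory names are made up, and a real deployment would clone a stored virtual machine image with a hypervisor’s own tools rather than `cp`; the sketch only shows the pattern: configure once, then stamp out identical copies.

```shell
# Illustrative sketch of the gold-master pattern (toy example: a real
# setup would clone a VM image, not copy a directory).
mkdir -p gold-master
echo "os=configured; patches=current" > gold-master/config

# Each new project receives an identical copy of the master configuration.
cp -r gold-master project-a-env
cp -r gold-master project-b-env

# Any copy can be checked against the master to detect drift.
diff gold-master/config project-a-env/config && echo "project-a matches gold master"
```

The check at the end is what central management buys you: every environment can be verified against one known-good source instead of being trusted on faith.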
Obviously the virtualized dev-test environment allows much faster system development because it avoids the six-month delay for hardware and configuration. But it also speeds system development in other, less obvious ways. For example, staff can easily move from one project to another with no learning curve, so projects can staff up quickly, or staff can be redeployed to assist a project in trouble. And development teams can reuse components developed for other projects; an agency might, for instance, develop a standard dashboard application that is used across all its systems.
Systems developed in a common, virtualized dev-test environment will also be cheaper to operate and maintain. Since they were developed in a shared cloud environment, they can share the production environment too. And just as development staff can move between projects, operations staff can support multiple systems in production. This allows each staff member to maximize their value by applying their particular skill across many programs, and it allows the CIO to leverage skills that are in short supply. For example, a single cybersecurity expert can more easily maintain several systems if the underlying operating systems and software are identical.
Finally, the dev-test environment is a low-risk candidate for an agency’s first step into cloud computing. By definition it does not support any operational mission systems, so the impact is minimal if there is a problem. And deployment can be planned into future activities while ongoing projects proceed on their current path.
Jeff Koch is an Associate Partner in IBM’s Strategy and Innovation consulting practice providing thought leadership on government management and the intersection of technology and mission. Prior to joining IBM, Jeff was at the Office of Electronic Government and Information Technology at the Office of Management and Budget (OMB) where he managed 12 government-wide E-Gov programs, and played a key role in the Federal budget and legislative process.
Jeff also worked for the Chief Information Officer at the U.S. Department of Labor where he managed the Benefits.gov program and provided leadership on IT budget and information policy. Jeff also worked as the Chief of Staff for a Member of Congress, and as an engineer for a large defense contractor where he designed radios for military communications and surveillance systems.