I recently discussed the significant benefits and low risk for agencies adopting a shared virtualized development and test (dev-test) environment. Until now, each project team has been saddled with building and maintaining its own dev-test environment. Lacking any centralized planning, development teams sometimes made arbitrary selections for the operating environment, which resulted in each system operating as an island, with dedicated facilities and staff. By migrating development activities to a shared common development environment, made possible through virtualization technology, these programs can share staff, speed development, and reduce the cost of both development and operations. However, while the benefits of such a move are compelling, this move to the cloud, like any shared service, can present significant hurdles for agencies, both technical and managerial.
Before moving to the cloud, organizations must answer two critical questions. The first, which I'll address in this post, is about location. The second, which I'll address tomorrow, is governance.
The first question many agencies ask is whether their virtualized dev-test environment needs to be in their own data center, or whether they should buy it as a service from a remotely located cloud service provider. Until now, most development teams just used a computer server sitting on a table in their office space. If there was a problem, they could just reach over and reboot the system. And the system could be “air-gap” isolated on a physically separated network. Or it might have one cable running to one router.
Developers like this ability to visually verify isolation because of the nature of development: programs are running that may have serious security holes. By definition, the dev-test environment is a laboratory; programmers are experimenting, inventing, writing and rewriting code. First drafts of programs are just like the first draft of a term paper, full of typos and obvious errors: they are buggy. Programs are crash-tested and fail dozens of times. Indeed, programmers will purposefully feed bad data into a program to make sure it crashes in an acceptable manner: that it does not fail in such a way as to disable other systems or expose data.
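The "crash it on purpose" discipline described above is what testers call a negative test: feed the program deliberately bad data and verify that the failure mode is controlled. A minimal sketch in Python, with an invented `parse_record` function standing in for the program under test:

```python
def parse_record(line):
    """Parse a 'name,age' record; reject malformed input loudly."""
    parts = line.split(",")
    if len(parts) != 2 or not parts[1].strip().isdigit():
        # Fail fast with a clear error instead of passing garbage downstream
        raise ValueError(f"malformed record: {line!r}")
    return parts[0].strip(), int(parts[1])

def test_rejects_bad_input():
    """Negative test: bad data must crash the acceptable way."""
    try:
        parse_record("no-age-field")
    except ValueError:
        return True   # failed cleanly, as designed
    return False      # accepted garbage, or failed some other way
```

The point is not the parser itself but the second function: it asserts that the program fails in a predictable, contained way rather than silently corrupting data.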
Given this purposeful chaos, even individuals who feel no compunction about running applications in the cloud may wonder whether the development environment would be safer in their own data center. After all, in a cloud environment, the routers, firewalls, and cables may be virtual, just like the machine under test. It seems obvious that such a test environment should not be shared; it would be like crash testing cars on public streets.
But this is not the case. It may be counterintuitive, but sharing the test environment does not put other users at risk, because no matter how spectacularly a program crashes, the damage is encapsulated within the virtualization layer. Indeed, the virtualized dev-test environment works better than the old-fashioned dedicated hardware model, because each developer can have their own virtual machine; another developer’s work won’t interfere. And if a developer needs to reboot their machine, it takes only seconds.
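The encapsulation argument can be illustrated at a smaller scale with ordinary operating-system processes: a child process that dies badly does not take its parent down. This is only an analogy for the hypervisor boundary, not the mechanism itself, but the containment principle is the same:

```python
import subprocess
import sys

# Run a deliberately failing "guest" program in its own process.
# The hard failure is simulated by exiting with a nonzero status.
crash_code = "import sys; sys.exit(97)"
result = subprocess.run([sys.executable, "-c", crash_code])

# The failure is contained: the "host" merely observes the exit status.
print("guest exit code:", result.returncode)
print("host still running")
```

A hypervisor provides the same property at the machine level: a guest operating system can blue-screen or kernel-panic, and the host and its other guests carry on.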
However, there is one good reason for an agency to consider hosting its own virtualized dev-test environment in its data center: to take a first step into cloud computing and to evaluate the role of being a cloud provider versus a cloud consumer. The dev-test environment is a low-risk candidate because it does not support any operational mission systems, so the impact is low if there is a problem. What’s more, much of what you pay for with a commercial cloud computing provider (e.g., wide bandwidth, huge capacity, managed services) is not very valuable in a dev-test environment, where the system supports only a handful of users and test data.
The easiest way for an agency to establish a virtualized dev-test environment in its own data center is to buy a “cloud appliance”: a rack of hardware with software pre-loaded and configured. These are available in different sizes from several vendors. Alternatively, an agency can build its own cloud infrastructure, either repurposing equipment already on hand or buying new, and using either a pre-packaged software stack or selecting the software components individually. In fact, those who are technologically curious and enjoy such things (aka “geeks”) can gain familiarity with building a virtualized system on their home computers. Most of the major Linux distributions now include open source software for several types of virtual machines and the tools to administer them. And installing a modern Linux distribution is easy, either on an old unused PC or as a dual boot. While your agency may eventually choose to standardize on a different set of tools, the concepts and functions will be the same.
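For the home experimenter described above, a quick first check is whether the machine supports hardware-accelerated virtualization. On Linux, the kernel exposes the device file `/dev/kvm` when the KVM module is loaded and the CPU's virtualization extensions (Intel VT-x or AMD-V) are enabled; a small Python sketch of that check:

```python
import os
import platform

def kvm_available():
    """Rough check for hardware-accelerated virtualization on Linux.

    /dev/kvm appears only when the KVM kernel module is loaded and the
    CPU's virtualization extensions are enabled in firmware. Its absence
    doesn't rule out virtualization entirely -- software emulation still
    works, just much more slowly.
    """
    return platform.system() == "Linux" and os.path.exists("/dev/kvm")

print("KVM available:", kvm_available())
```

If the check fails on otherwise capable hardware, the virtualization extensions are often simply disabled in the BIOS/UEFI settings.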
Jeff Koch is an Associate Partner in IBM’s Strategy and Innovation consulting practice providing thought leadership on government management and the intersection of technology and mission. Prior to joining IBM, Jeff was at the Office of Electronic Government and Information Technology at the Office of Management and Budget (OMB) where he managed 12 government-wide E-Gov programs, and played a key role in the Federal budget and legislative process.
Jeff also worked for the Chief Information Officer at the U.S. Department of Labor where he managed the Benefits.gov program and provided leadership on IT budget and information policy. Jeff also worked as the Chief of Staff for a Member of Congress, and as an engineer for a large defense contractor where he designed radios for military communications and surveillance systems.