[Pictured Left:The Roundtable brought together leaders from the EU, NATO, DHS, State Department, and other stakeholder and EU member state officials.]
The Brussels discussion focused on how the EU, other European organizations, and member states can work with the Department of Homeland Security, the Department of State, and other US agencies to best enable a trusted environment for sharing information.
Lieutenant Colonel (LTC) Jeff Adams is a data analyst in the Army serving as a Research Fellow in the Army's Training with Industry program, through which he works with IBM for one year before returning to the Army. His research fellowship is intended to help him learn how industry analyzes big data and communicates strategic insights to senior leaders, so that he can take this knowledge back to the Army. LTC Adams worked with the IBM Center for several months.
From the OMB “Cloud First” strategy, to GSA’s Federal Risk and Authorization Management Program (FedRAMP), the government is following commercial best practices to leverage the cloud.
Cloud capabilities can be provided over the public Internet or through connections over private networks, and government does both. Some agencies establish private clouds due to perceived risks of making data available over public channels. At the same time, they are moving toward greater use of the open Internet, including public clouds.
In a panel discussion (watch the video) led by the Partnership’s Judy England-Joseph, three government leaders detailed lessons learned from their experience that can help other agencies make sound use of analytics in decision-making. Specifically: at the Social Security Administration (SSA), Gerald Ray runs the Disability Appeals process. He observed that the disability review process required significant knowledge of regulatory compliance as well as the specifics of each individual case.
Brian Murrow, an expert on strategy and analytics at IBM, participated in interviews conducted by the Partnership for Public Service as they prepared a series of podcast conversations with pioneers in the use of analytics in the federal government. In a series of guest blog posts over the next few weeks, Brian will share his key takeaways from these interviews. You can also listen to the full interviews yourself if you find yourself wanting to know more.
In the late 1980s and early 1990s, the Food and Drug Administration (FDA) faced a mountain of criticism. It was thought that the public health safety precautions built into its drug evaluation procedures in reaction to the Thalidomide tragedy two decades earlier were responsible for delaying consumers’ access to vital new drug therapies. Particularly in light of the growing activism around fighting AIDS, critics argued that the FDA procedures were born out of disaster and therefore overly cautious.
When Carter Hewgley joined FEMA in 2011, the organization was focused on two things: the timely delivery of services, and the processes required to collect and organize all the resources to support those services. FEMA was a “disaster-driven” organization, more focused on responding to the next emergency than on reviewing lessons learned from the previous one. Although there were “analytical cells” across the agency and its programs, enterprise-level analytical capability was still in its infancy.