Integrating & Analyzing Data in Government—the Key to 21st Century Security: Observations from Brussels

[Pictured Left: The Roundtable brought together leaders from the EU, NATO, DHS, the State Department, and other stakeholders, including EU member state officials.]

The Brussels discussion focused on how the EU and other European organizations and member states can work with the Department of Homeland Security, the Department of State, and other US agencies to best enable a trusted environment for sharing information.

Jeff Adams

Lieutenant Colonel (LTC) Jeff Adams is a data analyst in the Army serving as a Research Fellow in the Army's Training with Industry program, through which he works with IBM for one year before returning to the Army. His research fellowship is intended to help him learn how industry analyzes big data and communicates strategic insights to senior leaders, so that he can bring this knowledge back to the Army. LTC Adams worked with the IBM Center for several months.

Unpacking the "Black Box" of Incident Reporting

While data can be used externally for accountability, it can also be used internally to predict and prevent these kinds of incidents.

These days, more detailed, near real-time data can be collected because of improvements in technology and new reporting systems. However, such detailed data, if not well explained and put in context, can alarm the public and cause political problems, even while improving performance. Recent examples include:

How government can securely leverage cloud environments

From the OMB “Cloud First” strategy to GSA’s Federal Risk and Authorization Management Program (FedRAMP), the government is following commercial best practices to leverage the cloud.

Cloud capabilities can be provided over the public Internet or through connections over private networks, and government uses both. Some agencies establish private clouds due to perceived risks of making data available over public channels. At the same time, agencies are moving toward greater use of the open Internet, including public clouds.

The DATA Act and Transparency: 4 Ways that Industry Will Benefit

Late last week, the President signed into law the Digital Accountability and Transparency Act (DATA Act). As summarized by the Administration’s release statement, the DATA Act will

Making Data Real – Lessons From and For Federal Leaders

In a panel discussion (watch the video) led by the Partnership’s Judy England-Joseph, three government leaders detailed lessons learned from their experience that can help other agencies make sound use of analytics in decision making. Specifically: at the Social Security Administration (SSA), Gerald Ray runs the Disability Appeals process. He observed that the disability review process required significant knowledge of regulatory compliance as well as the specifics of each individual case.

Making Data Real – Weekly Insights

Brian Murrow, an expert on strategy and analytics at IBM, participated in interviews conducted by the Partnership for Public Service as it prepared a series of podcast conversations with pioneers in the use of analytics in the federal government. In a series of guest blog posts over the next few weeks, Brian will share his key takeaways from these interviews. You can also listen to the full interviews if you want to know more.

Malcolm Bertoni, FDA: Conversations on Using Analytics to Improve Mission Outcomes

In the late 1980s and early 1990s, the Food and Drug Administration (FDA) faced a mountain of criticism. It was thought that the public health safety precautions built into its drug evaluation procedures, adopted in reaction to the Thalidomide tragedy two decades earlier, were responsible for delaying consumers’ access to vital new drug therapies. Particularly in light of the growing activism around fighting AIDS, critics argued that the FDA’s procedures were born out of disaster and therefore extremely overcautious.

Carter Hewgley, FEMA: Conversations on Using Analytics to Improve Mission Outcomes

When Carter Hewgley joined FEMA in 2011, the organization was focused on two things: the timely delivery of services, and the processes required to collect and organize all the resources to support those services. FEMA was a “disaster-driven” organization, more focused on responding to the next emergency than on reviewing the lessons learned from the previous one. Although there were “analytical cells” across the agency and its programs, enterprise-level analytical capability was still in its infancy.