Trusting Measures

Monday, December 19th, 2011 - 13:38

As time has gone on, the gathering of performance data has become increasingly common. But there’s still a major hole. Even entities that produce carefully prepared performance reports often don’t use them in the legislative process. More commonly they are used within agencies to make managerial improvements. But when it comes time to shift policy direction or debate funding, measures are all too often ignored.

Why is this? Many experts point to politics as usual. And it’s always frustrating when political exigencies get in the way of good, data-based management. But there’s another big reason: Legislators and other observers question the validity of the performance measures themselves – not just whether the right thing is being measured, but whether the resultant data holds up under scrutiny. 

Christopher Mihm, the managing director for Strategic Issues at the U.S. Government Accountability Office, puts the point rather succinctly: “When decision makers don’t believe the performance data are accurate, that can take you down a road of recriminations and doubt that undermine an entire effort.”

Even when bad data is released thanks to absolutely innocent clerical errors, the impact on credibility can be severe. In the first rounds of compliance reporting under the Recovery Act, the media made hay out of simple data entry errors: people accidentally entering their state legislative district rather than their U.S. congressional district, or typos that wildly threw off the figures. These stories got blown out of proportion, and accusations abounded that there was a purposeful effort to mislead the public and its representatives. In fact, according to the Recovery Board’s executive director, Mike Wood, “The amount of non-compliance was extremely low.”

 

Lowballing murders; highballing test scores

Of course, there are ample instances in which the innocence of the bad numbers is less clear. Consider this example from 2008, as reported by the Detroit News: “The Detroit Police Department is systematically undercounting homicides, leading to a falsely low murder rate in a city that regularly ranks among the nation's deadliest, a Detroit News review of police and medical examiner records shows.

“The police incorrectly reclassified 22 of its 368 slayings last year as ‘justifiable’ and did not report them as homicides to the FBI as required by federal guidelines. There were at least 59 such omissions over the past five years, according to incomplete records obtained from the police department through the Freedom of Information Act.”

We can’t assume that the issues in Detroit were the result of any purposeful effort to mislead. But sometimes, that’s clearly the case. The television program The Wire presents a classic example of how this can work in the schools. One character on the show describes the effort as “curriculum realignment,” an attempt to make sure that test scores are as high as possible, even if the scores aren’t in any way reflective of how educated the young people actually are.

That television program may be fiction. But here’s a similar, real-life example. Rajashri Chakrabarti of the Federal Reserve Bank of New York took a close look at Florida schools. Her report, Vouchers, Responses, and the Test Taking Population: Regression Discontinuity Evidence from Florida, found that Florida schools consciously placed weak students into categories that would not be included in test scores. The reason? Students in schools that scored particularly poorly could be eligible for vouchers allowing them to attend private schools, and that, in turn, meant that the public schools lost money and gained notoriety.

 

The Federal Government

Over the course of the last ten years or so, performance measures emanating from Washington, D.C., have been subject to increasing scrutiny intended to make certain that the information itself is appropriate and genuinely useful. This effort was emphasized in the latest version of the Government Performance and Results Act. “There’s going to be a tougher discussion not just about performance but the performance measures they’re going to have,” says Don Moynihan, professor of public affairs at the La Follette School at the University of Wisconsin.

This is an important first step, of course: it doesn’t matter much whether data is accurate if the data isn’t helpful. But what about accuracy itself? Here, responsibility is scattershot. GAO can address the general topic, but it has no direct responsibility to verify or validate performance data. Inspectors general may be more likely to look at individual performance reports, but they have no direct responsibility either. That leaves the task in the hands of the agencies, for which the Office of Management and Budget has produced guidance. Even there, though, the emphasis is on describing credible procedures to ensure accurate performance information – which doesn’t mean that any individual agency has a real audit process in place.

Interestingly, the new version of GPRA has made the validation of the data more important than ever. The Act has heightened the stakes for meeting goals, and that unquestionably increases the potential for fudging.

 

The road to validation

Lest this sound like an entirely bleak column, this is an appropriate moment to point out that some government entities are deeply focused on validating data, in large part to ensure that measures achieve the credibility necessary to foster positive change.

The ICMA’s Center for Performance Measurement, for example, allows local governments that are members of the organization to submit performance data in up to 18 different performance areas. ICMA then scrubs the data, looking for outliers or things that seem wrong, and follows up with the government if something seems amiss. About 170 governments are participating – a pretty solid number.
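ICMA hasn’t published the mechanics of that scrubbing, but the core idea – flagging submissions that fall far outside the range reported by comparable jurisdictions – can be sketched in a few lines of code. The snippet below is a minimal illustration only; the flag_outliers function, the interquartile-range rule and the 1.5 multiplier are assumptions made for the example, not ICMA’s actual method.

    # Minimal sketch of outlier screening for submitted performance data.
    # The IQR rule and the 1.5x multiplier are illustrative assumptions,
    # not ICMA's actual validation procedure.

    def flag_outliers(values, multiplier=1.5):
        """Return indices of values far outside the interquartile range."""
        ordered = sorted(values)
        n = len(ordered)
        q1 = ordered[n // 4]          # rough first-quartile estimate
        q3 = ordered[(3 * n) // 4]    # rough third-quartile estimate
        spread = q3 - q1
        low = q1 - multiplier * spread
        high = q3 + multiplier * spread
        return [i for i, v in enumerate(values) if v < low or v > high]

    # Example: hypothetical cost-per-capita figures from peer jurisdictions.
    reported = [42.0, 45.5, 44.1, 43.8, 412.0, 46.2, 41.9]  # 412.0 looks like a typo
    print(flag_outliers(reported))  # -> [4], the suspect entry to follow up on

A flagged value isn’t automatically wrong, of course – which is why the follow-up call to the submitting government, not the screen itself, is the heart of the process.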

One great benefit of the ICMA’s service is that it gives small communities access to a validation process they couldn’t necessarily afford on their own. Moreover, ICMA offers jurisdictions an opportunity to see how they stack up against their peers. It’s risky business, after all, to benchmark against other communities without a sense of whether the data are accurate. “If you are going to make some assessments of how you are doing relative to others, you need to have confidence not only in your own data, but that the data of others are accurate,” says Mike Lawson, director of the ICMA Center for Performance Measurement.

Other entities have embarked on potent validation programs of their own. Maryland’s “Managing for Results” audits stand out in this regard. According to Bruce Myers, the state’s legislative auditor, “We are systematically auditing the results of the 62 [Managing for Results] measures contained in the 2005 Managing for Results – State Comprehensive Plan, which was produced by the Department of Budget and Management.”

The results? The audits were able to certify only about half of the state’s 62 MFR measures as valid, often because documentation was simply lacking or the underlying data hadn’t been retained.

In a state like Maryland – which stands out for its work in using performance measures – this kind of care is critical. Myers agrees with others that, generally, the validation of performance information is at an embryonic stage. As Lisa Parker, project manager at the Governmental Accounting Standards Board, told us, “I definitely don’t think that performance data is perceived as being at the same level as financial data. We have a long way to go before we are at that point.”

A few reasons why, according to Myers:

  • Fiscal information usually is derived under an established set of rules.
  • Fiscal information can frequently be backed up with routine examination of invoices and receipts.
  • There is a discipline with financial information that means that an auditor examining expenditures, for example, can go to almost any office in the country and find consistent records.
  • With performance information, most entities are starting validation efforts from scratch; financial data, by contrast, has been audited for decades.

Image credit: Wikipedia
