Mousetraps for Flawed Data
For the most part, we’ve pointed to issues that require careful examination of the information to determine whether it’s trustworthy.
But over time, we’ve come across a great many signals, easily spotted and identified, that tell you more quickly that information should be scrutinized. Here are a half dozen examples:
2) Sometimes reports or articles use numbers that are so precise as to be unbelievable. It seems to us that when project spending is reported as $1,436,432.15, there’s no legitimate way to figure costs down to cents, dollars or even hundreds of dollars. A tight range is often more useful and believable.
3) Speaking of ranges, it’s self-evidently problematic when an expense is reported as somewhere between $100 million and $500 million. Either not enough due diligence has been done, or the estimators are living in the Land of the Wild Guess.
4) If you’re relying on data for which no assumptions are provided, dig deeper. When discount rates vary between two state pension plans, it’s entirely possible that their liability figures are not comparable.
5) Watch out for figures that are huge beyond common sense. Some years ago, there was a lot of talk about one million children being abducted each year. Living in New York City, we saw news reports full of the story of just one little boy, Etan Patz, who was last seen at a bus stop in lower Manhattan. How could it be that, if such huge numbers of children were disappearing, one child was getting so very much attention? It turned out, according to the Denver Post in 1986, that the “national paranoia” raised by the one million figure wasn’t really reflective of scary men luring children into their cars with candy, but rather of children taken in custody battles.
(And the often-repeated one million figure was also an exaggeration. In 2017, the Justice Department reported that the number of serious parental abductions is closer to 156,000 annually, of which about 30,500 reach the level of a police report.)
6) Information that is self-reported to the federal government by states, or by cities and counties to the states, can be questionable. A question like “Does your city use performance information?” can get “yes” answers regardless of differing definitions and degrees of use. In the past, a big-city mayor told us that his community used measurements to make decisions about road conditions. When we pursued the question, it turned out that the only data the city had was an accumulation of citizen complaints about potholes.
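The over-precision signal in item 2 lends itself to a quick automated check. Here is a minimal sketch in Python; the function name and the $1,000,000 cutoff are our own illustrative assumptions, not anything from the examples above. The idea is simply that a seven-figure cost reported to a non-round dollars-and-cents value probably carries false precision.

```python
def implausibly_precise(amount_dollars: float, threshold: float = 1_000_000) -> bool:
    """Flag large figures reported with suspicious cent-level precision.

    Heuristic (our assumption): a cost at or above `threshold` that is not a
    round multiple of $1,000 -- e.g. $1,436,432.15 -- deserves scrutiny.
    """
    if amount_dollars < threshold:
        return False
    # Work in whole cents to avoid floating-point surprises;
    # 100,000 cents equals $1,000.
    cents = round(amount_dollars * 100)
    return cents % 100_000 != 0

print(implausibly_precise(1_436_432.15))  # the suspicious figure from item 2
print(implausibly_precise(1_400_000.00))  # a round, believable estimate
```

This is only a screening rule, not proof of bad data; a legitimately audited total can also end in odd cents.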
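Item 4’s caution about discount rates can be made concrete with a little present-value arithmetic. The sketch below uses made-up numbers (the $100 million annual payment stream, the 30-year horizon, and the 7 percent and 4 percent rates are all our own illustrative assumptions) to show why two pension plans with identical promised payments can report very different liabilities:

```python
def present_value(annual_payment: float, years: int, discount_rate: float) -> float:
    """Present value of a level annual payment stream (ordinary annuity)."""
    return sum(annual_payment / (1 + discount_rate) ** t for t in range(1, years + 1))

payments = 100_000_000  # $100M in promised benefits per year (made-up figure)
years = 30

liability_at_7 = present_value(payments, years, 0.07)
liability_at_4 = present_value(payments, years, 0.04)

print(f"Liability discounted at 7%: ${liability_at_7 / 1e9:.2f}B")
print(f"Liability discounted at 4%: ${liability_at_4 / 1e9:.2f}B")
```

With these made-up figures, the plan discounting at 4 percent reports a liability roughly 40 percent larger than the plan discounting at 7 percent, even though the promised payments are identical. That is why liability numbers quoted without their underlying assumptions are so hard to compare.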
** This blog first appeared on the Barrett and Greene website.