Mixing up cause and effect or finding effect when there's no cause, this way of thinking is particularly reckless for federal agencies.
Big data analytics, on the lips of nearly every federal CIO and program manager these days, is typically described in technology terms. You need storage, Hadoop, unstructured databases, analytic tools. That’s true, but it’s only half of what you need.
The technology pitch leaves out an equally important requirement: people capable of building, operating and getting good information out of big data systems.
A few years ago, in one of my Forrest Gump moments, I attended a party in Boston where a crew from The Wall Street Journal was celebrating winning a Pulitzer Prize for a series of stories on backdating stock options to inflate their value to executives. A reporter had a tip and some hunches but couldn’t prove them, given the volume of data and analysis needed to find out whether closely spaced events were just coincidence or evidence of something deeper. The paper’s reporters teamed with some smart programmers and data analysts. The resulting stories caused dozens of executives to lose their jobs or face federal charges.
People commonly put two plus two together and get five. If two planes crash within days, somebody will say, “Something’s going on!” If three people in the same town get cancer, it must be that factory. Mixing up cause and effect, or finding effect when there’s no cause, is a way of thinking that’s particularly reckless for federal agencies, given their power over public policy and how it’s carried out. Developing algorithms to predict car crashes, or which prescription drug someone will buy, requires care to keep interesting but meaningless factors out of the equation.
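To make that last point concrete, here is a minimal sketch, not drawn from any agency system, just made-up data and a few lines of Python: sift through enough candidate factors and pure noise will look predictive on the records in front of you, then vanish on fresh ones.

import numpy as np

rng = np.random.default_rng(0)
n_records, n_candidate_factors = 200, 500

# Outcome and candidate factors are all independent noise -- no real signal anywhere.
outcome = rng.integers(0, 2, size=n_records)
factors = rng.normal(size=(n_records, n_candidate_factors))

# Explore the first half of the records; hold the rest back as a check.
train, test = slice(0, 100), slice(100, 200)

# Correlation of each candidate factor with the outcome on the exploration half.
train_corr = np.array([np.corrcoef(factors[train, j], outcome[train])[0, 1]
                       for j in range(n_candidate_factors)])

best = int(np.argmax(np.abs(train_corr)))
print(f"Most 'predictive' factor, in-sample: {train_corr[best]:+.2f}")
print(f"Same factor on held-out records:     "
      f"{np.corrcoef(factors[test, best], outcome[test])[0, 1]:+.2f}")

The strongest-looking of 500 meaningless factors will typically show a correlation of about 0.3 with the outcome in-sample and roughly zero on the held-out records, exactly the kind of “interesting” non-finding a careful analyst or subject matter expert would catch.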
Johan Bos-Beijer, director of the analytics and services office at the General Services Administration, cited an incident where a federal fraud investigator tossed a would-be contractor out of a meeting. The young sales presenter showed a Medicaid provider performing an X-ray in San Francisco, having it read in Dallas and issuing the invoice from Delaware, and declared it had to be fraud. Clearly, the salesperson didn’t have much domain expertise about large health care provider organizations, which routinely split imaging, remote reading and billing across locations.
That’s why making sense of data is an interdisciplinary deal. You need data scientists and programmers, but you also need people who know something about the domain you’re working in. On a panel at the GITEC 2016 Summit in Baltimore, Bos-Beijer put it this way: For successful analytics, first you need to identify the functions for which you’re analyzing data. It might be customer service, it might be fraud, it might be acquisition. Regardless, you’ll get a more accurate approach if you involve the subject matter experts to work alongside the techies.
Tom Temin is host of the Federal Drive and has been providing insight on federal technology and management issues for more than 30 years.