As every agency tries to make sense of the ever-growing mounds of data it collects, the big data challenge is giving it the opportunity to experiment and even take some smart risks.
At the departments of Agriculture and Housing and Urban Development, those pilots are paying real dividends in improved program performance.
Jenny Rone, the assistant inspector general for data science at USDA, said over the past year, her staff has thrown the proverbial spaghetti against the wall to see how they can make the data work for them.
“The majority have been done with open source data that not only challenges my staff’s ability with creative problem solving, but the way in which our constituents see the world and the possibilities,” Rone said at the recent Data Driven Government conference. “Most of what we’ve done focused on self-service business intelligence tools. For instance, when we received funds to oversee the 2017 disaster season, we built maps with geospatial information systems (GIS) tools. We took GIS coordinates from FEMA and we created the base based on those disaster declarations. We then took contract data from Federal Procurement Data System (FPDS) and overlaid that based on the national interest action code. Our auditors and investigators could see the big picture and they were able to drill down on specific areas and individual contracts, which aided objectively scoping our oversight activity.”
Rone said combining the disparate data sets with the GIS data let investigators identify risk using 10 indicators.
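The overlay Rone describes can be sketched as a simple join between disaster declarations and contract records keyed on the national interest action code. This is a minimal illustration, not USDA's actual pipeline: the field names, sample records, and dollar figures below are all invented, and the real work used GIS tooling rather than plain dictionaries.

```python
# Hypothetical sketch: joining FEMA disaster declarations with FPDS
# contract records via the national interest action code (NIAC).
# All data and field names here are invented for illustration.

declarations = [
    {"state": "TX", "declaration": "DR-4332", "niac": "2017HURHAR"},
    {"state": "FL", "declaration": "DR-4337", "niac": "2017HURIRM"},
]

contracts = [
    {"piid": "C-001", "state": "TX", "niac": "2017HURHAR", "obligated": 250_000},
    {"piid": "C-002", "state": "FL", "niac": "2017HURIRM", "obligated": 90_000},
    {"piid": "C-003", "state": "CA", "niac": "NONE", "obligated": 10_000},
]

def overlay(declarations, contracts):
    """Group contracts under the disaster declaration sharing their NIAC."""
    by_niac = {}
    for d in declarations:
        by_niac[d["niac"]] = {"declaration": d, "contracts": []}
    for c in contracts:
        if c["niac"] in by_niac:
            by_niac[c["niac"]]["contracts"].append(c)
    return by_niac

# Auditors could then drill down from the big picture to individual awards.
result = overlay(declarations, contracts)
for niac, bundle in result.items():
    total = sum(c["obligated"] for c in bundle["contracts"])
    print(niac, bundle["declaration"]["declaration"], total)
```

Once grouped this way, each declaration carries its associated spending, which is the drill-down view the auditors used to scope oversight work.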
At HUD, using data to measure risk was at the heart of several proofs of concept. Larry Koskinen, the department’s chief risk officer, said the agency received an innovation grant from the Treasury Department to help make better sense of A-133 single audits.
“We needed a way to look beyond the block grant in an objective and transparent way that wouldn’t subject us to criticisms of bias, and see if we can use words rather than numbers to analyze where we had potential problems and opportunities for increased oversight on the back and the front end of grant and contract instruments,” Koskinen said at the event, which was featured on Ask the CIO. “This has proven useful. We are using computational linguistics, machine learning and sentiment analysis.”
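One common way to "use words rather than numbers," in the spirit of the text analysis Koskinen describes, is lexicon-based scoring of audit narratives. The sketch below is a toy illustration only: the term weights and sample findings are invented, and HUD's actual models (computational linguistics, machine learning, sentiment analysis) would be far more sophisticated.

```python
# Toy lexicon-based scoring of audit finding narratives.
# Term weights and sample findings are invented for illustration.
import re

RISK_TERMS = {
    "noncompliance": 3,
    "material": 2,
    "weakness": 2,
    "deficiency": 2,
    "questioned": 2,
    "unsupported": 1,
}

def risk_score(finding_text):
    """Sum the weights of risk-laden words appearing in a finding."""
    words = re.findall(r"[a-z]+", finding_text.lower())
    return sum(RISK_TERMS.get(w, 0) for w in words)

findings = [
    "Grantee demonstrated material weakness and noncompliance with grant terms.",
    "Controls operated effectively; no exceptions noted.",
]

# Rank findings so reviewers see the riskiest narratives first.
ranked = sorted(findings, key=risk_score, reverse=True)
```

Scoring text this way gives an objective, transparent ranking criterion, which speaks to Koskinen's point about avoiding criticisms of bias when prioritizing oversight.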
Subject matter experts matter more than data
Koskinen said this approach is a “data scientist’s dream” because they were able to “ground truth” the information to get all the right risks in the right order.
“Suddenly I went from zero money for analytics to now over $1 million for this year,” Koskinen said. “We will be working on this nexus of heart and head as we work both sentiment analysis and numeric analysis as we move forward.”
Both Rone and Koskinen said the “secret sauce” isn’t the technology — though it does play a big role — rather it’s the mission and program people being equal partners in the efforts.
Koskinen said if the subject matter experts trust the data scientists, then the resulting effort would be groundbreaking.
Rone said one big lesson learned is to engage the subject matter experts early and often, and be willing to make changes as needed.
“You need those champions. The best thing we did in launching into 2020 was that we invited potential champions to come to our professional development training for a week where I leveraged our own community to inspire,” she said. “We left there with this roadmap for the future because those people are now going back into the audit side or the investigation side and say, ‘Oh my gosh, I had no idea that they could help us do this.’”
Rone said her 2020 goals include expanding that initial proof of concept and then beginning a new one around contract risk and perhaps grant risk.
“The contract fraud, in particular, is an area our investigators really want to proactively focus on. They are excited about the potential,” she said. “I am a big proponent of open source. The community out there is extremely supportive. You can grab algorithms, you can throw questions out. It’s amazing. Usually if you have a question about something, it’s already been done in the community.”
Koskinen said he believes analytics will help auditors and others break out of the old way of doing things.
“The folks in the [inspector general] world that I work with, especially on the audit side … are looking for ways to move away from episodic compliance auditing, which audits us firmly back into the 20th century, which is really where we want to be, right? Instead of thinking forward into the 21st century what the new high-performing, citizen-oriented, result-oriented governing model is as opposed to the 19th and 20th centuries government model is,” he said. “It’s the human beings that are at the core of that.”
He said through this data analytics effort HUD is looking for programs in distress as well as those that are running well. Too often in the audit community, the latter of the two is overlooked.
“I can tell you what the profile of high performing looks like with as much ease as the profile of programs with performance problems,” Koskinen said. “We are having a very fulsome conversation now about what we are doing right, who are our benchmarkable assets within the organization? It’s a huge and very fertile opportunity for us to show our grantee constituents what good looks like so they can model behavior.”
To help internal programs, HUD’s Office of Risk Management and Assessment has created new positions that have both a budget and risk background, called risk budget analysts. Their job, Koskinen said, is to calculate the financial impact of a risk and risk mitigation effort, and whether the program office has the ability to fund that mitigation itself, and if not, what is the delta. The office then follows up with the program managers when the risk mitigation effort begins.
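The core calculation a risk budget analyst performs can be reduced to a single comparison: mitigation cost against the funds the program office can commit. A minimal sketch, with invented figures:

```python
# Minimal sketch of the risk budget analyst's delta calculation.
# Dollar amounts are invented for illustration.

def funding_delta(mitigation_cost, program_funds_available):
    """Return additional funding needed for a mitigation, or 0 if self-funded."""
    return max(0, mitigation_cost - program_funds_available)

# A program that can self-fund shows no delta; one that can't shows the gap.
print(funding_delta(500_000, 600_000))  # self-funded
print(funding_delta(500_000, 300_000))  # needs the difference
```

The delta is what the office escalates and later follows up on once mitigation begins.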
“You are changing the mechanics in a way that then people get swept along. It’s important to capture the hearts and minds, but when you have them by the budget, it creates a dynamic that is really unstoppable,” Koskinen said. “The risk management process applies some continuity to a program’s well-intended insights that make their way up into leadership conversations. In over 40 years in government … I’ve seen a lot of these good government initiatives, but they tend to peter out when the key stakeholders leave. The thing about getting this stuff into a risk framework is it tends to have a life of its own with [Government Accountability Office], the IG, [Office of Management and Budget] all looking at it, making it very hard to drop a good initiative when a new administration comes in.”