The Pandemic Response Accountability Committee oversees a bailout about three times as large as what Congress spent on the 2008 recession recovery, but also benefits from advances in data analytics tools that weren’t available to auditors more than a decade ago.
The coronavirus pandemic, however, has uncovered longstanding gaps in the federal government’s data-sharing infrastructure as much as it has demonstrated deficiencies in its response plan for public health emergencies.
To oversee about $3 trillion in spending, Robert Westbrooks, PRAC’s executive director, said the committee’s goal is to empower as many “citizen watchdogs” as possible to scrutinize the data, give feedback to the committee and provide leads on fraud, waste and abuse.
In designing a new website to track that spending, Westbrooks said the PRAC put significant effort into making the site accessible to a wide range of users with varying levels of data literacy.
“We want the average citizen to be able to look at the data and find what they need, and we want the power users like Hill staff and public interest groups that are looking for granular data to have access to that as well,” Westbrooks said Thursday during a virtual event hosted by Government Matters and the Data Coalition.
Most of the data on the committee’s website comes from USASpending.gov. Westbrooks said the committee received its first round of data in July, which contained more than 6 million rows of data coded as disaster funding.
However, Westbrooks said data from a single source “doesn’t tell the whole story” around COVID spending. Missing data elements include the Paycheck Protection Program overseen by the Small Business Administration and recipient-level data for $150 billion in state funding. The PRAC, he added, is also seeking additional data from the Departments of Labor and Health and Human Services.
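The sketch below gives a rough sense of that kind of data wrangling: it filters an award-level export from USASpending.gov down to disaster-coded spending and totals it by agency. The file name, column names and the set of Disaster Emergency Fund Codes are illustrative assumptions, not the committee’s actual pipeline.

```python
# A minimal sketch of filtering an award-level export from USASpending.gov
# down to pandemic spending. The file name, column names and the exact set of
# Disaster Emergency Fund Codes below are assumptions for illustration.
import pandas as pd

COVID_DEFC = {"L", "M", "N", "O", "P"}  # assumed COVID-related fund codes

awards = pd.read_csv("usaspending_award_export.csv")  # hypothetical local export

covid_awards = awards[awards["disaster_emergency_fund_code"].isin(COVID_DEFC)]

# Roll obligations up by awarding agency to see where the money is flowing.
by_agency = (
    covid_awards.groupby("awarding_agency_name")["total_obligated_amount"]
    .sum()
    .sort_values(ascending=False)
)
print(by_agency.head(10))
```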
Fortunately, the PRAC also has access to more sophisticated technology than what its predecessor, the 2009 Recovery Accountability and Transparency Board, had to flag and investigate misuse of funds.
Westbrooks said the PRAC has access to data-scraping tools, as well as artificial intelligence and machine-learning tools, and also relies on Oversight.gov as a central hub to publicize reports from agency inspectors general.
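As one hedged illustration of how such machine-learning tools can work, the sketch below flags statistically unusual awards for human review with an off-the-shelf isolation forest. The dataset and feature names are hypothetical, and this is not a description of the PRAC’s actual tooling.

```python
# Illustrative only: flag statistically unusual awards for human review with
# an isolation forest. The dataset and feature names are hypothetical.
import pandas as pd
from sklearn.ensemble import IsolationForest

covid_awards = pd.read_csv("covid_awards.csv")  # hypothetical prepared dataset

features = covid_awards[
    ["total_obligated_amount", "award_duration_days", "recipient_prior_award_count"]
].fillna(0)

# The model isolates outliers (e.g., unusually large or oddly structured
# awards) and labels them -1; everything else is labeled 1.
model = IsolationForest(contamination=0.01, random_state=0)
covid_awards["flag"] = model.fit_predict(features)

flagged = covid_awards[covid_awards["flag"] == -1]
print(f"{len(flagged)} awards flagged for investigator review")
```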
While the PRAC has borrowed some of the best practices of the RAT Board, Westbrooks said the committee is also looking to leave behind a playbook to oversee future disasters.
“We don’t want to just close the door and shut out the lights after five years. We want to leave an enduring mark on federal disaster spending, because sadly we all know this is not the last disaster this country is going to see, and we need to plan for this and plan accordingly,” Westbrooks said.
To handle day-to-day data wrangling, the PRAC recently hired Brien Lorenze, a former principal at Deloitte, as its chief data officer. But across the rest of the government, agency chief data officers have been working throughout the year to improve the data maturity of their organizations under the Federal Data Strategy.
Ted Kaouk, the chief data officer at the Agriculture Department and chairman of the governmentwide CDO Council, said the Federal Data Strategy is helping agencies better understand best practices on data sharing and building a data-centric workforce.
“By having a community of practitioners who know how to drive change across organizations, to make them more data-driven and ensure we’re fully leveraging the power of our data … We can help leaders understand what’s possible today with data analytics and machine learning and to some extent, how to work with us,” Kaouk said.
CDOs, he added, can improve “data acumen” for agency executives, to help them better understand what insights are possible with the data they already have.
The CDO Council is the largest federal panel of its kind, with 80 members from large and small agencies. Kaouk said the council has been working to meet individual agency action items under the Federal Data Strategy, but is also conducting surveys to understand where agencies can come together for “cross-cutting” work.
“Obviously, USDA will have different needs than maybe smaller agencies, but we’re working together on defining what those various challenges are. I think to a person, everyone is really thinking about how they can make an impact, and by having those conversations and discussion sessions where we’re having success and also sharing, where we’re having challenges, we’re able to work through them more effectively,” Kaouk said.
Chris Haffer, the CDO of the Equal Employment Opportunity Commission, said the strategy has stressed the importance of investing resources into data analytics even at smaller agencies. The EEOC, for example, has retooled a former data processing office into one that now develops predictive analytic models and pulls new insights out of agency data.
Haffer said that, based on data from previous economic downturns, his office has preliminary evidence suggesting discrimination filings will increase over the next six to 12 months.
His office is also using discrimination charge filing data to explore trends in demographic, geographic and industry data, as well as trends in types of alleged discrimination, to identify which groups face the highest risk of experiencing discrimination during the pandemic’s economic downturn.
“This type of predictive analytics, which is new to the EEOC, enables us to proactively ensure that EEOC has the right number of people with the right skills, in the right place, at the right time to help both prevent employment discrimination when possible, and to remedy through enforcement, when necessary, illegal employment discrimination,” Haffer said.
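As a hedged illustration of the kind of forecasting Haffer described, the sketch below projects a simple trend over monthly charge-filing counts by alleged basis of discrimination. The file and column names are assumptions, and the EEOC’s actual models are more sophisticated than a straight-line fit.

```python
# Illustrative sketch only: a simple linear trend projected over monthly
# charge-filing counts, broken out by alleged basis of discrimination. The
# file and column names ("charges.csv", "filing_date", "basis") are assumed.
import numpy as np
import pandas as pd

charges = pd.read_csv("charges.csv", parse_dates=["filing_date"])

# Count filings per month for each alleged basis (e.g., race, age, disability).
monthly = (
    charges.set_index("filing_date")
    .groupby("basis")
    .resample("M")
    .size()
    .rename("filings")
    .reset_index()
)

# Fit a straight-line trend per basis and project 12 months ahead.
for basis, grp in monthly.groupby("basis"):
    t = np.arange(len(grp))
    slope, intercept = np.polyfit(t, grp["filings"], 1)
    projected = intercept + slope * (len(grp) + 12)
    print(f"{basis}: roughly {projected:.0f} filings/month projected a year out")
```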
Haffer said his office is looking at machine learning to push this work further, and that the success of the discrimination forecast has helped make the case for those tools.
“The concept of machine learning for small agencies is somewhat of a foreign concept. So under the mantra of you have to crawl before you walk, it’s demonstrating successes of data and analytics, ultimately, leading to a point where we’re able to do some of the 21st-century machine learning models to help make the work of EEOC more efficient and effective,” he said.
Jory Heckman is a reporter at Federal News Network covering U.S. Postal Service, IRS, big data and technology issues.