OIG ratings of agencies’ DATA Act compliance vary significantly


The 2014 DATA Act set a new standard for government transparency by raising the bar for how much spending data agencies have to make available via public websites. But releasing more information doesn’t help much if the data is unreliable. The Government Accountability Office is out with a new assessment that looked at how more than 50 agencies are doing on the data quality front, based on assessments by their own inspectors general. The results vary widely. Paula Rascona is a director on GAO’s financial management and assurance team. She talked with Federal Drive with Tom Temin about the findings.

Interview transcript:

Paula Rascona: The DATA Act called for the offices of inspector general at each of the agencies to evaluate their agency’s data quality for the files and data they submitted into USASpending.gov, which is expected to display all federal spending by the agencies. GAO’s role was twofold: We were to work collaboratively with the inspector general community, and we were also expected to issue our own report every two years. Our first report went out in 2017, and our second went out in November of 2019. The report we’re speaking about today is a capping report that compiles the results of the IGs’ agency data quality reviews.

Jared Serbu: So looking across that broad landscape, these 50-plus agencies, what can we say about whether or not agencies have improved since that first round in 2017? And what can we not say?

Paula Rascona: At a very high level, we can say that we’ve seen improvements in the number of agencies that are reporting. We have seen improvements in meeting the timeframes for reporting. And we have seen some improvements in the quality of certain data elements. But we can’t really make a one-for-one comparison to the fiscal year 2017 work because the audit scope has changed. We’ve got more agencies reporting, and they’re also reporting on additional data elements. We worked with the IG community to make improvements to their audit methodologies and the types of procedures they were performing. There were also changes introduced by Treasury and the Office of Management and Budget in their guidance, and they also updated and corrected some issues with their data standards. And the final change is that there were some pretty significant changes to Treasury’s DATA Act broker, which takes in all the information from the agencies, compiles it and prepares it for display on USASpending.gov. So those various factors basically keep us from doing a data-element-to-data-element comparison, or even a comparison of an agency from 2017 to 2019.

Jared Serbu: Fair enough. So just to speak about this year’s results: There’s quite a bit of variance in data quality from agency to agency, but when I look at it, there’s a band of agencies that are doing okay, that you categorize as high quality. And then there’s low quality, and some of the ones in low quality are really quite low. Just to pick on USDA, they came in with a completeness error rate of 61%, a timeliness error rate of 81% and an accuracy error rate of 65%. I don’t know if you can point to what the specific issues are at USDA, but what explains the huge variance from one agency to another? Is part of it methodology, or are the differences really that severe?

Paula Rascona: Jared, I think the majority of the errors come about because agency systems still aren’t fully integrated, and not all of their systems have the ability to report the data elements that are required, so they’re having to rely on manual and other types of procedures to get the data in there. USDA, for example, has had problems since 2017 and before. Another big thing causing some of the variances is that agencies just have a very difficult time making sure the data that’s submitted is of quality. They don’t necessarily always have really strong internal controls to make sure the data is traceable back to the source system or the source documentation. There are errors that get introduced as they pass the data through the Treasury broker, and some agencies don’t always correct all the warning messages they get. So there’s a variety of reasons why these things are happening.

Jared Serbu: In addition to looking at data accuracy in the aggregate across each agency, the report also looks at individual data elements, and one that caught my eye is that you found six different agencies with a 40% or higher error rate on one field: the period of performance start date for contracts. That’s surprising to me, because that seems like such a discrete, knowable, finite piece of data. It’s hard to understand why it could be so inaccurate so often at multiple agencies. Is there a clear explanation for that kind of thing?

Paula Rascona: Jared, part of the explanation may be that agencies are interpreting the requirements, or the data definition, a little bit differently. And, again, not all of the agencies’ contracting offices that submit data into the different systems capturing governmentwide contract information have good controls to make sure the information they’re submitting is of quality and agrees with their contracting documents. So I think it primarily revolves around having good controls, not just in the financial reporting pieces of your agency, but also within your own program offices that are reporting nonfinancial data.

Jared Serbu: So GAO’s not making any recommendations in this particular report, but all of the IGs, of course, did. As you look across all of those reports, were there common themes? Were there certain areas the IGs were more likely to point to when telling their agencies how to fix these problems?

Paula Rascona: Yeah, we actually came up with several areas where we classified the recommendations. They primarily revolved around establishing and implementing procedures or guidance; that’s pretty self-explanatory, I believe. Another is developing controls over the submission process, which really focuses on making sure that procedures for conducting reconciliations and addressing broker-reported errors and warnings are taken care of before the actual closing period for the file submissions. Developing controls over data from source systems is a big one. The agencies really need to resolve all the issues in making sure that the data coming in agrees with the source system, be it manual or automated. They also need to work with their vendors and grantees to make sure, at the program level, that the information is verified and updated before it is submitted into USASpending.gov. Another one: Some of the agencies have developed data quality plans, which are a good thing, but they need to be used and kept current. So those are just some of the basic areas. Of course, there are always the typical ones, you know: maintaining good documentation, implementing system controls and providing training. So they’re pretty much all along the same lines.

Jared Serbu: And kind of a striking thing to me is nothing you just said equates to “Go invest a bunch of money in a new IT modernization program.” These sound like they’re almost all process issues more than they’re technology issues.

Paula Rascona: Ah yes, I would agree. A lot of them are process issues. It’s pretty interesting, coming from a financial audit background, where you see the disciplines of good internal controls established within an agency because they have the annual discipline of going through a financial audit. Over the years, they’ve learned how to establish good controls and make sure those controls operate. On the program side of agencies, they’ve got some work that needs to be done, and they also have work to do in their systems areas, just to make sure that when those systems do talk to each other, there are controls in place to maintain the integrity of the data. Although we are seeing an increase in the number of agencies that are reporting, there are still some pretty significant problems with the completeness of the data that’s being submitted into USASpending.gov. The IGs are reporting on the quality of the data that they’re testing, but they aren’t really giving a good picture of how much of the information from the agency did not come into USASpending. So I think that’s one of the key things that needs to be brought out that doesn’t necessarily hit you when you read the report.

Jared Serbu: Is that even knowable by the IG in a lot of cases, though? I mean, it’s kind of a “you don’t know what you don’t know” problem, isn’t it? You don’t know what you’re missing.

Paula Rascona: Right, it’s very, very difficult to quantify, and we’re working with the IG community to figure out ways to do that. Even in this report, just to give a sense of magnitude, we talk about the obligations. Obligations are a number that’s being reported that we could grab onto, but it’s only for a period of time, and it only represents funds that are earmarked for spending. I think everybody’s more interested in how much spending is not there versus how much has been earmarked for spending.

Jared Serbu: Paula Rascona is a director on the financial management and assurance team at the Government Accountability Office. We’ll post a link to the report we’ve been discussing at FederalNewsNetwork.com/FederalDrive.
