It’s one thing to launch programs, but it helps if agencies can prove they work.
The Foundations for Evidence-Based Policymaking Act, which President Donald Trump signed in January, requires agencies to appoint chief data and evaluation officers, and better leverage their data in decision making.
To give agencies a better understanding of what’s expected of them, the Office of Management and Budget expects to release guidance for the law soon.
Chief Statistician Nancy Potok said the Evidence Act guidance has reached the “very last stages” of the clearance process, and gave an overview of what chief financial officers and public information officers should expect from rolling out the legislation.
“One of the things that the Evidence Act really emphasizes throughout is the use of really high-quality information to make decisions in agencies. So that means finding the information, making sure it’s fit for the use you want to put it to, and then actually being able to use it to feed into the decision-making process,” Potok said Tuesday at an Association of Government Accountants (AGA) conference.
Beyond just adding new C-suite leadership to these agencies, Potok said OMB expects agencies to build a more collaborative workplace. But in order to do that, OMB is taking a hands-off approach in its guidance.
“We’re not drawing your org chart for you. In other words, we’re just telling you what we want the outcomes to be, and the types of people we expect to be in there in these jobs and what their jobs actually are,” she said. “Let the agency figure that out based on your own organizational structures right now.”
The Evidence Act piggybacks on a goal in the President’s Management Agenda of using agency data as a strategic asset. Potok said agencies can expect to get funding up-front to get their data in better shape, but OMB will also expect agencies to rely on their performance data to justify future budget requests.
“We don’t like unfunded mandates. Yes, you have to prioritize, but we’re realistic, and we know you need the resources,” she said. “But on the other side, we want to actually understand and see that you’re using the data to improve your programs and to get more efficient in your operations.”
But it’s not just the PMA. The Trump administration also expects to soon roll out a federal data strategy. Trey Bradley, program manager for strategic data initiatives at GSA’s Office of Governmentwide Policy and part of the team behind the data strategy, said a first-year action plan for the strategy would get released in August.
But for all the data-driven insights the Evidence Act looks to bring to government, there are a couple of workforce hurdles agencies will have to navigate.
Michael Conlin, chief data officer at the Defense Department, said it’s difficult to recruit data scientists to work in government because there isn’t a one-to-one match of their skills to a government job series.
“Here’s my challenge: Data scientists have to be computer scientists; they have to be able to write Python or they have to be able to handle the infrastructure; they have to do some of the data-wrangling; they have to also know the math and the statistics intimately; then they have to have domain subject-matter expertise. Now you know why they’re so rare,” Conlin said. “But I’ve got to shoehorn them into an IT position description, and they send me, from recruiting, resumes of people who got a bachelor’s degree in computer science 35 years ago. That’s of no value whatsoever to what I’m trying to do.”
Balancing access with security
Chief data officers will oversee an inventory of their agencies’ data and determine how best to share that data with the public, while exempting sensitive and confidential information from public disclosure.
Nick Hart, chief executive officer of the Data Coalition and a former OMB official, said the Evidence Act could play a significant role in knocking down some of the data silos in government.
“If we’re interested in reducing the burden on the American public for data collection, or just having good information to make decisions, we have to acknowledge that sometimes data collected by one agency can be useful for another, and that is the problem that we are ultimately going to have to solve for in the 21st century,” Hart said.
But DoD’s Conlin, through an example he dubbed the “Starbucks rule,” explained how difficult it is to balance access with security.
“How much data can I rake into a pile and run analytics over in a public cloud, for the price of a large latte? The answer to that is every damn thing my department has ever put in the public domain … and I can run very sophisticated algorithms over that. Now the minute I do that, I’ve created classified information, so I wouldn’t do that in a public cloud. But that doesn’t mean our adversaries can’t do that. So there’s real tension between this desire to show that we’re getting the taxpayer good value for their money, and the desire not to expose ourselves inadvertently to our adversaries,” Conlin said.
Striking that balance will be a heavier lift at some agencies compared to others, but Hart said statistical agencies, like the Census Bureau, have longstanding best practices around data.
The bureau, for example, has Federal Statistical Research Data Centers, where academics can get permission to access confidential records. But in turn, Hart said those researchers can fill in some of the gaps in the census data.
“The more we make that information available for researchers, we also improve our ability to fill in those gaps. It’s a way of improving data quality over time,” Hart said. “So I don’t want us to divorce the concepts of how data quality, data accessibility and data use are so intricately connected, and we need them to be in order for this operation to actually work.”