Provided by Tyler Technologies

3 tactics to make your agency’s data more actionable

As agencies work to implement the Evidence Act and use data to improve mission, they also should take advantage of data investments already made, suggests Tyler...

There’s no question that data helps drive better decisions. But given the wealth of data across government, how best to derive value from it has become a central theme as agencies, four years in, continue to implement the mandates of the Evidence Act.

The law, formally known as the Foundations for Evidence-Based Policymaking Act, codified the importance of data. Led by the Federal Chief Data Officers Council, agencies have been hard at work collectively expanding data literacy and effective data use across the federal government.

They are building on the “learning agendas” required by the law and created by agency CDOs. The agendas detail how each agency intends to use data to identify and address strategic mission-related questions. In other words, agencies have gone from asking “How much funding did we spend?” to “How are we spending our funding, and what does that mean for X project?”

But here’s the rub: The move to gather and analyze data to understand impact requires that agencies balance their efforts at achieving the promise of the Evidence Act against the backdrop of their existing data infrastructures and mission requirements, said Michael Donofrio, senior advisor for federal solutions at Tyler Technologies.

Striking a balance between old and new technology investments

How can an agency amplify its data investments and continue to improve its overall infrastructure? “A lot has been done already by agencies because of the value of data and importance of evidence,” Donofrio said. “What agencies are struggling with is trying to figure out how best to complement what’s already been done so they don’t recreate work.”

Donofrio suggested agencies take a simplified, more expedient path to data-driven decision-making and policymaking. “So that you can get the most out of any existing investments while expanding your capacity to put data to use as you evolve your infrastructure,” he said.

We asked Donofrio to share best practices for making that happen. He offered three processes that can help an agency get the most from data now and going forward: discovery, access and collaboration.

Process 1: Enable data discovery

Start by putting data in a catalog-like format, so users can find it using metadata, just like they would a product on Amazon, Donofrio said.
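The catalog idea above can be sketched in code. This is a minimal, hypothetical illustration of metadata-driven discovery — the record fields, dataset names, and search function are all assumptions for the example, not any specific product's API:

```python
# A minimal sketch of metadata-driven data discovery: users search
# catalog metadata (names, descriptions, tags), not the data itself.
# All names and fields here are illustrative assumptions.

from dataclasses import dataclass, field


@dataclass
class DatasetRecord:
    """One catalog entry: the metadata a user can search."""
    name: str
    owner: str
    description: str
    tags: list[str] = field(default_factory=list)


CATALOG = [
    DatasetRecord("covid_case_counts", "CDC",
                  "Daily authoritative COVID-19 case counts by county",
                  ["covid", "public-health", "county"]),
    DatasetRecord("grant_awards_fy24", "HHS",
                  "FY24 grant awards by program office",
                  ["grants", "funding"]),
]


def search_catalog(query: str) -> list[DatasetRecord]:
    """Match a free-text query against names, descriptions and tags."""
    q = query.lower()
    return [r for r in CATALOG
            if q in r.name.lower()
            or q in r.description.lower()
            or any(q in t for t in r.tags)]


# Users find data the way they'd find a product: by searching metadata.
for record in search_catalog("covid"):
    print(record.name, "-", record.owner)
```

In practice the catalog lives in a dedicated platform rather than in memory, but the pattern is the same: discovery depends on rich, consistent metadata, not on knowing where the data physically sits.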

As an example, he pointed to work that Tyler Technologies did with the Centers for Disease Control and Prevention to leverage data to combat the COVID-19 pandemic.

“During the onset of COVID, the availability, accessibility and scalability to disseminate authoritative data on demand was absolutely needed,” he said. “When federal COVID data was starting to come out, our state and local clients — our state chief data officers and our state health information officers — were asking us, ‘Can you help us get to the right people because I’d like to see this data?’ or ‘Can we get this information too because we’re missing this?’ As a private entity, we were able to connect communities of users to help find the data. Then, it became about getting it in a format that was not just consistent but useful.”

Process 2: Provide easy access to the data

Once an agency’s data users, whether internal or external, can find the data they seek, they need to be able to access it. That requires interoperability, Donofrio said. Rather than ask everyone to change the tools they know and use, make the data available in a format those tools can readily consume.

In many cases, agencies have numerous systems, each with their own reporting tools. In some cases, there are data warehouses where the data sets are consolidated, sometimes across multiple agencies. But more and more, people just want flat data, so they can build their own tool because analytics aren’t the endgame anymore, he said.
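The "flat data" approach described above can be illustrated with a short sketch: export records to CSV, a format nearly every spreadsheet, BI tool, and script can read without a proprietary reporting layer. The records and field names here are hypothetical examples, not drawn from any agency system:

```python
# A minimal sketch of publishing "flat" data so users can bring their
# own tools. Records and field names are illustrative assumptions.

import csv
import io

# Imagine these were pulled from an internal reporting system.
records = [
    {"program": "Community Health", "fiscal_year": 2024, "obligated": 1200000},
    {"program": "Rural Broadband", "fiscal_year": 2024, "obligated": 850000},
]


def to_flat_csv(rows: list[dict]) -> str:
    """Serialize records to CSV, a format almost any analytics tool reads."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=list(rows[0].keys()))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()


print(to_flat_csv(records))
```

The design point is tool-agnosticism: once the data is flat and self-describing, the consumer decides whether the endpoint is a dashboard, a statistical model, or a simple spreadsheet.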

“What’s important is the infrastructure that connects that data ecosystem together and ultimately grants access to the people who need it,” Donofrio said.

Process 3: Cultivate collaboration using the data

Collaboration involves taking the insights that tools offer, pulling information from the data, and combining it with historical knowledge so that agency teams can create narratives supported by the data, Donofrio said.

“One of the great experiences that I’m seeing is the cohesive nature of collaboration between the CDO community and program offices,” he said. “Instead of resolving data to address 90 different needs, agencies are doing it four or five times and then leveraging best practices to replicate the data pull across 90 data sets.”

As a follow-up to its work with CDC, Tyler Technologies helped the Health and Human Services Department modernize its open data platform, which HHS expanded for use by all federal healthcare components.

“Not only are they improving discovery and access to authoritative health data but also disseminating consumable information products that small, underserved communities can use to better understand the data — that they can use to drive economic value, to drive research, to drive grants,” Donofrio said.

It’s important to remember that while a dashboard might serve as the touchpoint of the analytical experience, it’s not the overall goal of the data experience or the Evidence Act, he added. “The end goal is for agencies to act on their data, to build on their analytics and insights, and to continue to evolve their use of data to make a bigger mission impact.”

To discover more digital transformation insights and tactics, visit The Evidence Act: Actionable Insights With Data series.

Copyright © 2024 Federal News Network. All rights reserved.
