More than a year after the governmentwide Chief Data Officers Council held its first meeting, agencies are developing data analytics capabilities that empower decision-making within their top ranks.
The Treasury Department’s Bureau of the Fiscal Service, for example, is testing whether artificial intelligence can streamline the annual appropriations process and get money to agencies sooner.
When Congress approves spending, the bureau pulls apart the text of the appropriations bill to figure out which agencies and accounts get money – and how much.
To expedite this process, the bureau is testing whether an AI algorithm can read the PDFs and turn the text into structured, machine-readable data.
Justin Marsico, the bureau’s CDO, said the algorithm uses natural language processing to break down the documents.
“There’s not an easy way that you can train a robot to pull out the right numbers. You need to understand syntax and the structures of the sentences in order to actually pull the text apart,” Marsico said last month at a virtual summit hosted by the Advanced Technology Academic Research Center (ATARC).
Once the algorithm pulls data from the spending bill, Marsico said the bureau’s subject matter experts review the data to flag and correct issues.
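The extract-then-review pipeline Marsico describes can be illustrated with a minimal sketch. This is not the bureau’s actual algorithm — it assumes a simple regex baseline over the common appropriations phrasing (“For <purpose>, … $<amount> …”) and a `needs_review` flag standing in for the subject-matter-expert review step; a production system would use a trained NLP model as Marsico notes.

```python
import re

# Illustrative sketch only -- not the Fiscal Service's actual algorithm.
# Assumes appropriation paragraphs follow the common pattern:
#   "For <purpose>, ... $<amount> ..."
APPROPRIATION = re.compile(
    r"For\s+(?P<purpose>[^,]+),.*?\$(?P<amount>[\d,]+)",
    re.DOTALL,
)

def extract_line_items(bill_text: str) -> list[dict]:
    """Turn free-text appropriations language into structured records."""
    items = []
    for match in APPROPRIATION.finditer(bill_text):
        items.append({
            "purpose": match.group("purpose").strip(),
            "amount": int(match.group("amount").replace(",", "")),
            # Every extracted record is routed to subject matter experts,
            # mirroring the human-review step described above.
            "needs_review": True,
        })
    return items

sample = (
    "For necessary expenses of the Bureau of the Fiscal Service, "
    "$345,000,000, to remain available until September 30, 2022."
)
records = extract_line_items(sample)
print(records[0]["purpose"], records[0]["amount"])
```

The hypothetical dollar figure and bill language here are invented for the example; the point is the shape of the output — structured, machine-readable records that humans can then flag and correct.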
Marsico said the pilot stems from a recognition that some routine office tasks, like physically signing documents, can create added “friction” for employees working during the COVID-19 pandemic.
A governmentwide demand for transparency in data-driven decision-making, he added, has also accelerated during the pandemic.
The bureau stood up a dashboard early in the pandemic to help executives decide when to reopen office buildings. Marsico said Commissioner Tim Gribben made the dashboard available to all bureau employees, a move that has helped the workforce develop trust in reopening decisions.
“Staff understood how we were making decisions, and that we were putting ourselves out there, and the criteria that we were using out there, so that we could be accountable to the staff for actually making those decisions,” Marsico said.
Kshemendra Paul, the CDO of the Department of Veterans Affairs, said he’s seen a “renaissance” in agencies using data as a strategic asset since Congress passed the Foundations for Evidence-Based Policymaking Act in 2019.
VA in December issued its capstone Data Management Directive, which sets enterprise-wide policies for managing VA data as a strategic asset in a consistent, accurate and holistic manner.
Meanwhile, Paul said the VA is in the final stages of drafting its data strategy, and has elevated its data governance council by bringing in key leaders from across the agency.
“The vision we’re driving toward is using data … to support and strengthen VA’s journey as a learning enterprise – both continuous improvement through operational decision support at every echelon and evidence-based policymaking to support us – to continuously improve how we serve veterans, their families, caregivers and their survivors,” Paul said in a FedInsider webinar last month.
To strengthen these efforts, the VA is working with the Defense Department’s CDO on what Paul described as a “joint vision” for data and analytics. Through these data insights, Paul said the VA is looking to improve the customer experience for veterans as they phase out of active military service.
“The key idea there is to recognize and build on the customer experience journey-mapping that VA has led on, and push that back into the service members’ journey, so we have an integrated view of the service member veteran’s journey – understanding the moments that matter [and] what’s the authoritative data we want to put to use securely to support those moments that matter.”