Buzz about artificial intelligence has led to increased spending and put several Trump administration directives in motion, but only a handful of agencies have moved into the early stages of AI adoption. A second wave of agencies, however, may soon launch their own AI tools if they can overcome some common hurdles.
The Professional Services Council Foundation, in a report released Wednesday, highlighted some of the challenges and opportunities agencies face in using AI to deliver on their mission.
Looking across four agencies — the Defense Department, the General Services Administration, NASA and the Department of Health and Human Services — the report highlights use cases where program offices have pioneered AI to reduce backlogs or increase the output of their existing workforce.
“They’ve turned to AI to say, ‘Are there routine decisions that we make on a regular basis that AI is now competent enough to handle in a way that we can delegate those decision processes to?’” Dominic Delmolino, the chief technology officer at Accenture Federal Services, said Wednesday at a briefing with reporters.
Alan Chvotkin, the executive vice president and counsel at PSC, said contract obligations for AI-related investments grew almost 75% to nearly $700 million between fiscal 2016 and 2018.
“I think when we get to the end of fiscal 2019, and get a chance to look back over all the spend data that we’re able to find, we’ll see this well over $1 billion,” Chvotkin said.
The Defense Department, according to the report, spends more on AI than any other agency, followed by NASA and HHS.
But the list of AI programs in government may soon expand. The report lists the departments of Agriculture and Veterans Affairs, as well as the National Institute of Standards and Technology and the Patent and Trademark Office, as agencies that are “exploring many other potential applications for AI.”
USDA and VA have specifically looked at ways to use AI to improve their service to customers. USDA has considered using AI-powered chatbots to assist its call center workforce, while the VA is exploring AI to connect veterans with the agency services they’re seeking.
NIST, according to the report, has looked at using AI to help with its R&D. USPTO wants to use AI to detect whether new patent applications overlap with any of the more than 10 million patents the agency has granted.
Building a data-centric culture for AI adoption
But for all the excitement around AI, the report points to four challenges agencies and the administration will need to overcome for more widespread adoption.
First, agencies need to develop a business case for AI, rather than a solution in search of a problem.
“We’d like to at least make sure, from industry, that the government gets beyond the belief that AI is magic. It’s not — it’s something that’s evidence-based that uses data,” Delmolino said.
The second and third challenges focus on building a data analytics culture at agencies and developing AI competency within the federal workforce.
“If you’re not used to working in these statistical areas, then all of a sudden putting a spotlight on your processes can be a little unnerving,” Chvotkin said. “So you’ve got to have people with the competency around AI … and then a culture that supports the analytics associated with it.”
David Berteau, president and CEO of the Professional Services Council, noted that many agencies also struggle with data ownership and granting access to data elsewhere in their agency or to other agencies.
“My experience in the government is you spend way more time fighting over whose data is correct and whose you’re going to use to make decisions, than you are actually addressing the real issue and the decisions,” Berteau said.
But under the Foundations for Evidence-Based Policymaking Act, which President Donald Trump signed into law in January, agencies must appoint chief data officers and chief evaluation officers before the end of July.
Delmolino said statistical agencies and agencies that already have chief data officers remain a step ahead of their colleagues in building a data foundation suitable for AI applications.
“If you’re thinking of having AI as an enabler across your agency, you need a CDO to understand what data might be relevant and shareable,” he said.
While the Evidence Act will soon require agencies to appoint CDOs, the Office of Management and Budget has already signaled that its upcoming guidance will allow agencies to figure out for themselves where CDOs will fit into their org chart.
Berteau said it’ll take more than just agency leadership to build a data-centric culture to power AI.
“It’s more important to have the function and the responsibility, and it’s really more than just that person. It’s got to be the leaders of all the programs that have to recognize it. That’s part of their job and part of their value as well,” he said.
Open questions on AI ethics framework
As a final hurdle, the report also calls for a unified ethics framework for AI that unpacks difficult questions about AI transparency and accountability.
Lynne Parker, the assistant director for artificial intelligence at the White House’s Office of Science and Technology Policy, said an updated national AI R&D strategy, expected to be released later this spring, would address some of these issues.
While Google drew attention last year for its decision not to renew its contract for DoD’s Project Maven, which uses AI to analyze drone footage, the military has also found success with using AI tools for predictive maintenance on Bradley Fighting Vehicles and aircraft.
But civilian agencies have relied on AI — especially robotic process automation — to crunch numbers within their procurement operations.
GSA’s Federal Acquisition Service, for example, has used RPA to save its employees from having to waste time on administrative “cutting and pasting” tasks. And HHS, through its BuySmarter initiative, has used AI to scrub spending data to consolidate contracting across some commonly purchased items — like medical gloves and pipettes — and get them for the lowest price.
“I think those are the areas that there is a high degree of comfort, and don’t come anywhere near any questions about ethical use, and I think that’s where agencies feel very comfortable,” Chvotkin said.
“It’s only in a limited number of cases where there’s real questions being raised about ethics in its broadest context.”
The Trump administration’s fiscal 2020 budget request largely spared AI funding, but proposed slashing overall funding for federal R&D. However, the drawn-out nature of the congressional budget process may complicate agencies’ efforts to acquire AI tools.
To reach the fiscal 2019 enacted levels, some agencies spent two years or longer putting together their budget requests and requirements. But given the rapid development of AI in recent years, the tools agencies request may be outdated by the time they receive the funding to purchase them.
“The world of AI is just an example where the evolution of the available technology is at a far faster rate than the cycle time of getting dollars in the budget … that makes it doubly important for a process inside the government that focuses on results rather than defining what you want to buy two years before it exists,” Berteau said.