VA puts AI use cases into ‘operational phase’ to meet its health care mission

The Department of Veterans Affairs (VA) sees artificial intelligence tools as a way to provide a higher level of care to veterans, while reducing administrative tasks and burnout among its employees.

VA has identified over 100 AI use cases so far. VA Chief Technology Officer Charles Worthington told members of the House VA Committee last week that 40 of them are in an “operational phase,” and being put to use in the field.

“The department believes that AI represents a generational shift in how our computer systems will work, and what they will be capable of. If used well, AI has the potential to empower VA employees to provide better health care, faster benefits decisions, and more secure systems,” Worthington told the health subcommittee on Feb. 15.

AI use cases put into practice include an AI model called REACH-VET, designed to predict the veterans who are most at risk of suicide. Worthington said this information impacts what VA providers prescribe when seeing at-risk patients, and how they follow up with them.

Worthington said the VA also used natural language processing (NLP) as “customer experience listening,” to comb through feedback and comments submitted by patients.

“Occasionally, those comments will indicate that this veteran might be at risk or need help. Maybe they are indicating that they are having homeless problems,” Worthington said. “So this NLP model can flag comments that might be particularly concerning for follow-up by a professional that can read the comment themselves and decide if some other action is warranted.”

Assistant Under Secretary for Health Carolyn Clancy said the VA is developing AI predictive tools to identify which veterans are likely to do well after initial treatment for prostate cancer, and which are likely to need more frequent monitoring.

Clancy said the VA is particularly focused on “augmented intelligence” use cases of AI that improve the productivity and effectiveness of VA clinicians.

“In other words, the human in the loop is quite important,” Clancy said. She added that the VA is trying “to balance benefits while being very, very attentive to risks.”

Worthington said the VA’s Trustworthy AI Framework, adopted in July 2023, gave the VA “a critical head start on developing policies to govern our use of AI in production.”

Worthington said VA expects to update its strategy in the coming year to reflect the pace of this emerging technology.

“Over the past several years, VA has created the foundational guardrails it needs when considering AI tools that have significant potential to improve veteran health care and benefits,” Worthington said.

Clancy told lawmakers that the VA is in the “middle of the pack, or possibly even further up than that,” when it comes to AI adoption in U.S. health care.

“No system yet has put out in public or has figured out how to take all these steps in a very, very careful way — to balance benefits while being very, very attentive to risks and so forth,” she said. “I think there’s a fair amount of caution all around. But I would expect, by virtue of our size that, in many ways, we may actually be in the lead, which would be a good place to be.”

VA in October launched an AI Tech Sprint, which focuses on how VA can address provider burnout by providing AI dictation tools to take notes during medical appointments. It also looks at how AI can extract information from paper medical records.

“By investing in these projects, VA aims to learn how AI technologies could assist VA clinical staff in delivering better health care with less clerical work, enabling more meaningful interactions between clinicians and veterans,” Worthington said.

The VA launched its AI Tech Sprint as part of the AI executive order President Joe Biden signed last October, which calls on federal agencies to step up their use of this emerging technology.

Subcommittee Chairwoman Mariannette Miller-Meeks (R-Iowa) said AI creates possibilities to improve diagnostic accuracy, predict and mitigate patient risk and identify interventions earlier, as well as to reduce the administrative burden on employees.

“While AI holds great promise, the reality is that it is a new, developing technology, and we are still figuring out what is possible and practical and ethical,” Miller-Meeks said.

Subcommittee Ranking Member Julia Brownley (D-Calif.) said the VA, as the largest health care provider in the country, will serve as a model for the implementation of AI at other health care systems, “which makes it all the more important that we ensure VA and other AI users establish best practices, procedures, and guard rails early on in the implementation.”

“Even as we find productive ways for AI to be implemented, we must take measures to ensure VA is continuing to robustly hire, retain — and I will emphasize retain — and protect its clinical workforce,” Brownley said.

David Newman-Toker, director of the Armstrong Center for Diagnostic Excellence at Johns Hopkins University, said VA’s health care data environment “is better suited than most” to delivering high-quality data that might train AI systems.

“The rate-limiting step for developing and implementing AI systems in health care is no longer the technology,” Newman-Toker said. “It is the sources of data on which the technology must be trained.”

Newman-Toker said the VA’s data advantage includes a “commitment to health care quality and safety,” a unified health record offering greater potential for standardizing data capture, and a patient population that tends to stay within the VA system.

“These attributes give the VA the opportunity to take a leading role in building high-quality AI systems,” he said.

Meanwhile, Technology Modernization Subcommittee Chairman Matt Rosendale (R-Mont.) is urging the VA to notify veterans when their health or personal information is fed into an AI model.

“The problem that I see is that you are literally putting the cart before the horse,” Rosendale said. “You are utilizing AI and you are not disclosing it to the veterans. You are not giving them a choice. And that is dangerous.”

Worthington said that at the VA, “protecting veterans’ data is pretty much job one, especially in the Office of Information and Technology.”

“I think we are lucky that we have a lot of existing policies around how veterans’ data can be used and how it can’t be used,” he said.

Copyright © 2024 Federal News Network. All rights reserved.