VA found a fast solution to its growing call center wait-time problem

VA's contact centers turned to a chatbot after experiencing a surge in calls starting in March that led to long wait times and frustrated veterans.

When the call centers at the Department of Veterans Affairs experienced a surge at the beginning of the pandemic, the agency didn’t respond by trying to hire and train more people as wait times for answers grew.

Instead, VA took a page from the private sector and implemented a chatbot in a matter of weeks.

Dr. Kaeli Yuen, a Presidential Innovation Fellow in the Office of the Chief Technology Officer at VA, said the contact centers began experiencing a surge in calls in March, which left veterans frustrated over how long they had to wait to ask a question.

At the same time, VA, like most companies and agencies, started to worry about employees being in the office and possibly contracting the coronavirus.

“This was the problem we were trying to solve in standing up the VA coronavirus chatbot. The purpose of the chatbot is to more quickly serve veterans with information about how the coronavirus is impacting their VA benefits and services,” Yuen said on Ask the CIO. “It launched toward the end of April.”

She said the bot helped answer some of the most basic questions, such as whether a VA facility was open, what the guidelines were for going to the hospital, and how to change an in-person appointment to a telehealth appointment.

“So far there have been pretty promising results in terms of engagement. [As of early July] we had over 53,000 user sessions on the chatbot,” she said.

Chris Murphy, the CEO for North America at ThoughtWorks, said a chatbot is software or a script that interacts with users in a “human-like” way to help solve problems.

He said ThoughtWorks and VA chose one of the many chatbots already in use across multiple industries.

“The real question for us was which one would best serve the objectives and the needs we had to serve the veterans, and in regards to the urgency of the timeframe,” Murphy said. “We were able to quickly settle on a particular chatbot framework from Microsoft. It’s really solid, met our needs and Microsoft already had an established relationship with VA so that helped as well.”
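
For readers unfamiliar with what that looks like in practice, a minimal sketch of a Bot Framework-style message handler, written against the SDK for JavaScript (the “botbuilder” package) with invented greeting and reply text, might look roughly like this. It illustrates the general pattern, not VA’s actual bot.

```typescript
// Illustrative sketch only: a Bot Framework-style FAQ bot using the "botbuilder" SDK.
// The class name, greeting and replies are assumptions, not VA's implementation.
import { ActivityHandler, TurnContext } from 'botbuilder';

class CoronavirusFaqBot extends ActivityHandler {
  constructor() {
    super();

    // Greet users when they join the conversation.
    this.onMembersAdded(async (context: TurnContext, next) => {
      await context.sendActivity(
        'I can answer common questions about how the coronavirus affects VA benefits and services.'
      );
      await next();
    });

    // Handle each incoming message; a production bot would route this to dialogs or a Q&A store.
    this.onMessage(async (context: TurnContext, next) => {
      const question = context.activity.text ?? '';
      await context.sendActivity(`You asked: "${question}". One moment while I look that up.`);
      await next();
    });
  }
}

export const bot = new CoronavirusFaqBot();
```

In a deployed bot, a handler like this sits behind an adapter and an HTTP endpoint; the low-code tooling Murphy describes later largely hides that plumbing.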

Murphy said it took about four weeks from the time VA said go to the time ThoughtWorks put the chatbot in production.

Yuen said during that development period, VA spent a lot of time testing the chatbot out with veterans to ensure it was meeting their needs.

“Before the launch of the chatbot we did a number of user testing sessions with real veterans to get their attitude toward a chatbot, toward interacting with something called a chatbot, toward getting their questions answered in this way,” she said. “We actually received positive feedback about it from the sessions we conducted so that was a bit of an encouraging sign that this would actually be a viable way to get information to veterans.”

Urgency and importance drove solution

After the launch, Yuen said VA continues to collect data by asking a number of questions after sessions, including whether the user wants to talk to a person about their specific needs.

“The percentage of users who select that is fairly low. To date, about 6.4% of user sessions ask to talk to someone about their specific needs,” she said. “I was pleasantly surprised with how smoothly everything came together for the chatbot. A lot of that was owing to the urgency and importance of the situation.”
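
As a rough illustration of the metric Yuen describes, the escalation rate is simply the share of sessions in which the user asked for a person. A sketch, assuming a hypothetical session log format:

```typescript
// Hypothetical session records -- the field names are assumptions for illustration.
interface ChatSession {
  id: string;
  askedForHuman: boolean; // user selected "talk to a person about my specific needs"
}

// Percentage of sessions that escalated to a human.
function escalationRate(sessions: ChatSession[]): number {
  if (sessions.length === 0) return 0;
  const escalated = sessions.filter((s) => s.askedForHuman).length;
  return (escalated / sessions.length) * 100;
}

// For example, roughly 3,400 escalations out of 53,000 sessions works out to about 6.4%.
```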

Murphy added that VA’s approach was not to focus on solving the problem through technology, but on fixing the short-term challenge of serving veterans.

“In any of these types of environments, this is a very sensitive bot in that it’s serving information that people will be making decisions on. So as we go through this, we have to ensure we are getting something out there that is getting the right import and getting the right approvals and still meeting the timeframe,” he said. “From a purely technical perspective, a lot of software development is very automated. This type of low-code framework didn’t necessarily support all those approaches so we had to do it quickly and we had to do a lot more manual testing to ensure that everything was working well and we were putting the best solution out there.”

Yuen said VA is re-evaluating the chatbot every three-to-six months to make sure it’s meeting the agency’s needs.

She said one big lesson learned is to build off existing technology and not try to develop something from scratch.

“A simple technology sometimes is the best choice to meet users’ needs at the time,” Yuen said. “It doesn’t have to be the fanciest, most advanced natural language understanding type of tool. I think that is a good lesson learned here because that often is something that is attractive.”
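
To make that point concrete, the kind of “simple technology” Yuen describes can be as basic as keyword matching over a handful of intents, rather than a full natural language understanding model. A sketch, with invented intents and keywords:

```typescript
// Illustrative keyword-based intent routing -- the intents and keywords here are invented.
const INTENT_KEYWORDS: Record<string, string[]> = {
  facility_status: ['open', 'closed', 'hours'],
  telehealth: ['telehealth', 'video', 'virtual'],
  talk_to_person: ['agent', 'person', 'representative'],
};

// Return the first intent whose keywords appear in the user's message, if any.
function matchIntent(utterance: string): string | undefined {
  const text = utterance.toLowerCase();
  for (const [intent, keywords] of Object.entries(INTENT_KEYWORDS)) {
    if (keywords.some((k) => text.includes(k))) {
      return intent;
    }
  }
  return undefined; // fall back to a default help message
}

console.log(matchIntent('Is my local VA facility open today?')); // -> "facility_status"
```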
