Impressive advances in artificial intelligence have already affected the healthcare industry and will undoubtedly continue to revolutionize how patient care is delivered throughout the world. One specific type of AI — Emotion AI — stands out as a particularly promising addition to current mental health and suicide prevention efforts.
Current applications of Emotion AI suggest that this technology might be especially valuable to efforts focused on supporting the mental health needs of veterans and their families. Not only might Emotion AI assist in the delivery of the very best physical healthcare to these deserving veterans and their families, but it might also help us better understand and address the complex emotional challenges of this unique community.
Emotion AI: Innovation consistent with VA’s suicide prevention priority
Suicide prevention has long been a primary area of focus for the Department of Veterans Affairs, which has set a goal of reducing veteran suicide by 20% by 2025. Thankfully, it appears that VA’s many efforts are reaching those in need. The 2022 National Veteran Suicide Prevention Report found that 6,146 veterans died by suicide in 2020, a reduction of 343 from 2019.
Consistent with that priority, the department recently launched an initiative covering emergency care at VA and non-VA facilities for any veteran experiencing an acute suicidal crisis. In the spring of 2022, VA also launched the Mission Daybreak $20M Grand Challenge, a competition that encouraged innovators to vie for prize money as well as the opportunity to further solutions designed to reach at-risk veterans both inside and outside the VA system of care.
Given VA’s leadership in the healthcare sector, and the department’s passion for driving innovation that improves care for those who’ve served, the emergence of Emotion AI technology could provide an extraordinary opportunity for the VA to lead the way once again by developing solutions that better serve our veterans as well as the broader society.
Emotion AI: A definition and its benefits
Emotion AI is a specialized form of artificial intelligence designed to detect, interpret and provide real-time feedback about human emotions. The technology typically analyzes signals such as voice characteristics or facial expressions.
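To make the voice-analysis side of this concrete, the sketch below computes two simple acoustic features often fed into voice-based emotion models: short-time energy (loudness) and zero-crossing rate (a rough correlate of vocal pitch and tension). This is an illustrative simplification, not any vendor’s actual pipeline; real Emotion AI systems extract far richer feature sets.

```python
import math

def acoustic_features(samples, frame_rate=16000):
    """Compute two basic acoustic features from an audio snippet:
    mean energy and zero-crossing rate (crossings per second)."""
    n = len(samples)
    energy = sum(s * s for s in samples) / n
    zero_crossings = sum(
        1 for a, b in zip(samples, samples[1:]) if (a < 0) != (b < 0)
    )
    zcr = zero_crossings * frame_rate / n  # crossings per second
    return {"energy": energy, "zero_crossing_rate": zcr}

# A synthetic 440 Hz tone stands in for one second of recorded speech.
tone = [math.sin(2 * math.pi * 440 * t / 16000) for t in range(16000)]
features = acoustic_features(tone)
```

An emotion model would consume many such features per frame; the point here is only that the raw input is the measurable shape of the voice, not the words themselves.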
Currently used in the commercial sector by human resource departments to streamline hiring, Emotion AI is also utilized by insurance companies to detect fraud, and by governments to counteract terrorist activity. In addition, research in the healthcare sector is exploring how voice analysis can detect diseases, including respiratory, cardiac and certain neuromuscular diseases.
Finally, Emotion AI has also proven beneficial for customer service call centers, where insights into a caller’s emotional state can help guide customer service representatives – leading to greater customer satisfaction as well as increased sales.
Emotion AI technology could similarly help improve the overall customer service experience of veterans and family members who contact the VA. During Secretary Bob McDonald’s tenure, the department instituted the “I CARE” initiative, focused on the core values of Integrity, Commitment, Advocacy, Respect and Excellence to ensure the delivery of high-quality veteran care. Real-time feedback from Emotion AI during customer service calls could help identify gaps in customer service, which in itself would improve overall care.
In addition, Emotion AI technology has the potential to directly support specific VA suicide prevention efforts. One specific area of opportunity could be in support of the VA’s Veterans Crisis Line (VCL).
Launched in 2007, the VCL now employs 500 responders nationwide who have answered more than 6.8 million calls from veterans, active-duty service members and their families. VCL responders do incredible work every hour of every day, providing compassion and critical resources and saving lives.
While providing empathy and assistance to each caller, responders must also assess risk factors and recognize indicators that a veteran or service member may be considering self-harm. As if that weren’t challenging enough, responders must also manually scan available records and prior assessments to build a comprehensive picture of the caller they are trying to help.
Emotion AI technology developed specifically to support this complex and critical work, in a careful, thoughtful and evidence-informed way, could significantly lessen the burden on VCL responders, enhancing their efforts with additional insights and reducing their stress by validating their assessments and intervention plans.
Once properly developed, trained and implemented, Emotion AI could allow VCL responders to focus, with fewer competing demands, on their primary task: being present for a veteran or service member in distress. Because the technology would be at work behind the scenes gathering and analyzing data in support of the responder’s assessment, responders could more easily integrate information from multiple sources about the caller.
Finally, in addition to providing critical information to VCL responders about individual callers, Emotion AI could also be used in the initial and ongoing training of the responders themselves. By providing feedback about a responder’s tone, inflection and word choice during interactions with veterans, responders would be better able to adjust, modulate or shift appropriately depending on the emotional style, state or needs of a specific caller.
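As a rough illustration of the training use case, the sketch below turns a call transcript into simple responder-facing feedback on pacing and word choice. The speaking-rate threshold and jargon list are entirely hypothetical placeholders, not drawn from any VCL training material.

```python
# Hypothetical word list; a real system would use vetted clinical guidance.
CLINICAL_JARGON = {"ideation", "comorbid", "psychotropic"}

def responder_feedback(transcript, duration_seconds):
    """Return simple coaching notes based on speaking rate and word choice."""
    words = transcript.lower().split()
    words_per_minute = len(words) / (duration_seconds / 60)
    jargon_used = sorted(set(words) & CLINICAL_JARGON)
    feedback = []
    if words_per_minute > 170:  # illustrative threshold, not an evidence-based cutoff
        feedback.append("Consider slowing down for a caller in distress.")
    if jargon_used:
        feedback.append(
            "Plain-language alternatives may help for: " + ", ".join(jargon_used)
        )
    return feedback
```

A production tool would also analyze tone and inflection from the audio itself; text-level checks like these are just the most transparent starting point.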
Generative AI integration into Emotion AI applications
Generative AI, developed through large language models (LLMs) such as OpenAI’s GPT-4, can create new content, a capability that applies across a range of healthcare applications. With ChatGPT reaching more than 100 million users since its launch in the fall of 2022, the technology has clear potential to be integrated into Emotion AI solutions.
However, some experts are calling for a pause in Generative AI development until the underlying processes are both better understood and aligned to human advancement. As such, it is crucial to utilize this new technological breakthrough ethically and responsibly.
To address the potential unintended consequences of both Generative AI and Emotion AI, reinforcement learning from human feedback (RLHF) should be employed to fine-tune models for accurate assessment of emotions. Keeping a “human in the loop” at every stage is essential for correcting biases and inaccuracies in training data, such as cultural differences in voice, language and tone.
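The shape of that human-in-the-loop correction can be sketched in a few lines. The toy model below scores a call from hand-picked features, and each reviewer correction nudges the model toward the human judgment. This is only the feedback-loop pattern in miniature; actual RLHF over a large language model involves reward modeling and policy optimization at vastly greater scale, and the feature names here are invented for illustration.

```python
LEARNING_RATE = 0.1

def predict(weights, features):
    """Toy distress score: a weighted sum of call features."""
    return sum(weights.get(name, 0.0) * value for name, value in features.items())

def apply_human_feedback(weights, features, predicted, human_label):
    """Nudge feature weights toward the human reviewer's judgment."""
    error = human_label - predicted
    for name, value in features.items():
        weights[name] = weights.get(name, 0.0) + LEARNING_RATE * error * value
    return weights

weights = {}
# Hypothetical reviewed call; the reviewer labels it high distress (1.0).
features = {"low_pitch_variance": 1.0, "long_pauses": 1.0}
for _ in range(20):  # repeated review rounds converge toward the human label
    p = predict(weights, features)
    weights = apply_human_feedback(weights, features, p, 1.0)
```

The design point is that the human label, not the model’s own output, is the ground truth at every iteration, which is exactly what keeps the loop corrective rather than self-reinforcing.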
Developers must intentionally minimize biases to ensure fairness and equity for all veterans, regardless of race, disability, gender, age or other factors. Rigorous research and robust focus group testing are necessary to achieve this. Furthermore, Emotion and Generative AI applications may necessitate access to a veteran’s personal information, emphasizing the crucial need for data privacy and confidentiality to prevent any misuse or unintended repercussions.
Finally, relying too heavily on AI can erode healthcare providers’ own skills. Diagnostic accuracy may suffer if healthcare and mental health professionals become dependent on an Emotion AI application’s output without understanding the reasoning behind its assessments.
To harness the power of AI ethically and responsibly, it is essential to be aware of these risks and take preventive steps to address these areas of concern. The ultimate goal of responsible AI, whether it be Emotion AI, Generative AI or other AI applications, should be to optimize the care and support of the veteran community and their families by aligning human and AI technology interactions and implementations.
As AI enters the mainstream across sectors in our society, it presents numerous opportunities to enhance healthcare outcomes, particularly for our veteran community.
Emotion AI and Generative AI are innovative tools that can help in the delivery of mental healthcare and support as well as in suicide prevention, enabling healthcare providers to devote their time to implementing creative solutions and personalized care. However, it’s crucial to approach these technologies thoughtfully and responsibly to assess the risks and mitigate unintended consequences. In doing so, we will safely harness the power of AI to bolster our collective efforts to provide comprehensive care for those who have served our nation.
Dr. Barbara Van Dahlen is former executive director of the PREVENTS Presidential Executive Order Task Force, and strategic advisor to DSS, Inc. Michele Burst is director of strategic innovations, analytics at DSS, Inc. Rob Gordon is chief growth officer at SBG Technology Solutions.