"It's essentially an opportunity to to practice skills that they may not have many chances to use on a regular basis," Dr. David Riggs said.
Suicide is a persistent problem for the armed forces and among veterans. Prevention attracts some of the best minds in medicine and academia. Researchers at the Uniformed Services University of the Health Sciences (USU) have been at the forefront, developing an online game designed to help clinicians who deal with service members at risk of suicide. Joining the Federal Drive with Tom Temin to explain is David Riggs, professor and director of the Center for Deployment Psychology at USU.
Interview transcript:
Tom Temin Dr. Riggs, good to have you with us.
David Riggs Thank you.
Tom Temin And some folks there at the Deployment Psychology Center, what is it they have built here that helps in the suicide battle?
David Riggs So one of the struggles that we’ve found in training and educating providers to care for folks who are considering death by suicide is that we can present them with the information and show them the skills, but they don’t have many opportunities to practice outside of the office, where they’re dealing with somebody who at the time might be suicidal. And so what we’ve developed is an online tool using gamification technology, on a platform called Second Life, to allow them to interact with avatars and practice the skills that we’ve taught them in the workshops. So it’s essentially an opportunity to practice skills that they may not have many chances to use on a regular basis. Although suicide is a major problem, the average clinician doesn’t see more than a couple of individuals who are at risk in a year. And so this gives them a chance to practice those skills.
Tom Temin So in a sense, you’ve built an environment in which they can interact with synthetic patients.
David Riggs Yeah, they’re not set up as patients, exactly. We’ve tried to set them up in a way that allows the clinician to practice a set of skills that can then be applied in a clinical setting with patients. I make that distinction because if a clinician goes to the site, they’re not going to find themselves sitting in an office talking one on one with somebody sitting in a chair. We’ve, in fact, set it up kind of like a national park, so there are outdoor scenes. As an example, in identifying factors that might place somebody at risk for suicide, we have them walking a trail through the woods, identifying risk factors along the way, interacting with this avatar, this synthetic patient.
Tom Temin Well, let me ask this. I imagine that in actual situations, the people dealing with patients, psychiatrists, psychologists, other types of clinicians, might deal with two basic types. One, someone who is actively or, say, verbally suicidal, and then they’re brought in for help. Or someone who has whatever other types of psychological symptoms that would prompt them to seek psychological help, but maybe they don’t know that they’re suicidal or could be at risk of becoming suicidal. Those are two very different situations. Can this help with that distinction?
David Riggs It can, in that the set of skills that we’re teaching more broadly, and that are illustrated and practiced with this tool, run the gamut. They start with, as I mentioned earlier, risk evaluation, where you might have somebody who’s not coming in saying, yes, I’m feeling suicidal, but there are a number of factors in their life that point to at least a risk. Obviously, nothing in psychology is guaranteed, for good or for bad, so we tend to look at probabilities. And so we start with that. But then we also include skills like lethal means counseling, where we talk to folks about aspects of their life that may be made safer, for example, securing firearms or medications that might be a potential for overdose, getting them secured so that they don’t have ready access to them. Those sorts of interventions would be more applicable to somebody where either we’ve identified that risk is there or they’ve come in expressing that risk. So we try and run the whole gamut from an assessment all the way through interventions that might not only help reduce the risk of suicide at the moment, but interventions that might reduce the risk of suicide in the future.
Tom Temin We’re speaking with Dr. David Riggs. He’s director of the Center for Deployment Psychology at the Uniformed Services University of the Health Sciences. And let’s talk about maybe the programmatic aspects of this. Is it available online throughout the defense medical system and also to Veterans Affairs? Who can access this?
David Riggs So the SLIPS site launches later this evening. If folks are interested, they should go to our website, which is deploymentpsych.org, and there’ll be a link to the site where the tool is located. It’s available to anybody, whether they are in the Department of Defense, the Department of Veterans Affairs or civilians out in the community. Because one of the things we know is that many veterans and service members, and certainly their family members, are seeking care in the community, not through a DOD or VA provider. And so we want to make sure that it’s available as broadly as possible. That’s true for many of the center’s training workshops; we make them available not only to folks inside government service, but also civilians who may be caring for our service members and families. So this website, this tool, will be available to the general public. We do say it’s directed to clinicians. It’s not designed for somebody who themselves is feeling at risk. It’s not designed even for family members of those who are feeling at risk. I just want to be clear that it’s designed for the practitioners to learn the skills.
Tom Temin Would this be useful for, say, those who volunteer to answer the VA’s suicide line or the 988 line?
David Riggs I think that the basic information that’s provided will be useful for them. In an ideal world, they would not be the ones applying these skills. Now, we don’t live in an ideal world, and they may find themselves in a position where they have to do that, but that’s not what it’s designed for.
Tom Temin Got it. And this program, it’s called SLIP, I believe?
David Riggs SLIPS. The acronym comes from the platform it’s built on, called Second Life. So it’s Second Life Island for Preventing Suicide: SLIPS.
Tom Temin And it’s award winning already even before it deploys.
David Riggs Yes. We designed it originally as part of a research study where we were looking at the impact of providing this in addition to our workshops. I will say that the research study struggled to get participants, so we can’t draw any conclusions from what we found, but people did find it useful when they engaged with it. As a result of that, we were able to share it at a professional conference, and it won an award. Yes, we’re proud of it.
Tom Temin And as a programming matter, is there some artificial intelligence in this? That is to say, do the users interact with response-generating avatars that maybe learn?
David Riggs So, yes, they’re response-generating avatars. But no, there’s no artificial intelligence; the programming was done before the advent of a lot of the large language models, so we were not able to build those in as part of it. The responses that the avatars generate are rigid; they’re prerecorded, basically. It does offer branching logic in some places, where depending on what the clinician enters as their response, you’ll get different responses from the avatar, but it’s not as flexible as something that was AI-generated.
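[Editor’s note: Riggs describes prerecorded avatar responses driven by branching logic. As a rough illustration only, a dialogue tree of that kind can be sketched in a few lines of Python. The node names, avatar lines and clinician choices below are hypothetical placeholders, not material from SLIPS itself, which runs on the Second Life platform.]

```python
# Minimal, hypothetical sketch of branching dialogue logic: each node holds a
# prerecorded avatar line and maps the clinician's menu choice to the next
# node. All node names and lines here are illustrative, not from SLIPS.
DIALOGUE = {
    "start": {
        "avatar": "I've been having a rough couple of weeks.",
        "choices": {
            "Ask about sleep and mood": "assess_mood",
            "Ask directly about thoughts of suicide": "assess_risk",
        },
    },
    "assess_mood": {
        "avatar": "I'm barely sleeping, and nothing feels worth doing.",
        "choices": {"Ask directly about thoughts of suicide": "assess_risk"},
    },
    "assess_risk": {
        "avatar": "Sometimes I wonder if people would be better off without me.",
        "choices": {"Discuss securing firearms and medications": "lethal_means"},
    },
    "lethal_means": {
        "avatar": "My brother could hold onto my pistol for a while, I suppose.",
        "choices": {},  # end of this branch
    },
}

def run(node_key: str = "start") -> None:
    """Walk the tree, printing the prerecorded avatar line at each node and
    branching on the clinician's numbered choice."""
    while True:
        node = DIALOGUE[node_key]
        print(f"Avatar: {node['avatar']}")
        if not node["choices"]:
            break
        options = list(node["choices"])
        for i, text in enumerate(options, 1):
            print(f"  {i}. {text}")
        pick = int(input("Clinician choice: ")) - 1
        node_key = node["choices"][options[pick]]

if __name__ == "__main__":
    run()
```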
Tom Temin And I imagine an AI-generated version would be a two-edged sword. It could improve over time and get better trained with more data. But on the other hand, those things can go way askew and maybe do more harm than good.
David Riggs They can. We have actually had some pretty interesting conversations in the group about the possibilities of incorporating AI into this tool or similar tools. And one of the questions that we always bump up against is, how do you put the guardrails in so that it doesn’t spin off in a direction you don’t want it to take? There are ways that people are looking at to do that. I’m far from an expert in that area, but I think it is probably something that we’ll be looking at incorporating in the future into a number of the tools that we develop.
Tom Temin Dr. David Riggs is director of the Center for Deployment Psychology at the Uniformed Services University of the Health Sciences. That’s in Bethesda, Maryland. Thanks so much for joining me.
David Riggs Well, thanks for talking to me and thanks for the interest in the program.
Tom Temin And we’ll post this interview along with a link to more information about the SLIPS game at federalnewsnetworks.com/federaldrive. Subscribe to the Federal Drive wherever you get your podcasts.