What can be done in the public-health sector to combat misinformation?

You’ve most likely seen it over the past couple of years: your old friend from high school sharing a link to a secret new way to prevent COVID-19 that turns out to be an ad for modern-day snake oil. Yes, health misinformation exploded over the course of the pandemic, overwhelming the governments, public health authorities and social media platforms looking to combat it. A new study looked into how well health misinformation is researched by these institutions and the methods they used to fight it. One of the study’s authors is Stefanie Friedhoff, professor of the practice at Brown University’s School of Public Health. She told Federal News Network’s Eric White what they found on the Federal Drive with Tom Temin.

Interview Transcript: 

Stefanie Friedhoff So a while ago, we started looking at COVID-19 misinformation research to try to understand what lessons we could learn from all the interventions that were applied during the pandemic, and those that were actually studied, so we could see what worked and what didn’t. We wanted to create an evidence-based playbook for practitioners to understand how to deal with health misinformation. We found around 50 studies that researched this with real participants. We excluded things like studies that were modeling this, because we really wanted to see the impact on people in those types of situations. Misinformation remains a leading challenge, especially health misinformation. And we thought we should look at the research that’s available, especially studies that were done during the COVID-19 pandemic, when a lot of interventions were fielded and new things were tried, to understand what we can learn from them, how we can get better at responding to health misinformation, and how we know what works and what doesn’t.

Eric White Gotcha. So what types of intervention methods did you find that most public health authorities were using, and were any of them even effective?

Stefanie Friedhoff So to our surprise, we found that a lot of these papers used such different outcome measures and such different ways to look at the challenge that, by and large, we couldn’t really compare the evidence. There were 50 studies which used something like 47 different ways of looking at the pie. We looked at, for example, the misinformation that the studies put in front of people, and it was very different. Some studies looked at misinformation such as claims that gargling with salt water will prevent or cure COVID, and others shared full-blown conspiracy theories. So it was hard to compare that evidence. We did find some evidence for interventions that are called debunks: after an item has been put out that is clearly wrong, efforts to debunk that information in certain contexts will make people less likely to believe in the accuracy of that content. And then also accuracy prompts, or nudges, where people are asked to assess a piece of information and after that are shown different types of misinformation. The studies show that people are then less likely to believe in the accuracy of such content when they’ve been prompted previously to think critically about content.

Eric White Yeah. Is there a big push from a lot of health agencies in dealing with this problem? Because you don’t usually get into the medical field thinking, oh, this is how I have to convince people not to listen to things that are definitely not good for them.

Stefanie Friedhoff Well, one key outcome of our study was that only 18% of the papers we could find on this topic actually measured any public health related outcomes, such as intent to vaccinate, self-reported mask wearing or intent to pay for an unproven treatment. If we don’t study the impact on public health, then we also won’t know what works in public health. What is clear is that not enough people from the public health world, both experts and practitioners, are included in the design of these types of studies. Responding to health misinformation has become a major part of working in public health. People were not trained for it. It is really a crisis that exploded during the COVID-19 pandemic, and people had to learn on the fly. There have been reports of worker burnout because of this. There are a lot of efforts going on to try to increase capacity and to help people navigate these types of situations. It’s been particularly challenging when you are a local public health practitioner and you go to hearings and meetings, as you should. You want to be in community. And emotions are high around these types of issues. So there’s a large and growing need both to support our public health practitioners as they take on this additional challenge that is really hard, and for everybody who works in public health to play a role in cleaning up our information spaces.

Eric White Yeah. Can we discuss the big part of this, which is just the advent of social media? Back in the old days, all they really had to combat was ads in trademark publications and things like that. But now you’ve just got so many voices that are in the realm of public health. What effect did the social media platforms have on these studies that you looked at?

Stefanie Friedhoff The challenge with social media platforms is that many of them don’t share their data. So we need to understand that a lot of these studies were experiments with people, as opposed to watching what works and what doesn’t, or what is playing out on social media in real life, in real time. We often don’t have good enough data to answer those types of questions. It is very clear that the world has changed dramatically from when we had a few very curated information sources to this wide world where everybody has a voice. And in general, we all know that’s a good thing. The social media companies now have a responsibility, as this is common infrastructure. This is the information space as a public good that we share. And we’re really in the early stages of adapting to this technological change and trying to find ways to understand it, understand the harms and regulate it. As we know, there are a lot of conversations going on in a lot of countries about how to best do this. But social media is the new place where most people get their information. We now have a generation that doesn’t necessarily Google to find information; they search on TikTok and other platforms that they use. And those are important changes that we need to understand when we try to meet people’s information needs, when we try to get good information to where people are actually making sense of things in the world.

Eric White And that provides a nice segue into what is the main takeaway, which is that as the avenues for misinformation get more diverse, the ways to combat misinformation also have to get more diverse, and so do the studies that look into misinformation. So what do you mean by making the studies more diverse? What did you all have in mind with that?

Stefanie Friedhoff Well, our study also looked at what types of delivery mechanisms the different studies used to share the misinformation. Was it text only? Was it text and a picture? Was it audio? Was it video? And what you learn when you look at that is that we’re mostly studying text-based and maybe image-based misinformation, but we’re not studying video-based misinformation. Only 6% of all the interventions that we looked at used video formats at all. Given the rise in the prominence of video and the increasing amount of video-based misinformation that’s out there, there’s a real need to improve our ways to look at this. More broadly speaking, we really don’t have good science right now to understand how misinformation truly impacts people. You could look at this the same way we look at a novel disease when it first comes out: it’s there and it’s creating some impact, but you don’t know exactly for whom, and how, and in which ways. And that is where our research needs to become much more granular and much more elaborate. We need to invest in this type of research; we currently have a gross underinvestment in it. Doing this work will also help us overcome some of the politicization, because people are worried about censorship and who decides what is misinformation and what isn’t. By being able to better articulate what the impact is on people, we believe we can also overcome some of these current challenges that come from just not knowing exactly what is going on.

Eric White And that’s a really important part when you’re specifically talking about government-run health agencies, just because the leadership can change on the whim of an election or an election cycle. So yes, the politicization of health information is definitely key. Is there a way to put the toothpaste back in the tube with that one? Because it seems as if it’s becoming more segmented into what one political party believes and what another one believes, when it’s really just health information; it doesn’t matter what side of the aisle you’re on.

Stefanie Friedhoff Yeah, I do believe that government communicators and scientists all want to be humble in looking at this challenge. We have seen this in the pandemic over and over: when we present a sense of overcertainty, then we get into trouble. And good scientists know that evidence can change, especially on a novel subject. So we need to really distinguish established health information from things that are still in flux, where we don’t know enough. Most people don’t understand the difference between mature science, or mature evidence, and cutting-edge science where we’re just discovering things. It’s really important for everybody who’s communicating in this space to make that difference very clear. Some things we know for sure: we know that vaccines work and how, and we can explain those. And some things we’re not so sure about: we may have a new vaccine and we’re just testing out how it is working and in which populations. Those things need to be very clearly shared with the public to maintain trust. One thing we should be mindful of, both as researchers and as communicators, is that as long as we don’t have a strong evidence base that shows us how and when misinformation is impacting people and their health behaviors, we want to be mindful and not overreach in claiming such impact.

 
