Media release
Watching 5 short videos designed to inoculate viewers against online misinformation improved participants’ abilities to identify manipulation techniques in both a laboratory setting and in a field study, according to a new study involving nearly 30,000 participants. The analysis, which involved 7 high-powered preregistered studies, suggests that such videos could be run as public service ads before potentially harmful content online to reduce millions of people’s susceptibility to misinformation.

“Inoculation theory” suggests that educating people about the threat of impending misinformation may help them build psychological resistance against it – similar to how vaccinations can prepare the body to resist pathogens. However, it has been unclear whether such an approach could succeed on a large scale. To investigate, Jon Roozenbeek and colleagues created a series of short (~1.5-minute) inoculation videos covering 5 manipulation techniques commonly encountered in online misinformation: emotionally manipulative rhetoric designed to evoke strong emotions, incoherent or mutually exclusive arguments, false dichotomies or dilemmas, scapegoating, and attacks against a person rather than the position they maintain. To avoid the appearance of partisan bias, the videos were all nonpolitical, fictitious, and humorous.

The researchers ran 6 randomized controlled studies (including a replication study) in a lab setting to test the 5 videos with more than 6,000 participants. Each participant watched either an inoculation video or a neutral control video before rating 10 synthetic social media posts that were either manipulative or neutral. Next, Roozenbeek et al. ran 2 of the videos in an ad campaign on YouTube, where they gathered data from 22,632 people. Within 24 hours of viewing the ads, one-third of participants exposed to them were randomly tested on their ability to recognize the manipulation technique used in a headline that appeared in a test question on YouTube.
Altogether, the findings suggest that the inoculation videos improved people’s abilities to identify manipulation techniques and appeared to have a larger effect than other existing scalable interventions, such as short text segments that point out ways to spot false news.
Expert Reaction
These comments have been collated by the Science Media Centre to provide a variety of expert perspectives on this issue. Feel free to use these quotes in your stories. Views expressed are the personal opinions of the experts named. They do not represent the views of the SMC or any other organisation unless specifically stated.
Associate Professor Stephen Hill
In a series of seven studies, including one that used an ad campaign on YouTube, researchers showed that ‘psychological inoculation’ against common manipulation techniques (such as the use of logical fallacies and emotionally manipulative language) reduced people’s willingness to share untrustworthy social media content and increased their ability to detect it.
This research continues the excellent work carried out by this team, which has led to the creation of a number of useful resources for understanding and preventing the spread of harmful misinformation. It provides some hope that it may be possible to reverse what many people perceive to be the recent increase in the production and sharing of misinformation. As the 2021 report from Te Mana Whakaatu (the Classification Office) shows, New Zealand is in no way immune to these international trends.
What the research doesn’t tell us is whether improving people’s ability to detect the use of manipulative techniques will reduce the likelihood that their beliefs will be swayed by (or reinforced by) the untrustworthy content of the message. You’d hope that it would, but other research shows that people are often more critical of the quality of arguments in messages that contradict their existing beliefs than in those that align with them. The good news is that if inoculation reduces the sharing of misinformation, there will be less opportunity for other people to be swayed by it. As is often the case with preventive medicine, the challenge will be to persuade people to get inoculated.
Dr Jagadish Thaker, Senior Lecturer at the University of Auckland
We are facing several political and social crises due to the spread of online misinformation. It has impacted our ability to fight the COVID-19 pandemic and has increased social strife. So far, education campaigns about misinformation and social media giants’ efforts to tackle online misinformation have had a mixed impact. As a result, finding new ways to help people build resilience against misinformation and disinformation that is also easily scalable and cost-effective is essential.
Past research shows that it is challenging to change our minds once we have been exposed to misinformation. Instead, prebunking, or inoculating people against some of the standard techniques used to manipulate information before they encounter it, can be more effective in building public resilience against misinformation.
Just like a vaccine, inoculation theory argues that we can build resistance to misinformation through prior exposure to weak doses of misinformation. Think of it as building a neural memory, akin to muscle memory, to spot misinformation.
Using well-designed lab experiments and a real-world experiment on YouTube, researchers at the University of Cambridge and their colleagues found that participants could recognise commonly used manipulation techniques after exposure to prebunking videos. These prebunking videos first warned about an impending misinformation attack, informed the participants about the manipulation technique, and finally provided a funny 'microdose' of the misinformation technique.
They tested five commonly used misinformation techniques: the use of emotionally charged manipulative language to evoke a strong response, incoherence, false dichotomies, scapegoating, and attacking a person instead of discussing ideas.
Exposure to such inoculation videos also boosted confidence in spotting misinformation techniques. It helped people judge content better and be more careful about sharing it online. The inoculation was effective across the board, regardless of differences in age, gender, political beliefs, social media use, numeracy skills, and other factors, indicating that such a campaign is likely to work for a broad audience. This is an impressive finding.
Moreover, it cost the researchers as little as $0.05 per view on YouTube, indicating that running such campaigns can be a cost-effective policy and easily implementable at scale.
However, we need more research to check how long such misinformation-spotting antibodies last in our minds. Just like new variants of COVID-19 have made vaccines less effective, do new variants of misinformation or disinformation make the previous inoculation less effective?
All examples used in this research were non-political and fictitious. People may respond differently to misinformation about topics they hold close to their hearts, such as politics, religion, and 'our way of life'. It is a US-based study, and it is essential to test the findings in other countries, paying attention to the role of state and non-state actors fueling misinformation in society.
Government and local institutions could learn from and apply such communication campaigns to help the public stay alert to misinformation. Social media companies should share this responsibility equally in an era of hyper-individualized social media use. As the researchers rightly note, social media giants should share their data with researchers to help develop evidence-based policies on misinformation.