Users are shown inoculating messages via interstitial, pop-up, &c., that either preemptively present scientifically grounded facts about a topic or expose common misleading argumentation techniques.
The messages can be set to appear whenever users search for terms or hashtags related to artificially bifurcated, or polarizing, issues (e.g., global warming, autism, vaccines, &c.) or for anticipated narratives (e.g., election fraud).
Users who are shown the "prebunking" message are less likely to believe, interact with, or spread misinformation on the topic.
Because it does not require making judgements about other content, this intervention makes it possible to infuse scientifically grounded facts into the discussion before false information spreads, and it exposes common argumentation techniques (independent of any specific piece of misinformation), thereby making social media audiences more resilient to a broad spectrum of misinformation.
In a review published in the European Review of Social Psychology, researchers explored proactive measures to prevent misinformation from finding traction in the first place, using the psychological theory of “inoculation” as a basis.
Inoculation rests on the idea that if people are forewarned that they might be misinformed and are exposed to weakened examples of the ways in which they might be misled, they will become more immune to misinformation.
The researchers then reviewed a number of techniques that can boost people’s resilience to misinformation. The interventions ranged from general warnings to more specific instruction about misleading (i.e., rhetorical) techniques.
Their research shows that, based on the available evidence, inoculation appears to be a promising avenue for protecting people from misinformation and “fake news”.
There has been increasing concern about the growing infusion of misinformation, or “fake news”, into public discourse and politics in many Western democracies. Inoculation appears to be a promising avenue for protecting people from its damaging effects.
Misinformation can undermine a well-functioning democracy. It can also inhibit the passage of important policy initiatives, such as measures to counter human-caused global warming.
It helps all of us to have user interfaces that preemptively inoculate users against the ways that science can be deliberately misconstrued.
Stephan Lewandowsky, Sander van der Linden
February 22, 2021
John Cook, Stephan Lewandowsky, Ullrich K. H. Ecker
May 5, 2017
Do you think this intervention could have more benefits, unacknowledged drawbacks, or other inaccuracies that we've neglected to mention here?
We always welcome more evidence and rigorous research to back up, debunk, or augment what we know.
If you want to be a part of that effort, we'd love to have your help! Email us.