[Mockup: a phone screen showing this intervention with the message "You may encounter unverified claims on this topic." The icon, an ear on a shield, represents protecting your ears from rumors.]

Messages prebunking misinformation on controversial topics

Inoculates users against misinformation

Our confidence rating

Validated

How do our ratings work?



What It Is

Users are shown inoculating messages via interstitials, pop-ups, &c., that either:

  1. Explain the flawed argumentation technique used in common misinformation; or
  2. Highlight the scientific consensus on an issue, such as climate change

When To Use It

The messages can be set to appear whenever users search for terms or hashtags related to polarizing issues (e.g., global warming, autism, vaccines, &c.) or to anticipated narratives (e.g., election fraud).
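As a minimal sketch of how such a trigger might work, the snippet below matches search queries against per-topic term lists. The topic names, terms, and function are illustrative assumptions, not part of any cited implementation:

```python
from typing import Optional

# Hypothetical mapping of prebunking topics to trigger terms/hashtags.
# A real deployment would maintain these lists editorially.
PREBUNK_TOPICS = {
    "climate": {"global warming", "climate change", "#climatehoax"},
    "vaccines": {"vaccine", "autism"},
    "elections": {"election fraud", "rigged election"},
}

def match_prebunk_topic(query: str) -> Optional[str]:
    """Return the first topic whose terms appear in the query, else None.

    If a topic is returned, the UI would show that topic's
    prebunking interstitial before displaying search results.
    """
    q = query.lower()
    for topic, terms in PREBUNK_TOPICS.items():
        if any(term in q for term in terms):
            return topic
    return None

print(match_prebunk_topic("Is global warming real?"))  # climate
print(match_prebunk_topic("best pizza near me"))       # None
```

In practice, matching would likely use normalized tokens or a search-index signal rather than raw substring checks, but the shape of the decision is the same: query in, topic (or nothing) out.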

What Is Its Intended Impact?

Users who are shown the "prebunking" message are less likely to believe, interact with, or spread misinformation on the topic.

Because it does not require making judgements about specific pieces of content, this intervention can infuse scientifically grounded facts into the discussion before false information spreads. It also exposes common argumentation techniques (independent of any specific piece of misinformation), thereby making social media audiences more resilient to a broad spectrum of misinformation.

How We Know It Works


In a review published in the European Review of Social Psychology, researchers explored proactive measures to prevent misinformation from finding traction in the first place, using the psychological theory of “inoculation” as a basis.

Inoculation rests on the idea that if people are forewarned that they might be misinformed and are exposed to weakened examples of the ways in which they might be misled, they will become more immune to misinformation.

The researchers then reviewed a number of techniques that can boost people’s resilience to misinformation. The interventions ranged from general warnings to more specific instructions about misleading (i.e., rhetorical) techniques.

Their review shows that, based on the available evidence, inoculation appears to be a promising avenue for protecting people from misinformation and “fake news”.

Why It Matters

There has been increasing concern about the growing infusion of misinformation, or “fake news”, into public discourse and politics in many Western democracies. Inoculation appears to be a promising avenue for protecting people from its damaging effects.

Misinformation can undermine a well-functioning democracy. It can also inhibit the passage of critical policy initiatives, such as measures to counter human-caused global warming.

It helps all of us to have user interfaces that preemptively inoculate users against the ways that science can be deliberately misconstrued.

Trying to debunk misinformation after it has spread is like shutting the barn door after the horse has bolted. By prebunking, we aim to stop the spread of fake news in the first place.
Dr. Sander van der Linden, Director of the Cambridge Social Decision-Making Lab and senior author of the new study.

Special Considerations

Examples

This intervention entry currently lacks photographic examples (screencaps, &c.).

Citations

European Review of Social Psychology

Countering Misinformation and Fake News Through Inoculation and Prebunking

Stephan Lewandowsky, Sander van der Linden
February 22, 2021
DOI: 10.1080/10463283.2021.1876983
PLoS ONE

Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence

John Cook, Stephan Lewandowsky, Ullrich K. H. Ecker
May 5, 2017
DOI: 10.1371/journal.pone.0175799


