This simple intervention is a message for new users of a platform or new members of an online community.
It has three key elements:
This intervention has been shown to:
- Cut down on trolling behavior without shaming users
- Reduce the toxicity of comments
Where this intervention was used, the increase in retention was limited to those who actively identified with the mission of the community.
Retention did not increase among users who turned out not to be joining in good faith.
This can appear as an automated comment to a new user's post, as an automated personal message, or as an interstitial upon entry.
Researchers at Cornell's Citizens and Technology Lab conducted a two-year project with r/feminism on Reddit to test an idea for reducing harassment.
They found that "messages explaining that harassers were a minority increased newcomer comments by 20% on average, an effect that persisted across the full 10 weeks we collected data (n=1,300; p=0.002)."
This intervention could be useful for reassuring and empowering marginalized voices who would otherwise be silenced by hostility.
This could also help reduce polarization, since it helps newcomers and outsiders more accurately see hostile users as the minority they are, rather than treating them as representative of the entire community.
A grade of Convincing is reserved for interventions for which the majority of evidence is peer-reviewed experiments that have yet to undergo attempts at replication.
This is the second highest grade that we give interventions.
Do you think this intervention could have more benefits, unacknowledged drawbacks, or other inaccuracies that we've neglected to mention here?
We always welcome more evidence and rigorous research to back up, debunk, or augment what we know.
If you want to be a part of that effort, we'd love to have your help! Email us.