What It Is
This is a moderation tool that lets moderators keep disputed content online without displaying it prominently or promoting it.
The content would be blurred out or covered by a warning stating that it contains sensitive or otherwise objectionable material (and the nature of why it is being covered), with a button to reveal that material.
This could include misinformation from important political figures, or images that are explicit but newsworthy or culturally important.
This could also allow regional distinctions, so that content labeled as sensitive in one region could be automatically shown in other regions (e.g. proscribed depictions of religious figures, symbols that are legally considered hate speech in some countries, etc.).
The reveal buttons can also be disabled for users who are not yet adults.
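The mechanics described above — sensitivity labels with a stated reason, optional regional scoping, and an age gate on the reveal button — can be sketched as a simple visibility decision. This is an illustrative sketch only; all names and rules here are assumptions, not a real platform API.

```python
from dataclasses import dataclass, field

@dataclass
class SensitivityLabel:
    """A hypothetical label a moderator attaches to a piece of content."""
    reason: str                                 # shown in the warning, e.g. "graphic imagery"
    regions: set = field(default_factory=set)   # regions where the label applies; empty = everywhere
    adults_only: bool = False                   # if True, minors get no reveal button

def render_mode(labels, viewer_region: str, viewer_is_adult: bool):
    """Return (mode, reason): 'show' (rendered normally),
    'covered' (blurred with a reveal button), or
    'blocked' (covered with the reveal button disabled)."""
    for label in labels:
        # A regionally scoped label only applies to viewers in those regions.
        if label.regions and viewer_region not in label.regions:
            continue
        if label.adults_only and not viewer_is_adult:
            return ("blocked", label.reason)
        return ("covered", label.reason)
    return ("show", None)
```

For example, content labeled only for one region would be covered there but rendered normally elsewhere, and an adults-only label would disable the reveal button for minors.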
When To Use It
What Is Its Intended Impact
- Creates a middle ground for cases where user-uploaded content shouldn't universally be left as-is or removed.
- Allows nuanced conversations to happen among those who consent to taking part, without exposing the content to users who feel it is inappropriate or unhealthy for them in particular.
- Cuts down on unnecessary conflict over whether content should be allowed on or banned from a platform.
How We Know It Might Work
Why It Matters
This intervention entry currently lacks academic citations.