Removal Explanations

Reduce repeated rule-breaking

Our Confidence Rating

Likely

What It Is

Provide users an explanation when their post or comment is removed for breaking a rule.

Civic Signal Being Amplified

Welcome: Ensure user safety

When To Use It

Reactive

What Is Its Intended Impact

By providing users an explanation for why one of their comments was removed, you can reduce the likelihood that their future comments will break a forum's rules—and thus also require removal by moderators.

Evidence That It Works

Jhaver et al. (2019) looked at the behavior of Reddit users who had at least one post removed in a given subreddit over a three-month period and compared how likely they were to break that subreddit's rules again (i.e., have another post removed) depending on whether:

  • They received an explanation for why their post was removed;
  • The explanation was a brief "flair" message or a longer comment; and
  • The explanation comment was delivered by a bot or a moderator.

They found that users who received any explanation were less likely to have another post removed, and even less likely if the explanation was delivered as a longer comment rather than a brief "flair" message. Whether the explanation came from a bot or a moderator, however, made no difference.

The limitation of the Jhaver et al. (2019) study is that it is observational, so we cannot rule out a confounding variable that explains why explanation receivers are less likely to re-offend (e.g., the types of subreddits that provide explanations may also tend to attract users who respond positively to any removal). Two additional studies, however, use quasi-experimental methods to study a related intervention, the effect of removing a post, and find that post removals decrease the likelihood a user will re-offend (Horta Ribeiro et al., 2022; Srinivasan et al., 2019). Although neither study directly examines removal explanations, both note that users are given explanations for why their content was deleted. The combination of Jhaver et al.'s observational finding about explanations and these two natural experiments suggests to us that explanations are a promising intervention. See: Remove Rule-breaking Content.

It is important to note that Srinivasan et al. (2019) did not observe a decrease in toxic content; this suggests that removals and explanations work by making users aware of a platform's specific rules rather than by disincentivizing toxic behavior more broadly.

In an additional study, Jhaver et al. (2024) looked at the effect of removal explanations on "bystanders", i.e., people who did not have their own comment removed but who witnessed a removal explanation on Reddit. That study offers tentative evidence that witnessing removal explanations can increase engagement, although it shows no apparent effect on rule-breaking behavior.

Why It Matters

Special Considerations

Explanations may only be effective at reducing repeat rule-breaking in forums that have strict or unusual rules (as opposed to more common rules against toxic behavior), since they serve to teach new users about rules they may not be aware of.

It is also possible that explanations are effective only if they pair the reason for removal with an appeal process, and only in forums with a "three strikes" policy.

Examples

This intervention entry currently lacks photographic evidence (screencaps, &c.)

Citations

Does Transparency in Moderation Really Matter?: User Behavior After Content Removal Explanations on Reddit

Authors

Shagun Jhaver, Amy Bruckman, Eric Gilbert

Journal

Proceedings of the ACM on Human-Computer Interaction

Date Published

November 7, 2019

Paper ID (DOI, arXIV, &c.)

10.1145/3359252

Content Removal as a Moderation Strategy

Authors

Kumar Bhargav Srinivasan, Cristian Danescu-Niculescu-Mizil, Lillian Lee, Chenhao Tan

Journal

Proceedings of the ACM on Human-Computer Interaction

Date Published

November 7, 2019

Paper ID (DOI, arXIV, &c.)

10.1145/3359265

Automated Content Moderation Increases Adherence to Community Guidelines

Authors

Manoel Horta Ribeiro, Justin Cheng, Robert West

Journal

arXiv

Date Published

October 19, 2022

Paper ID (DOI, arXIV, &c.)

10.48550/arXiv.2210.10454

Bystanders of Online Moderation: Examining the Effects of Witnessing Post-Removal Explanations

Authors

Shagun Jhaver, Himanshu Rathi, Koustuv Saha

Journal

Proceedings of the 2024 CHI Conference on Human Factors in Computing Systems

Date Published

May 11, 2024

Paper ID (DOI, arXIV, &c.)

Citing This Entry

Prosocial Design Network (2024). Digital Intervention Library. Prosocial Design Network [Digital resource]. https://doi.org/10.17605/OSF.IO/Q4RMB

Entry Last Modified

September 9, 2025 7:37 AM