
Increasing exposure to counter-partisan content

Reduce inter-party animosity

Our confidence rating

Mixed


What It Is

Increasing exposure to counter-partisan content, either by adjusting ranking algorithms or by nudging users to follow cross-party accounts.
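Platforms can implement either variant in many ways. As a rough illustration only, the sketch below shows one way a ranking adjustment might give items from counter-partisan sources a small score boost before the feed is sorted; the data fields, boost value, and function name are hypothetical and are not drawn from any of the studies cited in this entry.

```python
# Illustrative sketch (not from any cited study): re-rank a candidate feed so
# that items from counter-partisan sources receive a modest score boost.
# FeedItem, rerank_with_crosscut_boost, and the 0.15 boost are hypothetical.

from dataclasses import dataclass

@dataclass
class FeedItem:
    item_id: str
    source_leaning: str   # assumed labels, e.g. "left", "right", "center"
    relevance: float      # base score from the platform's existing ranker

def rerank_with_crosscut_boost(items, user_leaning, boost=0.15):
    """Sort items by relevance plus a small boost for counter-partisan
    sources (leaning differs from the user's and is not 'center')."""
    def adjusted(item):
        is_counter = (
            item.source_leaning != user_leaning
            and item.source_leaning != "center"
        )
        return item.relevance + (boost if is_counter else 0.0)
    return sorted(items, key=adjusted, reverse=True)

if __name__ == "__main__":
    feed = [
        FeedItem("a", "left", 0.80),
        FeedItem("b", "right", 0.72),
        FeedItem("c", "center", 0.70),
        FeedItem("d", "right", 0.65),
    ]
    for item in rerank_with_crosscut_boost(feed, user_leaning="left"):
        print(item.item_id, item.source_leaning, item.relevance)
```

The nudge-based variant would leave ranking untouched and instead surface follow suggestions for cross-party accounts, as in the Levy (2021) study described below.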

Civic Signal Being Amplified

Connect: Build bridges between groups

When To Use It

Proactive

What Is Its Intended Impact

By exposing users to sources that run counter to their own beliefs, the intervention aims to help users see counter-partisans more favorably, reducing conflict and negative attitudes toward the other side.

Evidence That It Works

Three field experiments examine the effects of exposing users to more counter-partisan news, with mixed results: one finds a decrease in partisan animosity (Levy, 2021), another observes a “boomerang” effect that pushes users to become more partisan in their views (Bail et al., 2019), and a third sees no change in partisan animosity or political views (Nyhan et al., 2023). A fourth, quasi-field experiment (Casas et al., 2023) pressure-tests the “boomerang” effect and finds none.

Levy (2021) tested the effect of nudging Facebook users to subscribe to out-partisan news accounts and observed a positive effect in reducing partisan animosity. One-third of participants were not asked to follow any accounts (control group), and two-thirds were asked whether they wanted to “like” any of four Facebook news accounts, drawn either from their own side of the political spectrum or from the opposite side (treatment group). Participants who were given the opportunity to follow a counter-partisan account expressed less dislike toward counter-partisans than those in the control group and those offered a chance to follow in-party news sources. The effect was small (0.03 standard deviations), and it is unclear whether the intervention is mostly effective on moderates, who may already have relatively diverse news preferences. (Note: we report effect sizes using the metrics in the authors' paper. All effects we include are statistically significant unless otherwise stated.)

A similar study from Bail et al. (2019), however, found a “boomerang” effect on participants' political views. In that study, Twitter users were offered $11 to follow a bot that retweets counter-partisan posts. Those exposed to the counter-partisan bot reacted by becoming more entrenched in their own views compared to users not nudged to follow a bot. We suspect that the difference in findings between Bail et al. (2019) and Levy (2021) is related to the quality of counter-partisan information users saw: Levy (2021) only offered four news sources that, while strongly partisan, were not clearly disreputable; the bot in Bail et al. (2019), however, retweeted from over 4,000 political accounts, limiting the ability to compare the interventions evenly. 

In a collaborative field experiment with Facebook, Nyhan et al. (2023) studied how decreasing exposure to “like-minded sources” would influence user attitudes towards out-partisans and the extremity of their political opinions. Participants were randomly assigned either to no change in their feed (control group) or to a feed with 17 percentage points fewer like-minded sources (treatment group). Users with a reduction in like-minded sources showed no change in their perceptions of out-partisans or in their ideological extremity compared to participants with a standard feed. These findings suggest that algorithmic changes to curb the political similarity of user feeds have little to no effect on user polarization.

In a quasi-field experiment from Casas et al. (2023) that aimed in part to reproduce the boomerang effect in Bail et al. (2019), participants were incentivized to read and engage with ideologically extreme out-party news sources for twelve days. This increased exposure did not meaningfully polarize participants: there was no change in participants’ policy views or in their dislike for out-partisans compared to participants who were not assigned to read extreme out-party news. While this finding somewhat contradicts Bail et al. (2019), we note that Casas et al. (2023) used verified news sources rather than counter-partisan Twitter posts.

Why It Matters

While strong differences in political views are an inevitable part of democracy, political theorists argue that extreme animosity between groups can threaten pluralism by inspiring intolerance and win-at-all-costs attitudes. Exposing social media users to a more diverse set of news sources can help reduce inter-partisan ill will, although, as the evidence above demonstrates, it can also backfire and leave individuals more entrenched in their views.

Special Considerations

Nyhan et al. (2023) also studied participants’ browsing behavior on Facebook and found that down-ranking like-minded sources did not prevent users from seeking them out; algorithmic interventions alone cannot stop users from pursuing more congenial news. Also, even though one study (Casas et al., 2023) found no boomerang effect from exposure to extreme sources, given the backfire effect observed in Bail et al. (2019) we advise recommending only moderate (i.e., non-extreme) news sources to users when deploying this intervention.
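In practice, that recommendation amounts to screening the candidate pool before any boost or follow suggestion is applied. The sketch below is a minimal, hypothetical guardrail; the extremity scale, the 0.6 cutoff, and the field names are illustrative assumptions rather than parameters from any cited study.

```python
# Hypothetical guardrail: only reputable, non-extreme counter-partisan sources
# are eligible for boosting or follow suggestions. The 0.0-1.0 extremity scale
# and the 0.6 cutoff are illustrative assumptions.

def eligible_for_crosscut_promotion(source, max_extremity=0.6):
    """Return True if a source is reputable and not ideologically extreme."""
    return source["is_reputable"] and source["extremity"] <= max_extremity

sources = [
    {"name": "outlet_a", "is_reputable": True, "extremity": 0.4},
    {"name": "outlet_b", "is_reputable": True, "extremity": 0.9},   # too extreme
    {"name": "outlet_c", "is_reputable": False, "extremity": 0.3},  # not reputable
]

print([s["name"] for s in sources if eligible_for_crosscut_promotion(s)])
# prints ['outlet_a']
```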

Examples

This intervention entry currently lacks photographic evidence (screencaps, &c.)

Citations

Exposure to opposing views on social media can increase political polarization

Christopher A. Bail, Lisa P. Argyle, Taylor W. Brown, et al.
PNAS
August 28, 2018
DOI: 10.1073/pnas.1804840115

Social Media, News Consumption, and Polarization: Evidence from a Field Experiment

Ro’ee Levy
American Economic Review
March 2021

Like-minded sources on Facebook are prevalent but not polarizing

Brendan Nyhan, Jaime Settle, Emily Thorson, et al.
Nature
2023

Exposure to Extremely Partisan News from the Other Political Side Shows Scarce Boomerang Effects

Andreu Casas, Ericka Menchen-Trevino, & Magdalena Wojcieszak
Political Behavior
February 3, 2022

