What It Is
Suggesting to social media users that they follow (non-extreme) news sources from the opposite political side.
When To Use It
On social media platforms where users can follow news accounts.
What Is Its Intended Impact
By exposing users to sources that run counter to their beliefs, those users will see counter-partisans more favorably.
Evidence That It Works
Levy (2021) conducted a field experiment in which Facebook users first completed a survey about their political views and feelings about counter-partisans. Two-thirds of the participants (the treatment groups) were then asked whether they wanted to "like" any of four Facebook news accounts, either from their own side of the political spectrum or from the opposite side; the remaining third (the control group) were not asked to follow any accounts. When surveyed again after a period of time, participants who had been offered the chance to follow a counter-partisan account expressed less dislike toward counter-partisans than those in the control group and those who had been asked to follow an account sharing their own political leaning. Importantly, this effect appeared across all participants who received the offer to follow a new account, not just those who accepted it, making the finding more robust.
It should be noted that the reduction in animosity was small: partisan animosity among those offered the chance to follow a counter-partisan news site decreased by only 0.03 standard deviations, whereas a common rule of thumb is that an intervention needs an effect of at least 0.2 standard deviations to be considered meaningful. It is also unclear whether the intervention affects mostly or only moderates, who may be more open to following diverse news sources and changing their views about counter-partisans, or whether it also has a positive effect on those with the strongest counter-partisan animosity.
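To put this effect size in context, effects reported in standard deviations are standardized mean differences (Cohen's d). The formula and numbers below are an illustrative sketch only; the thermometer scale and standard deviation are assumptions, not figures from Levy (2021).

$$d = \frac{\bar{x}_{\text{treatment}} - \bar{x}_{\text{control}}}{s_{\text{pooled}}}$$

For instance, if animosity were measured on a 0–100 feeling thermometer with a standard deviation of roughly 25 points, d = 0.03 would translate to about 0.03 × 25 ≈ 0.75 points of warming toward counter-partisans, versus 5 points at the conventional 0.2 threshold.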
Also important to note is that Levy (2021) did not find that the intervention made participants' political opinions any more moderate. A similar study (Bail et al., 2018), moreover, found a backfire effect on participants' political views. In that study, Twitter users were offered $11 to follow a bot that retweeted counter-partisan posts. Bail et al. (2018) did not survey participants about partisan animosity, but they did ask about political views and found that those exposed to the counter-partisan bot reacted by becoming more entrenched in their own views.
While we do not know why Bail et al. (2018) found a backfire effect where Levy (2021) saw only a null effect on political views, we suspect the difference relates to the quality of the counter-partisan information users saw: Levy (2021) offered only four news sources that, while strongly partisan, were not clearly disreputable, whereas the bot in Bail et al. (2018) retweeted from over 4,000 political accounts. If extreme accounts were among those 4,000, they could have made counter-partisans appear more extreme and threatening, thus entrenching users' views.
Why It Matters
While strong differences in political views are an inevitable part of democracy, political theorists argue that extreme animosity between groups can threaten pluralism by inspiring intolerance and win-at-all-costs attitudes. Exposing social media users to a more diverse set of news sources can be one tool for reducing inter-partisan ill will.
Special Considerations
Because of the backfire effect found by Bail et al. (2018), we advise that this intervention recommend only moderate (i.e., non-extreme) news sources to users.
Examples
This intervention entry currently lacks illustrative examples (e.g., screenshots).
Citations
Bail, C. A., et al. (2018). Exposure to opposing views on social media can increase political polarization. Proceedings of the National Academy of Sciences.
Levy, R. (2021). Social Media, News Consumption, and Polarization: Evidence from a Field Experiment. American Economic Review.