This intervention is an interstitial popup (or a similar prompt) attached to a "share" or "retweet" action. It detects whether you have clicked the link (and possibly whether you spent any time on it, if that is technically feasible to track). If you haven't, the platform does not allow you to share the link.
This makes people think about the content of what they are sharing, and therefore more likely to decide against sharing things that look questionable on closer examination (for example, headlines that aren't actually backed up by the content of the article). Since misinformation is more easily identified by reading a full article than by seeing only a snippet or headline, this intervention is likely to cut down on the sharing of misinformation.
This interstitial appears whenever a user is about to share a link in the app without having opened it.
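The gating logic described above can be sketched in a few lines. Everything here, including the function name, the dwell-time parameter, and the data structure, is an illustrative assumption, not Twitter's actual implementation:

```python
def should_show_interstitial(url: str, dwell_seconds: dict,
                             min_dwell: float = 0.0) -> bool:
    """Decide whether sharing `url` should trigger the interstitial.

    `dwell_seconds` maps links the user has opened to the time (in
    seconds) spent on them; links never opened are absent. These are
    hypothetical names for the sake of the sketch.
    """
    dwell = dwell_seconds.get(url)
    # Prompt if the link was never opened, or (optionally) if the user
    # spent less than some minimum amount of time on it.
    return dwell is None or dwell < min_dwell


opened = {"https://example.com/article": 30.0}
should_show_interstitial("https://example.com/unread", opened)   # True
should_show_interstitial("https://example.com/article", opened)  # False
```

Whether to also require a minimum dwell time (rather than just a click) is the open question flagged in the description above; the `min_dwell` parameter simply shows where such a check would go.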
Twitter has started doing this and has tweeted brief claims about its effectiveness. Since this is promotion of their own product, we would like to see more detailed data and methodology, ideally from an independent source.
A 2016 study by computer scientists at Columbia University and Inria found that 59% of links posted on Twitter are never clicked. Adding user friction in these cases should, as Twitter has concluded, allow for more informed discussion and preempt the spread of hoaxes and disinformation.
Do you think this intervention could have more benefits, unacknowledged drawbacks, or other inaccuracies that we've neglected to mention here?
We always welcome more evidence and rigorous research to back up, debunk, or augment what we know.
If you want to be a part of that effort, we'd love to have your help! Email us