Prosocial Design Network co-founders John Fallot and Joel Putnam sat down for an interview with Kristina Rapacki in early 2021. The interview is featured in 'Fixing Tech’s Design Problem', published June 24, 2021, in 'The Reboot'.
The full article may be read here:
The Reboot | Fixing Tech’s Design Problem
Below is the article segment in which John and Joel discuss the scope of their work with the Prosocial Design Network. It has been edited for clarity.
John Fallot and Joel Putnam created the Prosocial Design Network in 2020 after they met at a talk-back session for the Center for Humane Technology’s Your Undivided Attention podcast. Their design approach centers on smaller fixes to existing platforms.
“I really like what [the Center for Humane Technology] does in terms of flagging possible harms that are prevalent through the current business model of large tech companies,” says Putnam. “But we’re specifically interested in what it is that causes human-to-human interaction to go sideways where it wouldn’t in person.”
Putnam and Fallot have compiled a growing library of design interventions that have been shown to improve discourse on social media. (There is a whole grading system, from “inference” to “validated,” indicating how much evidence there is for the success of each feature.) One example is “disallow sharing of unclicked links,” which is currently being trialed by both Twitter and Facebook. (A 2016 Columbia University study estimated that 59 percent of links posted on Twitter had not been opened prior to sharing.) Others include “messages prebunking misinformation on controversial topics,” and, in a separate tab for untested interventions, “delay-by-default posting,” which gives users a moment’s pause before publishing a post or comment.
“There are some situations where it makes sense to add some friction, especially in the case of social interactions,” says Fallot, who is a certified UX designer.
Fallot and Putnam acknowledge that introducing such friction is often not in the short-term interest of tech corporations, but they also suggest that designers are in a position to push back against unhealthy models by making a business argument.
“People who use a site that feels good to be a part of are more likely to be long-term users, feel better about their use of it, and encourage their friends to join it,” says Putnam. “I think [the tech industry] is noticing that and starting to make a case for it.”
While this may be the case for some companies, there are still shocking instances of business interests trumping prosocial design interventions. Putnam and Fallot point to the example, in the lead-up to the 2020 US presidential elections, of Facebook tweaking its recommendation algorithm to dial down the vitriol in favor of more reliable news sources. After a few weeks, the company’s engineers wanted to make the “nicer” algorithm permanent, but it was ultimately rolled back.
“They ended up switching it off,” explains Putnam, “because, while people were happier with it, they didn’t spend as much time scrolling through Facebook.”
Design decisions can potentially wield enormous power. Consider the recent kerfuffle between Apple and Facebook over data privacy permissions, and how the introduction of a clear opt-out choice reveals the data brokerage model to be exceedingly brittle. But in order for interventions such as those proposed by the Prosocial Design Network to be introduced, designers need agency and support within the businesses for which they work.
Here, Fallot is optimistic about what he describes as an ethical turn in Silicon Valley investment. “Impact investing in corporations and companies that pursue an additional ethical return on investment,” he says. “Those offer an additional financial return [compared to baseline companies], suggesting that prosocial design might prove viable in this respect.”
The Prosocial Design Network researches and promotes prosocial design: evidence-based design practices that bring out the best in human nature online. Learn more at prosocialdesign.org.