The quest to design an ethical social media platform

A number of innovators are working to introduce ethical principles into social media design. But what does an ethical social media platform actually look like?

(Image credit: Screenshot/Mastodon)

Social media use is pervasive in our culture, and it's on the rise. At the start of this year, almost 3.2 billion people were using social networks worldwide, up 13 percent from 2017; more than 11 new users sign up every second. Meanwhile, we're learning about the damage excessive social media use can do to our health and our society. As awareness of the pitfalls of being constantly connected grows, a small number of tech professionals are working to introduce ethical principles into social media design. But what does an ethical social media platform actually look like?

Our devices, and the social media platforms they deliver, are clearly messing with our brains: One report published earlier this year linked the rise of smartphones to depression in teenagers. A study published in the American Journal of Preventive Medicine found that heavy social media users were twice as likely to report feelings of social isolation. Social media harbors trolls, spreads misinformation, and collects massive amounts of user data and uses it in dubious ways. One former Facebook executive made headlines last year when he claimed that the "short-term, dopamine-driven feedback loops" created by the company are "destroying how society works."

For aspiring ethical designers, the first step is finding a new way to make money. Advertising is the traditional funding source for social platforms: they collect users' personal data and use it to help advertisers reach specific audiences. This virtually ensures a fundamentally exploitative business model based on surveillance, says Laura Kalbag, a designer and co-founder of the digital justice not-for-profit Ind.ie.

"We have this saying in the tech industry: If you aren't paying for the product, then you're the product being sold," Kalbag says. "Most mainstream technology companies want to extract as much information about their users as possible. Because this is the model, success is contingent on more people using a platform for longer periods. In the industry, this is referred to as engagement, but really that's a euphemism for addiction."

And what do you get when an industry is built on a model of addiction? Products that are designed to manipulate a user's mind. Today's most popular social apps take advantage of our most basic pleasure-seeking impulses to encourage compulsive use. Human brains release the feel-good neurotransmitter dopamine during certain activities, including eating, exercising, and socializing. On an evolutionary level, this is the body's way of rewarding life-sustaining actions and encouraging us to repeat them. Tech companies know social affirmation gives people a buzz, which is why they constantly notify users of every like, share, or retweet. Fear of missing out on an important event or announcement also keeps users coming back for more. And infinite scroll is just as bad.

"Infinite scroll basically eradicates the stopping cues that usually tell us to move on to something else," explains Adam Alter, the author of Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked. "Traditionally, when you got to the bottom of a particular screen you had to click to release more information. With infinite scroll there is no click so your default is to just continue ad infinitum."

One company attempting to break away from an ad-based business model and the bad behavior it encourages is Mastodon. It looks and functions a lot like Twitter, though there are key differences. Mastodon is a decentralized "federation" made up of different user-created channels called "instances." Each instance has its own rules, moderation policies, and administrators. The microblogging site is funded entirely by donations, not advertising or venture capital.
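Federation also shapes how developers interact with the network. Because every instance runs the same open-source server software, the same client code can talk to any instance through Mastodon's public REST API. Here's a brief sketch (the instance domains are examples chosen for illustration, and error handling is kept minimal):

```typescript
// Fetches the public timeline from independently run Mastodon instances.
// The endpoint is Mastodon's documented public-timeline API; the domains
// below are examples, not endorsements.

interface Status {
  id: string;
  created_at: string;
  content: string; // HTML content of the post
  account: { acct: string };
}

async function publicTimeline(instance: string): Promise<Status[]> {
  const res = await fetch(`https://${instance}/api/v1/timelines/public?limit=5`);
  if (!res.ok) throw new Error(`${instance} responded with ${res.status}`);
  return (await res.json()) as Status[];
}

async function main(): Promise<void> {
  // The same client code works against any instance, even though each one
  // has its own rules, moderation policies, and administrators.
  for (const instance of ["mastodon.social", "fosstodon.org"]) {
    const posts = await publicTimeline(instance);
    console.log(instance, posts.map((p) => p.account.acct));
  }
}

main().catch(console.error);
```

On most instances, no API key, ad identifier, or tracking consent is involved; the server simply serves its public posts, which fits the donation-funded model described above.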

But the reality is that collecting user data is big business. Realistically, profit-making enterprises aren't going to stop their spying to protect human rights, at least not anytime soon. To make tech more ethical, Kalbag suggests we might have to start viewing social platforms more like public utilities and fund them accordingly. Facebook, for instance, has become a vital tool for spreading information across communities, like telephone wires and public television before it. In many places, those broadcasting and infrastructure projects were funded through taxes or supported by government incentives. "We need organizations that make money in a different way, or that don't make money because they receive common funding," Kalbag says. "Nowadays communication platforms are important infrastructure, so maybe they could be funded by taxes, as long as they aren't controlled by governments."

Alter also suggests that regulators might step in to limit the damage addictive social platforms inflict on society. "[Tech companies'] primary goal is profit-making even at the expense of consumer welfare," he says. "Industries that pollute the world in the service of profit-making are fined or toppled, and we could do the same to tech companies: If you use a feature that makes your product more addictive, you donate thousands or millions to fund addiction treatment centers."

What about trolls? How do we rein in the bullies who lurk on social media, hiding behind anonymous handles? And how do we stop the spread of misinformation? Those problems, too, drag down the ethical standards of sites like Facebook and Twitter.

Advertising-dependent platforms are unlikely to ban controversial users or remove inflammatory posts, because that content drives engagement, and engagement drives revenue. Some techies believe that decentralized platforms like Mastodon, which aren't run or controlled by a single corporate entity, are part of the solution. Mastodon's founder, Eugen Rochko, believes his platform's smaller, more tight-knit communities will be less likely to tolerate the kinds of toxic behavior, such as trolling and harassment, that have plagued sites like Twitter.

Diaspora* and Friendica are two Facebook-style social networks that are also made up of smaller sites linked together through federation. The idea is to make social networks more like community-run forums, a model under which toxic speech and ideology are easier to contain. Even so, Diaspora* faced controversy in 2014 when Islamic State fighters set up accounts on the network to promote the group's activities. In that instance, the network relied on its administrators to remove the posts, since its decentralized architecture meant the platform's creators couldn't step in themselves. It's a sticky situation that shows there's no simple solution for removing extremist content from the social web.

Other key aspects of ethical social media design are choice and transparency. Facebook, Twitter, and Instagram all feature algorithmic timelines, which use machine learning to decide which posts individuals will see first. But ethical design means users must be able to define their relationship with a platform, not the other way around. This could mean selecting which notifications to receive, or opting in to tailored product recommendations. Mastodon uses a chronological timeline that displays posts in the order they were created, and gives users greater control over what content they see and when. "We need to have more fine-grain control over what we see on these sites, and the best way to do that is to allow us to opt into things," Kalbag says. "Platforms should have sensible, privacy-respecting default settings."
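The contrast between the two timeline policies is easy to express in code. The sketch below is purely illustrative (the Post shape, the engagement weights, and the settings object are hypothetical, not taken from any platform), showing a chronological sort next to an engagement-ranked one, along with the kind of opt-in defaults Kalbag describes:

```typescript
// Illustrative only: the Post shape, weights, and settings are hypothetical.

interface Post {
  createdAt: Date;
  likes: number;
  shares: number;
}

// Chronological (Mastodon-style): newest first. The ordering depends only
// on the posts themselves, so nothing about the user needs to be inferred.
const chronological = (posts: Post[]): Post[] =>
  [...posts].sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());

// Algorithmic (engagement-ranked): the platform decides what surfaces first,
// here with an arbitrary weighting that favors shares over likes.
const score = (p: Post): number => p.likes + 2 * p.shares;
const engagementRanked = (posts: Post[]): Post[] =>
  [...posts].sort((a, b) => score(b) - score(a));

// Kalbag's "sensible, privacy-respecting default settings," expressed as a
// hypothetical settings object: anything that tailors content or shares data
// starts switched off, and the user must opt in.
const defaultSettings = {
  personalizedRecommendations: false,
  notifyOnEveryLike: false,
  shareDataWithAdvertisers: false,
};
```

A chronological sort needs no behavioral profile at all, which is precisely why it's the less lucrative, and arguably more ethical, default.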

Ethical social media design seems to be striking a chord: Earlier this year, Rochko told Esquire that Mastodon had added 80,000 new members in one week, bringing its total user base to 1.5 million and making it the most popular Twitter alternative. But there's a long way to go before these alternatives unseat the social media giants. It's up to designers to imagine a real, usable alternative to toxic platforms, and up to governments and decision makers to hold tech companies to account.

Jennifer Johnson

Jennifer Johnson is a writer and reporter based in London, England. Her stories explore the intersections of science, technology, and policy.