Pros and cons of social media content moderation

Where do you draw the line between online safety and freedom of speech?


The Supreme Court recently agreed to take on two cases at the center of the ongoing content moderation debate, in which it will decide "whether states can essentially control how social media companies operate," CNN reported. The justices will be considering laws passed in Texas and Florida in 2021 that "could have nationwide repercussions for how social media — and all websites — display user-generated content," the outlet added. 

Both sides of the political divide have heavily scrutinized Big Tech and its content removal policies in the past. Democrats have pushed for more moderation of user-generated content, while Republicans claim that social media companies are overstepping and excessively targeting content from the conservative right, an allegation former President Donald Trump has repeated several times. 

Pro: It protects the public from harmful content and misinformation

For some, content moderation is the first line of defense against misinformation and content that could be harmful to users. It can protect young consumers from cyberbullying or flag posts that spread falsehoods, which became a point of contention at the height of Covid-19. "Content moderation is really about human safety," argued Alexandra Popken, VP of Trust and Safety at WebPurify, a content moderation company. The goal of moderation is "to proactively detect and remove harms before they materialize and impact real people," Popken added, "or to respond and react as quickly as possible once they have materialized."


Con: It negatively impacts the mental health of moderators

Those tasked with scouring thousands of potentially jarring posts, images and videos have complained that the work harms their mental health and that companies don't offer adequate support to those who are suffering. In June, hundreds of social media moderators for outsourcer TELUS International in Germany called on lawmakers to improve their work conditions, "citing tough targets and mental health issues," Reuters reported. They were "led to believe the company had appropriate mental health support in place, but it doesn't. It's more like coaching," Cengiz Haksöz, a former content moderator at TELUS International, told the outlet. "And these outsourcers are helping the tech giants get away from their responsibilities."

Pro: It protects a company's brand

Content moderation enables brands to control their reputation, protecting them from inflammatory content that could harm their users or alienate them from advertisers. "Illicit submissions that unalign with a brand’s values can quickly turn products intended to spread positivity into something far more sinister," Jonathan Freger, co-founder and CTO of WebPurify, said in Forbes. A lack of content moderation "allows for harmful UGC to slip through the cracks and threaten the user experience and brand reputation," he added. 

Con: It opens the door for 'digital authoritarianism'

Content moderation has snowballed, and the "collateral damage in its path" was ignored, Evelyn Douek said in Wired. And the push for more moderation in the U.S. has had "geopolitical costs, too"; some authoritarian governments "pointed to the rhetoric of liberal democracies in justifying their own censorship." Western governments have "largely left platforms to fend for themselves in the global rise of digital authoritarianism," she noted. "Governments need to walk and chew gum in how they talk about platform regulation and free speech if they want to stand up for the rights of the many users outside their borders."

Pro: It puts the onus on social media companies to keep platforms safe

Users benefit from exercising their own discretion about which content to engage with, but social media platforms are still responsible for fostering a safe environment for everyone. Content moderation tools and policies are part of how companies take responsibility for keeping their platforms safe, especially for marginalized people who find solace and community online. Often, those communities can become the targets of content that threatens that safety, "which is where content moderation is crucial," Popken, WebPurify's VP of Trust and Safety, said in an interview with Tech HQ.

Con: It is anti-free speech

One of the prevailing arguments against content moderation is that it is inherently anti-free speech, especially when done at the behest or under the influence of government officials. In June, a U.S. federal court issued an injunction barring the government from contacting social media platforms about moderating posts protected by the First Amendment, Quartz reported. The decision came in response to a lawsuit filed by Republican attorneys general who alleged that the government was censoring free speech on social media platforms under the guise of combating Covid-related or election misinformation. In his ruling, Judge Terry A. Doughty wrote that the case "arguably involves the most massive attack against free speech in United States' history." He concluded that the evidence depicted "an almost dystopian scenario," wherein the government assumed a role "similar to an Orwellian 'Ministry of Truth.'"

Theara Coleman, The Week US

Theara Coleman has worked as a staff writer at The Week since September 2022. She frequently writes about technology, education, literature and general news. She was previously a contributing writer and assistant editor at Honeysuckle Magazine, where she covered racial politics and cannabis industry news.