How tech encoded the culture wars
Is it time for big tech to take responsibility for what is said on their platforms?
He zucked up.
Asked this week to explain why hoax site Infowars was still allowed to post to Facebook, CEO Mark Zuckerberg told Recode's Kara Swisher that the company had to give users leeway because it's difficult to discern intent. Things took a turn, however, when Zuckerberg used Holocaust deniers as an example, suggesting that it wasn't the platform's place to determine whether such people really intend to mislead. The blowback was fierce and immediate enough that the Facebook CEO was forced to issue a clarification only a few hours after his interview was published.
Yet, poor examples aside, this really is what the titans of tech seem to believe. And it is in some ways easy to understand. Facebook, Twitter, and other companies are simply adhering to what they see as the basic ideals of liberal democracy: fairness, equality, and free speech. As Zuckerberg stated in his clarification, "I believe that often the best way to fight offensive bad speech is with good speech."
That same commitment to liberal ideals is also precisely what has helped tech platforms become the battleground for a new sort of "culture war" — a Trump-era conflict that pits progressives against those who think that concepts like racial and gender diversity represent a threat to established norms. In taking the stance that the Enlightenment values of neutrality and universalism are fit to police digital networks, tech is helping to encode the culture wars into their very DNA.
The most obvious problem is that the decision to leave up false, misleading, or outright incendiary posts allows the system to be gamed. As BuzzFeed's Charlie Warzel has convincingly argued, Facebook seems to misunderstand how its platform is deliberately misused by bad faith actors who, for example, post false, inflammatory information and then remove it after the damage is already done. By exploiting shifting standards for what can and cannot be taken down, those looking to spread misinformation can do considerable harm before having their content removed.
Yet there's also something deeper at work. It is no coincidence the rise of social networks has coincided with so-called "both-sidesism," in which equal time and weight is given to opposite sides of a debate no matter how abhorrent or absurd one view might be. The most obvious example here would be President Trump's "very fine people on both sides" comment after the white supremacist march and terrorist attack in Charlottesville. But the trend that has seen men's rights groups or racists take on the language of oppression points to the way in which a neutral or universal approach to content ends up fostering a climate in which patently awful things are talked about as if they were no different from the ordinary.
To be clear, the idea that a private company should get to determine what is true or right, especially when a company like Facebook is immensely popular, is deeply disturbing. While one might breezily claim a site like Infowars should be banned, there are countless edge cases that are less clear cut.
But as New York's Max Read points out, relying on notions of free speech and equality — key ideas in liberal democracies — is at best hypocritical when the checks and balances of a representative government are missing from private companies like Facebook and Twitter. In that regard, social media companies are more like dictatorships. In outsourcing the infrastructure for public discourse to a handful of companies on America's West Coast, we have also given up the ability to have a say through voting, policy, or other forms of pressure, because we've allowed these organizations to take on a state-like function.
So what can be done? Read suggests that Facebook produce a kind of constitution to at least make the process of content removal transparent and consistent. There is also the more aggressive option of regulation, which would necessitate new laws to deal with the specificity of digital networks. More extreme would be actually breaking up these companies in order to mitigate how their scale helps produce these negative effects; considering the current regulatory climate, this option seems the least likely.
But as the world of fake news, the potential return of neo-fascism, and an increasingly polarized, manipulated public sphere make clear, perhaps old principles are no longer as reliable as they once were. Instead, we need to find a way to insist that digital networks take responsibility for the content on their platforms.
In not doing so, Facebook, Twitter, and more are simply encoding the culture wars into their DNA — and we are all worse off for it.
Navneet Alang is a technology and culture writer based out of Toronto. His work has appeared in The Atlantic, New Republic, Globe and Mail, and Hazlitt.