Lately I've taken to joking to friends that the only terrorism I could support is blowing up Facebook's physical data centers, provided you could somehow do it without hurting anyone. I know, I know — casualties aren't the only way this destruction could harm people. Jobs would be lost, there would be unintended consequences I wouldn't like, and so on. I don't actually want these facilities to explode. But I do want Facebook to go away forever. I have come to believe it is unfixable.
Consider the tweak Facebook founder and CEO Mark Zuckerberg announced Wednesday: The site will permanently stop suggesting political groups to users, and the Facebook team is looking for other ways to reduce political content in users' feeds as well. "We will still let people engage in political groups and discussion if they want to," Zuckerberg said. But "[w]hat we are hearing is that people don't want politics and fighting to take over their experience on our service." Facebook "can potentially do a better job," he added, and this change is only one step in a plan the network will gradually roll out worldwide.
It's better than nothing, I suppose. But I question how much this sort of thing can accomplish. The most extreme political groups, like those linked to the QAnon conspiracy theory, will constantly adapt their terminology to get around automated filters. More seriously, there's no bright line between the political and nonpolitical. There was a brief hullabaloo last year when Goodyear, the tire maker, banned employee campaigning and other political advocacy at work but carved out an exception for "equity issues." An internal presentation indicated this means MAGA hats and pro-police shirts are out but Black Lives Matter and LGBT pride gear are in. Is that a distinction between two different political perspectives, as conservatives argued, or, as Goodyear said, between politics and equity? Facebook will have to decide.
Or see the controversy that occasioned (and followed) Vox's report this week on progressive parents' horror that a baby sleep expert who saved their sanity also donated to the campaign of former President Donald Trump. "A lot of criticism in mom groups, when someone posts something political, is like, 'We're here to talk about breastfeeding, not politics,'" one of the parents told Vox. "But motherhood is political. You're building a path for children to have a future." Like many, I would have taken baby sleep tips from Trump himself in those first few months of infancy if his advice worked. Still, this comment is correct about political creep, including on social media. Lots of mom groups are probably getting quite political right now by discussing this story. Will Facebook stop recommending them?
Yet even supposing Facebook has the ethics and tech to answer all these questions to universal satisfaction, I remain unimpressed by the scope of this approach. Drastic measures — say, doing away with sharing, or making links unclickable (like Instagram does), or limiting users to a single post per day — could accomplish more. But those measures won't happen, precisely because they begin to get at the root of the matter, which is not any one feature but Facebook itself.
The incentives are all wrong. Facebook runs on human emotion, and its single most efficient fuel is not the friendly connections it ostensibly exists to foster. It is political rage, the one sort of spleen for which our society consistently sanctions public venting.
Like all ad-funded social networks, Facebook makes money not by the mere fact of user accounts existing but by actual use. It is therefore engineered to produce frequent, active use — not merely passive browsing, but clicks and comments and shares. The experience is gamified. It trains our brains. We are simple beings who sincerely enjoy those accumulating likes. We enjoy even more the rush that inflammatory political content brings, and because a lot of politics, even now, is quite boring, disinformation and half-truths are best equipped to consistently provide the excitement we crave. Facebook wants clicks, and lies get clicks. Bad faith gets clicks. Anything that sets our lizard brains aflame gets clicks.
At a conscious level we may not enjoy this dynamic, as Zuckerberg himself acknowledged Wednesday. "There has been this frenzy across society where a lot of things have become political, and politics have been creeping into everything," he said. "And we have seen that people don't want that. They come [to Facebook] to connect with friends and family."
I think that's true. But I also think this volitional want usually loses to the unconscious habits and impulses Facebook and other networks like it ingrain within us. To resist those patterns — to get off Facebook or narrow one's use to a few healthy purposes that build up real-life community instead of detracting from it — is difficult. (I speak from experience.) Daily active Facebook use in the United States fell slightly in the second and third quarters of 2020, a move in the right direction, but Facebook is not designed to be used responsibly.
I'm not sure whether Zuckerberg realizes (or allows himself to realize) his website is more than a potential venue for the political creep and "unfortunate decline in [real life] community participation" he laments. But the reality is Facebook itself is an exacerbating factor. The one sure fix is to shut it down for good.