Facebook's monstrous empire
A 'Wall Street Journal' series reveals how the platform's problems are directly produced by its operations and design
Facebook is at the center of yet another journalistic hurricane. The Wall Street Journal obtained a trove of internal Facebook documents, and used them for a series of articles about how rich celebrities get to break the company's rules with impunity (including posting apparent revenge porn), how Instagram has created an epidemic of mental health problems among young girls, how drug cartels and human traffickers have used Facebook openly to run their operations, how company staff know perfectly well its algorithm fuels hate and extremism, and how the company's systems are so toxic and broken that even Mark Zuckerberg himself couldn't use it effectively to promote vaccination.
This reporting proves beyond any doubt that Facebook is a menace that cannot be reformed from the inside. All the root causes of these problems are directly produced by how the company is designed and operated. The Facebook empire needs to be broken up and its pieces strictly regulated.
One big source of trouble is the News Feed algorithm, which has been repeatedly redesigned to get users to spend more time on Facebook. By 2017 the company was looking at a long-term decline in use among rich countries, and tried various strategies to reverse the trend. It turns out that the easiest way to do this is to reward inflammatory content, incentivize anger and hostility, and encourage fighting in the comments section. This worked at retaining users, but at the cost of sowing bitterness, division, paranoia, and extremism across the globe. Political parties from Poland to Spain to Latin America complained to Facebook that the changes incentivized polarization and extremism, the Journal reports.
Instagram (which is owned by Facebook) has a similar problem. The basic idea of that platform is to watch glamorous people post carefully-composed and -edited pictures about how great their lives are. Unfortunately this comes with a downside: It tends to make ordinary people who don't have plastic surgeons, a full-time makeup team, a private jet, and the ability to spend six hours a day exercising feel bad about their bodies — particularly young girls, who already faced heavy social pressure to conform to a deliberately impossible beauty standard even before social media came along. Sure enough, the Journal reports that Facebook has known for years that Instagram was mass-producing anxiety, depression, and eating disorders among teen girls who use it, and did nothing about it.
That's because giving teens eating disorders is very profitable. As Casey Johnston writes, "these companies know that it's addictive to make people think that, somewhere in their app, there's a solution to feeling inferior and incomplete. The influencer who makes you feel not pretty enough, who also seems to have the key to becoming pretty enough? That's Instagram candy." One Instagram employee admitted as much on a company forum. "Isn't that what IG is mostly about? [looking] at the (very photogenic) life of the top 0.1%? Isn't that the reason why teens are on the platform?"
Yet another cause of problems is Facebook's basic business model. The company makes money through scale: it shows ads to billions of people, but relies mainly on automated systems and user-generated content, and hence employs a comparatively small labor force. The reason it has made tens of billions of dollars in profits, making Mark Zuckerberg the fifth-richest person in the world, is that the company doesn't employ even a tiny fraction of the workers that would be required to properly moderate the colossal firehose of content on the platform. Doing so likely isn't even possible given how much revenue Facebook brings in.
Finally (and relatedly) is Facebook's sheer size. The Journal reports that criminals of all kinds have used its services openly. One grim example came from a drug cartel's Instagram account, which posted a "video of a person with a gold pistol shooting a young man in the head while blood spurts from his neck. The next post is a photo of a beaten man tied to a chair; the one after that is a trash bag full of severed hands … The page, along with other Instagram and Facebook pages advertising the cartel, remained active for at least five months before being taken down." The reason this content wasn't taken down quickly is that Facebook reportedly hires few moderators who speak languages other than English, and reportedly doesn't care about poor countries because they have little ability to raise a fuss. "Facebook treats harm in developing countries as 'simply the cost of doing business' in those places, said Brian Boland, a former Facebook vice president," write the Journal's Justin Scheck, Newley Purnell, and Jeff Horwitz.
By the same token, the Journal reports that Zuckerberg earnestly tried to encourage Facebook users to get vaccinated, but he was easily defeated by a swarm of anti-vaccine maniacs who knew how to exploit the company's systems better than he did. "Even when he set a goal, the chief executive couldn't steer the platform as he wanted," write Sam Schechner, Jeff Horwitz, and Emily Glazer. Instead Facebook was overrun with anti-vaccine shouting, and the company has been ineffectually playing whack-a-mole against them instead of encouraging vaccination.
Over and over, it's the same story. Facebook is obsessively focused on ways to get people to spend more time on its services so it can sell more ads and make more money, and it basically doesn't care at all when those strategies dissolve the social fabric or fuel a genocide. Throughout the Journal articles it is clear that the company brass is far more worried about avoiding negative publicity and appearing to be concerned about these problems than actually doing anything to solve them, because that would harm its bottom line.
So what is to be done?
First, break up the Facebook empire. Make it divest Instagram, WhatsApp, and Oculus. This probably won't accomplish much on its own, given that each individual company would still face the same incentives, but it would cut into the wealth and power of the Facebook brass and reduce any one company's ability to control so much of the internet by coordinating its systems.
Second, regulate social media companies. I have previously argued that repealing Section 230 would be a good step. This would make big companies like Facebook liable for the content posted on their platforms, and hence force them to moderate heavily. Very likely the big platforms would become super cautious about what they allow to be posted, like broadcast television networks. Controversial political discussion would move back to smaller places that could afford to pay moderation teams, or recruit volunteers to do it, like the forums of old or Reddit today.
We could take further steps — for instance, ban targeted advertising and create an online right to privacy. Turning half the internet into a panopticon surveillance machine so Silicon Valley can sell personalized ads would be objectionable even if Facebook weren't such a monstrous company.
There are no doubt many ways to regulate Facebook and force it to stop spewing social poison into the collective commons. We just have to recognize that it will never, ever happen if the company is left to its own devices.