Facebook's amoral algorithms

Disinformation, outrage, and polarization keep the clicks coming

[Image: Mark Zuckerberg. Credit: JIM WATSON/AFP via Getty Images]

This is the editor's letter in the current issue of The Week magazine.

In the chaotic days after the 2020 election, Facebook employees warned CEO Mark Zuckerberg that the platform was being used to promote bogus claims of massive election fraud. With then-President Trump whipping up fury over a "rigged" election, Zuckerberg ordered that Facebook give new weight to existing "news ecosystem quality" scores, so that mainstream sources like major newspapers had priority in newsfeeds over extremist websites such as Breitbart. But the platform soon reverted to its old algorithm, spewing out election disinformation like a sewage spill. As former Facebook employee Frances Haugen told the world this week, the tech behemoth knows that outrage, anger, and conspiracy theories — what it internally calls "bad for the world" content — generate more emotion, engagement, and dopamine hits. "If they change the algorithm to be safer," Haugen said, "people will spend less time on the site, they'll click on less ads, they'll make less money."
