Facebook's internal research warned about polarization — but executives 'weakened or blocked' efforts to combat it

Mark Zuckerberg (Image credit: GERARD JULIEN/AFP via Getty Images)

Facebook reportedly found that its algorithms can make online polarization worse — but the company apparently didn't do much with that information.

That's according to a new report in The Wall Street Journal, which quotes a 2018 presentation from a Facebook team warning executives that "our algorithms exploit the human brain's attraction to divisiveness" and that "if left unchecked," Facebook would give users "more and more divisive content in an effort to gain user attention & increase time on the platform."


Among the ideas reportedly discussed was adjusting the recommendation algorithms to show users a "wider range" of suggested groups. But a Facebook team reportedly warned that its proposals to combat polarization might decrease engagement and be "antigrowth," meaning Facebook would have to "take a moral stance." There was also reportedly internal concern that such changes would disproportionately affect conservatives.

The Journal report also cites a 2016 presentation from a Facebook researcher stating that "64 percent of all extremist group joins are due to our recommendation tools" and that "our recommendation systems grow the problem."

"Facebook is under fire for making the world more divided," the Journal writes. "Many of its own experts appeared to agree and to believe Facebook could mitigate many of the problems. The company chose not to."

A Facebook spokesperson told the Journal that the company has "built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve." Read the full report at The Wall Street Journal.

Brendan Morrow

Brendan worked as a culture writer at The Week from 2018 to 2023, covering the entertainment industry, including film reviews, television recaps, awards season, the box office, major movie franchises and Hollywood gossip. He has written about film and television for outlets including Bloody Disgusting, Showbiz Cheat Sheet, Heavy and The Celebrity Cafe.