shelved
May 26, 2020

Facebook reportedly found that its algorithms can make online polarization worse — but the company apparently didn't do much with that information.

That's according to a new report in The Wall Street Journal, which quotes a 2018 presentation from a Facebook team warning executives that "our algorithms exploit the human brain's attraction to divisiveness" and that "if left unchecked," Facebook would give users "more and more divisive content in an effort to gain user attention & increase time on the platform."

But Facebook executives including CEO Mark Zuckerberg "largely shelved the basic research" into polarization on the site and "weakened or blocked efforts to apply its conclusions to Facebook products," the report says.

Among the ideas reportedly discussed was adjusting the recommendation algorithms to show users a "wider range" of suggested groups. But a Facebook team reportedly warned that its proposals to combat polarization might decrease engagement and be "antigrowth," meaning Facebook would have to "take a moral stance." There was also reportedly internal concern about the changes disproportionately affecting conservatives.

The Journal report also cites a 2016 presentation from a Facebook researcher stating that "64 percent of all extremist group joins are due to our recommendation tools" and that "our recommendation systems grow the problem."

"Facebook is under fire for making the world more divided," the Journal writes. "Many of its own experts appeared to agree and to believe Facebook could mitigate many of the problems. The company chose not to."

A Facebook spokesperson told the Journal that the company has "built a robust integrity team, strengthened our policies and practices to limit harmful content, and used research to understand our platform's impact on society so we continue to improve." Read the full report at The Wall Street Journal. Brendan Morrow
