YouTube ignored warnings about toxic content as it pursued higher engagement, report says

[Image: YouTube on a computer. Credit: Ozan Kose/Stringer/Getty Images]

YouTube employees for years raised concerns about the spread of toxic content on the platform only to be ignored, Bloomberg reported on Tuesday.

"Scores of people" within the company "raised concerns about the mass of false, incendiary and toxic content" spreading on YouTube, the report says, citing interviews with more than 20 current and former employees. This has long been an issue on YouTube, where videos espousing false conspiracy theories are routinely not only uploaded but spread like wildfire in part thanks to the site's recommendation algorithm.

But for some time, those who raised these concerns, Bloomberg writes, "got the same basic response: Don't rock the boat." One former executive said that lawyers even told non-moderator employees to "avoid searching on their own for questionable videos" in an apparent attempt to shield the company from liability.

Proposed solutions included removing questionable videos that were "close to the line" from the recommendation algorithm, but the idea was rejected. Others recommended that the YouTube Kids app show only videos hand-picked by YouTube in order to keep harmful content out, but this suggestion was also turned down. And when a video claiming the Parkland school shooting victims were actors reached the YouTube trending page, some recommended limiting that page's recommendations to trusted news sources; once again, the idea was shot down at the time, Bloomberg says. The report suggests YouTube ignored these warnings in pursuit of higher engagement.

YouTube has since started to take some additional action, ultimately deciding, for example, to stop recommending borderline videos, the very idea it had rejected years earlier. The employee who originally proposed that change said, "I can say with a lot of confidence that they were deeply wrong" to ignore his suggestion at the time. Read the full report at Bloomberg.

Brendan Morrow

Brendan worked as a culture writer at The Week from 2018 to 2023, covering the entertainment industry, including film reviews, television recaps, awards season, the box office, major movie franchises and Hollywood gossip. He has written about film and television for outlets including Bloody Disgusting, Showbiz Cheat Sheet, Heavy and The Celebrity Cafe.