Don't rock the boat
YouTube ignored warnings about toxic content as it pursued higher engagement, report says
YouTube employees for years raised concerns about the spread of toxic content on the platform only to be ignored, Bloomberg reported on Tuesday.
"Scores of people" within the company "raised concerns about the mass of false, incendiary and toxic content" spreading on YouTube, the report says, citing interviews with more than 20 current and former employees. This has long been an issue on YouTube, where videos espousing false conspiracy theories are routinely not only uploaded but spread like wildfire in part thanks to the site's recommendation algorithm.
But for some time, those who raised these concerns, Bloomberg writes, "got the same basic response: Don't rock the boat." One former executive said that lawyers even told non-moderator employees to "avoid searching on their own for questionable videos" in an apparent attempt to shield the company from liability.
Among the proposed solutions was the idea of removing questionable videos that were "close to the line" from the recommendation algorithm, but this was rejected. Others recommended that the YouTube Kids app show only videos hand-picked by YouTube to keep harmful content out, but this suggestion was also turned down. When a video claiming the Parkland school shooting victims were actors made its way onto the YouTube trending page, some suggested limiting trending-page recommendations to trusted news sources, but once again, the idea was shot down at the time, Bloomberg says. The report suggests YouTube ignored these warnings in pursuit of boosting engagement.
YouTube has since started to take some additional actions, ultimately deciding, for example, to stop recommending borderline videos, the very idea that had been rejected years earlier. The employee who originally proposed it said, "I can say with a lot of confidence that they were deeply wrong" to ignore his proposal at the time. Read the full report at Bloomberg.