Speed Reads

a recommended change

YouTube says it'll try to stop recommending so many conspiracy theory videos

YouTube says it will adjust its algorithm to crack down on conspiracy theory videos that may be harmful to users.

The site said in a blog post Friday that it will "begin reducing recommendations of borderline content and content that could misinform users in harmful ways," such as flat Earth theories or 9/11 conspiracy theory videos. YouTube won't actually remove anything, but the algorithm change will affect how often this content pops up as a recommendation for a user or in the "next up" tab after they've finished watching a video.

It doesn't seem that YouTube will stop recommending videos like these under all circumstances, though, as the site only says it's going to "limit" the recommendations, noting that "these videos may appear in recommendations for channel subscribers and in search results."

Still, the announcement was seen as a necessary move for the company, especially after a BuzzFeed News investigation published Thursday showed how YouTube can direct users to extremist or conspiracy theory videos after they've watched legitimate news. For example, watching a BBC News video of a speech by House Speaker Nancy Pelosi (D-Calif.) could eventually lead a user to videos about the QAnon conspiracy theory. Charlie Warzel, one of the reporters behind the BuzzFeed report, wrote Friday that YouTube's algorithm change seems "like a really meaningful step forward," while noting that "no credit is due until real people see meaningful changes."