Facebook put coronavirus misinformation warnings on about 40 million posts in March

(Image credit: OLIVIER DOULIERY/AFP via Getty Images)

Facebook has announced a new policy surrounding coronavirus misinformation on its platform after applying warning labels to tens of millions of posts last month alone.

The social media platform on Thursday said that in March, it put warning labels on roughly 40 million posts containing coronavirus misinformation based on ratings from fact-checkers. "When people saw those warning labels, 95 percent of the time they did not go on to view the original content," Facebook said. Additionally, Facebook disclosed that it removed "hundreds of thousands of pieces of misinformation that could lead to imminent physical harm," such as "harmful claims like drinking bleach cures the virus."

Going forward, Facebook will start showing messages in the feeds of users "who have liked, reacted or commented on harmful misinformation about COVID-19 that we have since removed." These users will be directed to information about COVID-19 myths that have been debunked by the World Health Organization.

"We want to connect people who may have interacted with harmful misinformation about the virus with the truth from authoritative sources in case they see or hear these claims again off of Facebook," said Guy Rosen, Facebook's vice president of integrity.

An example screenshot shows a news feed in which a user is encouraged to share a link to the WHO's list of common coronavirus rumors. A Facebook spokesperson told Axios the company is still testing different versions of the notifications shown to users who engaged with misinformation.

This announcement, Politico notes, comes after a campaign group said more than 40 percent of the coronavirus misinformation it identified on Facebook remained on the platform even after being debunked. The new policy of notifying users who have engaged with misinformation will take effect over the next few weeks.

Brendan Morrow

Brendan worked as a culture writer at The Week from 2018 to 2023, covering the entertainment industry, including film reviews, television recaps, awards season, the box office, major movie franchises and Hollywood gossip. He has written about film and television for outlets including Bloody Disgusting, Showbiz Cheat Sheet, Heavy and The Celebrity Cafe.