Speed Reads

misinformation

Facebook 'did not live up to its promises to protect the U.S. elections,' report says

Facebook "did not live up to its promises to protect the U.S. elections" in 2020 and could have prevented pages that shared misinformation from amassing about 10 billion views, a new report found.

A report from the advocacy group Avaaz found that 267 pages and groups with a combined following of 32 million users spread "violence-glorifying content in the heat of the 2020 election," and that 68.7 percent of these groups had "Boogaloo, QAnon or militia-aligned names" and posted content related to those extremist movements. "Despite clear violations of Facebook's policies," Avaaz said, 118 of those groups remain active on the platform, with almost 27 million followers.

Avaaz also found that the 100 most popular false or misleading stories related to the 2020 presidential election drew around 162 million views on Facebook, and that 24 percent of these stories carried no warning label from Facebook. Additionally, Avaaz said that "if the platform had acted earlier," it "could have stopped 10.1 billion estimated views of content from top-performing pages that repeatedly shared misinformation over the eight months" prior to the election.

"This report shows clearly how Facebook did not live up to its promises to protect the U.S. elections," Avaaz wrote, alleging the data points to Facebook's "role in providing fertile ground for and incentivizing a larger ecosystem of misinformation and toxicity, that we argue contributed to radicalizing millions and helped in creating the conditions in which the storming of the Capitol building became a reality."

Facebook is pushing back on Avaaz's report, with spokesperson Andy Stone telling Time it "distorts the serious work we've been doing to fight violent extremism and misinformation on our platform" and uses "flawed methodology to make people think that just because a Page shares a piece of fact-checked content, all the content on that Page is problematic." Stone added that Facebook's enforcement of its policies "isn't perfect" but that "we're always improving it while also working with outside experts to make sure that our policies remain in the right place."