Facebook announced Thursday its plans to address the spread of fake news, an apparent response to critics who have blamed the site for making it easy to peddle misinformation or even swing the election. The social media network has long been cautious about interfering with what gets shared, not wanting to be, as The New York Times puts it, "an arbiter of truth." But while "we really value giving people a voice … we also believe we need to take responsibility for the spread of fake news on our platform," said Adam Mosseri, a Facebook vice president.
Facebook plans to experiment with gutting fake news websites' revenue by scanning third-party links for pages that are mainly advertising, or are pretending to be another website, like a fake New York Times. Users will also be able to report posts that appear to be fake:
Users can currently report a post they dislike in their feed, but when Facebook asks for a reason, the site presents only a limited list of vague options, including the cryptic "I don't think it should be on Facebook." In Facebook's new experiment, users will be able to flag a post as fake news and, optionally, message the friend who originally shared it to let them know the article is false.
If an article receives enough flags as fake, it can be referred to a coalition of fact-checking groups, including Poynter, Snopes, PolitiFact, and ABC News, among others. Those groups will review the article and can mark it as "disputed," a designation that will be visible on Facebook.
Disputed articles will ultimately appear lower in the News Feed. If users still decide to share disputed articles, they will receive pop-ups reminding them that the accuracy of the piece is in question. [The New York Times]
Opinion pieces and satirical ones, like those published by The Onion, will not be affected by the changes, Facebook said.
"The fake cat is already out of the imaginary bag," explained Emily Bell, director of the Tow Center for Digital Journalism at Columbia University. "If [Facebook] didn't try and do something about it, next time around it could have far worse consequences."