Facebook's thought police
Do you really want the social network's information cops deciding what the truth is?
The social panic and media hysteria over fake news continues unabated. And once again, Facebook's reaction is all wrong.
The left's intense focus on false news stories exploded in the wake of what seemed like an inexplicable Republican victory in the 2016 election, with Donald Trump beating Hillary Clinton despite an avalanche of bad press directed at the former, especially in the final weeks of the campaign. The GOP also unexpectedly retained control of the Senate, winning surprise victories in Wisconsin and Indiana to confound the Democratic Party's advantage in incumbent seats and normal presidential-cycle turnout models.
Attention quickly focused on a BuzzFeed story about traffic generated by stories on Facebook, and how much more popular fake news was than real news. The analysis had significant flaws, however, beginning with the fact that there was no evidence of correlation between Facebook traffic and voting behavior, let alone causation. Furthermore, the top five "real news" articles in the analysis turned out to be four opinion essays opposing Trump and the New York Post's story on Melania Trump's suggestive modeling pictures from two decades earlier.
Nonetheless, much of the left and the mainstream media stuck with the fake news narrative to explain the election outcome, and have continued to pressure Facebook to take action against it.
That obsession backfires occasionally, as The New York Times discovered last week when it attempted to demonstrate Trump's unpopularity by comparing pictures on Twitter from a visit by the NFL champion New England Patriots to a similar 2015 visit during Barack Obama's presidency. The Patriots objected to that characterization, replying on Twitter with pictures of their own demonstrating that the two events had similar turnout. Times sports editor Jason Stallman apologized the next day, saying, "I'm an idiot."
Now obviously, there's a difference between fake news that maliciously intends to deceive and flawed news that well-meaning journalistic organizations just mess up — but whining about fake news makes your flawed news a bigger problem than it otherwise would be.
Nonetheless, Facebook is convinced it has a fake news problem. And it's scrambling to fix it.
Facebook founder Mark Zuckerberg once scoffed at the allegation that Facebook news feeds were a factor in voters' decisions, accusing those pushing that theory of "a certain profound lack of empathy" with Facebook's consumers. Last week, however, Fortune reported that Zuckerberg had "come around" on the fake news crisis and announced new measures: algorithms to challenge the credibility of items in news feeds, partnerships with fact checkers, and a user-driven process for disputing news stories. He pointed to Facebook's earlier success in reducing the amount of clickbait in news feeds (stories with sensational headlines that rarely deliver on their promise) and pledged to adapt that approach to fake news.
However, Facebook's chief operating officer rejected calls for the company to set itself up as a kind of information police force. Sheryl Sandberg reminded critics that Facebook does not publish news, but only provides a platform for its users to share it. "We definitely don't want to be the arbiter of the truth," Sandberg told the BBC. "We don't think that's appropriate for us. … Newsrooms have to do their part, media companies, classrooms, and technology companies."
Sandberg leaves out an important player in that equation: the consumer. If consumers want better news, then they need to seek it out. Consumers should not rely on a community-driven news feed for their information, but instead seek out original sources, determine which they can trust, and then verify information before sharing it.
This is not a new problem, and it didn't originate with Facebook. Before the advent of social media, email was the favored medium for fever-swamp claims and conspiracy theories. Since the advent of the internet, websites of varying degrees of sophistication have amplified the silly and the serially stupid. Before that, supermarket tabloids specialized in what's now called fake news. All of these channels still distribute fake news to this day, in the same manner as social media.
And yet, America still held elections over the last few decades with all of these sources of fake news, and did so successfully. Why? Because despite the attempts to paint the U.S. electorate as a bunch of unsophisticated hicks, most adults have no problem distinguishing fake news from the real thing. Voters have more resources than ever to help them consume news responsibly. They don't need Facebook to pre-digest their news and then spoon-feed it to them.
Facebook is a private-sector, voluntary-association community, and it can set up its system as it likes. If it wants to police news stories and block access to some based on its own assessment of credibility, that's its choice. It comes, however, at the expense of choice among its members, and in the most paternalistic manner conceivable.
That might have some Facebook consumers exercising choices in ways that Facebook won't like.