Facebook Fail
February 17, 2021

The Australian government is working on a law that would require large internet platforms like Google and Facebook to pay local journalism publishers for their content. In response, Facebook announced Wednesday that it would block all Australians from posting any news content of any kind, and would block the entire world population from posting any news content from Australian sources. Facebook executive William Easton justified the decision in a blog post. The proposed law "misunderstands the relationship between our platform and publishers who use it to share news content," he writes, claiming that Facebook provided Australian news publishers with 5.1 billion free referrals.

The complaint is a crock. This is about money and power. Extensive and detailed reports from the Australian Competition and Consumer Commission show that as digital ads have grown to account for about half of all the advertising spending in the country, Google and Facebook have crushed the market — taking up about two-thirds of all digital ad spending. In the display ads submarket, Facebook has moved from 25 percent of the market in 2014 to 62 percent in 2019.

In other words, Facebook may provide some referral traffic to publishers, but its (and Google's) structural domination of the ad marketplace is strangling Australian journalism in exactly the same way as is happening in the United States. The Big Tech behemoths are eating up most of the ad money, and journalism outfits are left fighting over a steadily smaller pile of scraps. Today Facebook is attempting to intimidate the Australian government away from doing anything about those fat profits (and potentially disrupting Facebook's dictatorial control over what news Australians get to see).

As I have written before, Facebook is a truly monstrous company — one that has fueled genocide and racist violence. Here's hoping the Australian government tells Facebook to go pound sand in the Simpson Desert, and some enterprising Aussie makes a Facebook competitor without the extremism. Ryan Cooper

January 15, 2021

Facebook may have banned President Trump, but his followers are still gaming the site to spread election fraud conspiracies and downright dangerous disinformation.

Despite Facebook officials' attempts to play down the site's role in organizing last week's Capitol riot, it's clear plenty of Facebook groups and users spread conspiracies and even used the site to fill buses to Washington, D.C. Even after the site started cracking down on the organizers last week, at least 90 "Stop the Steal" groups have remained in operation under altered names, while users exploit Facebook's features to spread disinformation in other ways, CNN reports, citing research from extremism experts at the activist group Avaaz.

Facebook instituted a blanket ban on "Stop the Steal" content earlier this week. But groups and users have quickly changed gears, rebranding their pages as "'Stop the Fraud' or 'Stop the Rigged Election' or 'Own the Vote,'" Avaaz campaign director Fadi Quran told CNN.

Stories, one of Facebook and its subsidiary Instagram's most popular features, has also helped far-right users spread disinformation undetected. Stories disappear after 24 hours, and Avaaz found accounts with "tens of thousands, if not hundreds of thousands of followers," are "inviting people to events such as the insurrection" using those temporary posts, Quran said. And despite Facebook's claims that the site "does not profit from hate," BuzzFeed News found earlier this week that Facebook allowed firearms and military gear sellers to target ads to people involved in far-right and militia groups, even placing their ads right next to posts planning the uprising.

A Facebook spokesperson said the site banned three of the groups after being notified of their activity, and has cracked down on white supremacist and QAnon groups. Its recent ban on Stop the Steal content will take longer to ramp up, the spokesperson said. Read more at CNN. Kathryn Krawczyk

June 7, 2018

As many as 14 million Facebook users may have published posts in May that they thought were private but were actually made public, thanks to a software bug that changed their default audience.

Erin Egan, Facebook's chief privacy officer, announced in a statement on Thursday that the bug "did not impact anything people had posted before — and they could still choose their audience just as they always have." She said the issue has been resolved, and those who were affected will be notified so they can review the audience of their posts published between May 18 and May 27. Catherine Garcia

May 25, 2018

White supremacy, white nationalism, white separatism — with all the neo-Nazis and alt-right personalities in the news these days, it can be hard to keep the terms straight. Facebook, though, has determined that two of the three are just fine in its book, Motherboard reports.

The social media giant has faced recent outcry regarding its censorship of hate speech — or lack thereof. Earlier this year, ProPublica revealed that Facebook trains its censors to recognize "white men" as a protected category, although "black children" are not.

In new slides obtained by Motherboard and published Friday, Facebook has apparently gone as far as to determine that it is a-okay to say "the U.S. should be a white-only nation," but if you say "I am a white supremacist," you have crossed a line.

Keep trying. Jeva Lange

March 19, 2018

Executives at Facebook are at odds over how to best respond to the spread of disinformation on the platform, several current and former Facebook employees told The New York Times.

The Times reports that Alex Stamos, Facebook's chief security officer, is leaving the company by August because of the tension. Stamos has been vocal about how important it is for the public to know how Russians used Facebook to spread fake news and propaganda before the 2016 presidential election, the current and former employees said, but he's been met with resistance from other leaders, primarily on the legal and policy teams.

Stamos came to Facebook from Yahoo in 2015, and in June 2016, he had engineers start to look for suspicious Russian activity on Facebook. By November, they found evidence of Russian operatives pushing leaks from the Democratic National Committee, the Times reports, but that same month, Facebook founder and chief executive Mark Zuckerberg said it was a "pretty crazy idea" to think Russia influenced the election. More evidence was found by the spring of 2017, leading to internal arguments between Stamos, who wanted to disclose as much information as possible, and others like Elliot Schrage, vice president of communications and policy, who did not want to share anything without more "ironclad" evidence, the Times reports.

In a statement, Stamos said these are "really challenging issues," and he's had "some disagreements" with his colleagues. In response to the Times' story, he tweeted that he's "still fully engaged with my work at Facebook," and is "spending more time exploring emerging security risks and working on election security." You can read more on the backlash to Facebook's secrecy and the internal arguments at The New York Times. Catherine Garcia

February 12, 2018

In May 2016, Gizmodo published an article that alleged "Facebook workers routinely suppressed news stories of interest to conservative readers from the social network's influential 'trending' news section." Two months out from the Republican National Convention, Donald Trump was the clear favorite for the nomination, and the article "went off like a bomb in Menlo Park," Wired reports in a massive article published Monday on the internal decisions at Facebook over the course of the election.

But Facebook's biggest concern was an unhappy Sen. John Thune (R-S.D.), who chairs the Senate Commerce Committee, which oversees the Federal Trade Commission. In addition to sending Thune a 12-page investigation that found Gizmodo's story was factually inaccurate, "Facebook decided, too, that it had to extend an olive branch to the entire American right wing," Wired writes. The resulting conference brought Mark Zuckerberg and Sheryl Sandberg together with more than a dozen leading conservatives in May to "build trust."

From the get-go, though, "the company wanted to make a show of apologizing for its sins," Wired writes, and the meeting was allegedly engineered to be a mostly fruitless exercise:

According to a Facebook employee involved in planning the meeting, part of the goal was to bring in a group of conservatives who were certain to fight with one another. They made sure to have libertarians who wouldn't want to regulate the platform and partisans who would. Another goal, according to the employee, was to make sure the attendees were "bored to death" by a technical presentation after Zuckerberg and Sandberg had addressed the group.

The power went out, and the room got uncomfortably hot. But otherwise the meeting went according to plan. The guests did indeed fight, and they failed to unify in a way that was either threatening or coherent. [Wired]

Read the full report about Facebook at Wired. Jeva Lange

August 31, 2016

If you happened to read over the weekend that Megyn Kelly was fired from Fox News because she's backing Hillary Clinton, she wanted you to know on Tuesday's Kelly File that neither of those things is true and that she may be able to sue for libel. Don't worry, Mark Zuckerberg. "I have no desire to sue Facebook, nor anybody else, because I really don't like lawsuits," Kelly told her two lawyer guests. "Which is ironic, since I was a lawyer for nine years, but you know, that's what it does to you."

But she did strongly suggest that Facebook rehire its Trending curators — last weekend was the first time Facebook allowed its algorithms to run its Trending feature, sidelining the human editors. ("So once again, Mark, the rhythm method fails," she quipped to one of the lawyers, Mark Eiglarsh.) And she wanted to know if "a regular person who isn't on TV every night and doesn't have the ability to show people it's a lie" could sue Facebook, if Trending targeted him or her with a false story. Eiglarsh said probably not, because of having to prove "malicious intent," but fellow lawyer Andell Brown was more bullish on a lawsuit. Either way, expect Facebook to tinker with Trending again soon, and you can watch Kelly's discussion below. Peter Weber
