As many as 14 million Facebook users may have published posts in May that they thought were private but were actually made public, thanks to a software bug that changed their default audience.
Erin Egan, Facebook's chief privacy officer, announced in a statement on Thursday that the bug "did not impact anything people had posted before — and they could still choose their audience just as they always have." She said the issue has been resolved, and those who were affected will be notified so they can review the audience of their posts published between May 18 and May 27. Catherine Garcia
White supremacy, white nationalism, white separatism — with all the neo-Nazis and alt-right personalities in the news these days, it can be hard to keep the terms straight. Facebook, though, has determined that two of the three are just fine in its book, Motherboard reports.
The social media giant has faced recent outcry over its censorship of hate speech — or lack thereof. Earlier this year, ProPublica revealed that Facebook trains its censors to treat "white men" as a protected group, while "black children" are not.
In new slides obtained by Motherboard and published Friday, Facebook has apparently gone so far as to determine that it is a-okay to say "the U.S. should be a white-only nation," but if you say "I am a white supremacist," you have crossed a line.
— Jason Koebler (@jason_koebler) May 25, 2018
Keep trying. Jeva Lange
Executives at Facebook are at odds over how to best respond to the spread of disinformation on the platform, several current and former Facebook employees told The New York Times.
The Times reports that Alex Stamos, Facebook's chief security officer, is leaving the company by August because of the tension. Stamos has been vocal about how important it is for the public to know how Russians used Facebook to spread fake news and propaganda before the 2016 presidential election, the current and former employees said, but he's been met with resistance from other leaders, primarily on the legal and policy teams.
Stamos came to Facebook from Yahoo in 2015, and in June 2016, he had engineers start to look for suspicious Russian activity on Facebook. By November, they found evidence of Russian operatives pushing leaks from the Democratic National Committee, the Times reports, but that same month, Facebook founder and chief executive Mark Zuckerberg said it was a "pretty crazy idea" to think Russia influenced the election. More evidence was found by the spring of 2017, leading to internal arguments between Stamos, who wanted to disclose as much information as possible, and others like Elliot Schrage, vice president of communications and policy, who did not want to share anything without more "ironclad" evidence, the Times reports.
In a statement, Stamos said these are "really challenging issues," and he's had "some disagreements" with his colleagues. In response to the Times' story, he tweeted that he's "still fully engaged with my work at Facebook," and is "spending more time exploring emerging security risks and working on election security." You can read more on the backlash to Facebook's secrecy and the internal arguments at The New York Times. Catherine Garcia
In May 2016, Gizmodo published an article that alleged "Facebook workers routinely suppressed news stories of interest to conservative readers from the social network's influential 'trending' news section." Two months out from the Republican National Convention, Donald Trump was the clear favorite for the nomination, and the article "went off like a bomb in Menlo Park," Wired reports in a massive article published Monday on the internal decisions at Facebook over the course of the election.
But Facebook's biggest concern was an unhappy Sen. John Thune (R-S.D.), who chairs the Senate Commerce Committee, which oversees the Federal Trade Commission. In addition to sending Thune a 12-page report concluding that Gizmodo's story was factually inaccurate, "Facebook decided, too, that it had to extend an olive branch to the entire American right wing," Wired writes. At the resulting May meeting, Mark Zuckerberg and Sheryl Sandberg sat down with more than a dozen leading conservatives to "build trust."
From the get-go, though, "the company wanted to make a show of apologizing for its sins," Wired writes, and the meeting was allegedly engineered to be a mostly fruitless exercise:
According to a Facebook employee involved in planning the meeting, part of the goal was to bring in a group of conservatives who were certain to fight with one another. They made sure to have libertarians who wouldn't want to regulate the platform and partisans who would. Another goal, according to the employee, was to make sure the attendees were "bored to death" by a technical presentation after Zuckerberg and Sandberg had addressed the group.
The power went out, and the room got uncomfortably hot. But otherwise the meeting went according to plan. The guests did indeed fight, and they failed to unify in a way that was either threatening or coherent. [Wired]
If you happened to read over the weekend that Megyn Kelly was fired from Fox News because she's backing Hillary Clinton, she wanted you to know on Tuesday's Kelly File that neither of those things is true and that she might be able to sue for libel. Don't worry, Mark Zuckerberg. "I have no desire to sue Facebook, nor anybody else, because I really don't like lawsuits," Kelly told her two lawyer guests. "Which is ironic, since I was a lawyer for nine years, but you know, that's what it does to you."
But she did strongly suggest that Facebook rehire its Trending curators — last weekend was the first time Facebook allowed its algorithms to run its Trending feature unaided, sidelining the human editors. ("So once again, Mark, the rhythm method fails," she quipped to one of the lawyers, Mark Eiglarsh.) And she wanted to know whether "a regular person who isn't on TV every night and doesn't have the ability to show people it's a lie" could sue Facebook if Trending targeted him or her with a false story. Eiglarsh said probably not, because a plaintiff would have to prove "malicious intent," but fellow lawyer Andell Brown was more bullish on a lawsuit. Either way, expect Facebook to tinker with Trending again soon, and you can watch Kelly's discussion below. Peter Weber