Facepalm | June 30
On Sunday, Facebook sort of apologized for manipulating the news feeds of 689,003 randomly selected users, all for the purpose of science. For a week in January 2012, Facebook researchers secretly funneled either more positive or negative stories into the selected news feeds, then watched to see how the users reacted in their own posts.
The results, published in the journal Proceedings of the National Academy of Sciences earlier this month, are actually pretty interesting: Moods are contagious, even over social networks. Or as the researchers put it:
When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicated that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. [PNAS]
If the findings are interesting, the methodology is pretty controversial. Facebook argues that it has the right to do this under the terms of service agreement you didn't read when you signed up, but academic social scientists are supposed to get "informed consent" from their subjects. There was also some more gut-level revulsion at the idea of Facebook manipulating people's feelings — here's privacy activist Lauren Weinstein:
I wonder if Facebook KILLED anyone with their emotion manipulation stunt. At their scale and with depressed people out there, it's possible. — Lauren Weinstein (@laurenweinstein) June 29, 2014
After the PNAS study began to get noticed, Adam Kramer, the Facebook employee who conducted it with two researchers from Cornell and UC San Francisco, tried to explain himself on (where else?) Facebook:
We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.... My coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. [Facebook]
The world's largest social network has long shaped what its users see: When you log in, Facebook shows you about 300 of the 1,500 items that might show up on your news feed, determined by a closely guarded algorithm. "Facebook didn't do anything illegal, but they didn't do right by their customers," Gartner analyst Brian Blau tells The New York Times. Caveat emptor.