On Sunday, Facebook sort of apologized for manipulating the news feeds of 689,003 randomly selected users, all for the purpose of science. For a week in January 2012, Facebook researchers secretly funneled either more positive or more negative stories into the selected users' news feeds, then watched to see how those users reacted in their own posts.
The results, published in the journal Proceedings of the National Academy of Sciences earlier this month, are actually pretty interesting: Moods are contagious, even over social networks. Or as the researchers put it:
When positive expressions were reduced, people produced fewer positive posts and more negative posts; when negative expressions were reduced, the opposite pattern occurred. These results indicated that emotions expressed by others on Facebook influence our own emotions, constituting experimental evidence for massive-scale contagion via social networks. [PNAS]
If the findings are interesting, the methodology is pretty controversial. Facebook argues that it has the right to do this under the terms of service agreement you didn't read when you signed up, but academic social scientists are supposed to get "informed consent" from their subjects. There was also some more gut-level revulsion at the idea of Facebook manipulating people's feelings, voiced by critics like privacy activist Lauren Weinstein.
After the PNAS study began to get noticed, Adam Kramer, the Facebook employee who conducted it with two researchers from Cornell and UC San Francisco, tried to explain himself on (where else?) Facebook:
We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook.... My coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. [Facebook]
The world's largest social network has long shaped what its users see: When you log in, Facebook shows you about 300 of the 1,500 items that might show up on your news feed, determined by a closely guarded algorithm. "Facebook didn't do anything illegal, but they didn't do right by their customers," Gartner analyst Brian Blau tells The New York Times. Caveat emptor.