Behind Facebook's manipulation: Its Data Science team worked with little oversight

The experiment manipulating emotions is just a drop in the bucket

Published July 3, 2014 5:00PM (EDT)

This July 2, 2013 file photo shows Facebook Chief Operating Officer Sheryl Sandberg speaking during the Global Women Leadership Summit in Tokyo. (AP Photo/Shizuo Kambayashi)

Over the weekend, it was revealed that for one week in 2012, Facebook manipulated the news feeds of nearly 700,000 users for a psychology experiment.

The Wall Street Journal delved into the group behind the experiment: Facebook's Data Science team. Since 2007, according to the report, the team has run hundreds of experiments with little oversight.

The recently exposed experiment showed that when a news feed contained more negative posts, the person's own posts were more likely to be negative; the same held true for an excess of positive stories. In other words, Facebook had the power to make users happier or, more disturbingly, to elicit negative emotions.

The experiment's ethics have been questioned; one of the researchers apologized for the "anxiety" it caused, and even Facebook COO Sheryl Sandberg has weighed in, calling the experiment "poorly communicated."

The public's response: outrage. And understandably so. Users have put up with all manner of Facebook frustration, from petty layout changes and dumb games to advertisements, shifting privacy settings and ever-expanding uses of their data -- but this experiment takes the cake.

This is just one of many experiments run by Facebook's Data Science team, and it came under public scrutiny only because it was published in an academic journal. That is almost a good thing: otherwise, this little-known group might have carried on without user knowledge or oversight. The Wall Street Journal reports:

"Until recently, the Data Science group operated with few boundaries, according to a former member of the team and outside researchers. At a university, researchers likely would have been required to obtain consent from participants in such a study. But Facebook relied on users' agreement to its Terms of Service, which at the time said data could be used to improve Facebook's products. Those terms now say that user data may be used for research.

"'There's no review process, per se,' said Andrew Ledvina, a Facebook data scientist from February 2012 to July 2013. 'Anyone on that team could run a test,' Mr. Ledvina said. 'They're always trying to alter peoples' behavior.'

"He recalled a minor experiment in which he and a product manager ran a test without telling anyone else at the company. Tests were run so often, he said, that some data scientists worried that the same users, who were anonymous, might be used in more than one experiment, tainting the results.

"Facebook said that since the study on emotions, it has implemented stricter guidelines on Data Science team research. Since at least the beginning of this year, research beyond routine product testing is reviewed by a panel drawn from a group of 50 internal experts in fields such as privacy and data security. Facebook declined to name them."

Other tests have included a published study on "how families communicate," the Wall Street Journal reports, as well as studies of loneliness, of how social behaviors spread through networks, and a 2010 study on "political mobilization messages" and voting.

The company told the Wall Street Journal that it is considering additional changes to its practices.

Adam Kramer, who holds a Ph.D. in social psychology from the University of Oregon, said in a 2012 interview that Facebook was "the largest field study in the history of the world."

Kramer is one of the researchers who ran the recently revealed emotions experiment -- and has since responded to it. According to the Wall Street Journal, he said that in academia a researcher must wait to be published and then hope someone takes notice, but at Facebook "I just message someone on the right team and my research has an impact within weeks, if not days."

That must be quite an allure for Facebook's researchers, many of whom hold doctorates. But it is also another chilling reminder of how much power companies like Facebook have over users, and how freely they exploit user data. The data being manipulated belongs to real human beings -- users, or customers, if you consider our data a transaction with Facebook. And the absence of informed consent (as opposed to a Terms of Service agreement) or of adherence to the Common Rule, the federal standard for human-subjects research, is disconcerting. The responses from the parties involved betray a complete lack of awareness of how invasive the experiment was.

Facebook may want to build a better product, reach more users and sell more ads, but that does not give it license to treat users with such a lack of respect.

The practice apparently is not limited to Facebook. Kate Crawford, a researcher at Microsoft and a visiting professor at the Massachusetts Institute of Technology's Center for Civic Media, told the Wall Street Journal that companies "really do see users as a willing experimental test bed."

h/t Wall Street Journal


By Sarah Gray

Sarah Gray is an assistant editor at Salon, focusing on innovation. Follow @sarahhhgray or email sgray@salon.com.
