Facebook is kind of sorry it experimented on you

A clarification of a controversial research project

By Mary Elizabeth Williams

Senior Writer

Published June 30, 2014 2:19PM (EDT)

(AP/Ben Margot)

We all already know that Facebook and emotional manipulation go together like macaroni and cheese. But when the company revealed late last week that in 2012 it had tinkered with the news feeds of nearly 700,000 randomly selected users as part of a research experiment, the reaction was swift and negative. So on Sunday, lead researcher and Facebook data scientist Adam Kramer (also known on Facebook as "Danger Muffin") issued a clarification with a bit of apology thrown in.

In the study, researchers tweaked user feeds to highlight either more positive or more negative posts. The study concluded that "emotional states can be transferred to others via emotional contagion, leading people to experience the same emotions without their awareness." Which, when you put it that way, does sound pretty damn creepy. And what made the idea that Facebook is using our posts to mess around with our heads perhaps most unnerving was the plain reminder that all of us who've agreed to the company's terms of service have signed on for just this kind of mood mining.

The news was just the latest in a perpetual series of controversies over how Facebook uses its members and the access it has to their information. Earlier this year, there was concern when the company adjusted its permissions to potentially access Android users' texts. And this past spring it faced an even bigger outcry over a proposed microphone feature for its smartphone app that could listen in on what users were doing while composing a status update, ostensibly to identify and share music or entertainment, but also, as anyone can imagine, chock-full of privacy-wrecking possibilities.

In his explanatory post, Kramer said that "The reason we did this research is because we care about the emotional impact of Facebook and the people that use our product. We felt that it was important to investigate the common worry that seeing friends post positive content leads to people feeling negative or left out. At the same time, we were concerned that exposure to friends' negativity might lead people to avoid visiting Facebook." And he added that the entire project involved "very minimally deprioritizing a small percentage of content in News Feed (based on whether there was an emotional word in the post) for a group of people (about 0.04% of users, or 1 in 2500) for a short period (one week, in early 2012) ... And at the end of the day, the actual impact on people in the experiment was the minimal amount to statistically detect it." Fair enough, though let's not forget that the research abstract describes it as a "massive experiment on Facebook."

Companies experiment with our heads all the time. Their profits rely on knowing that we want our soda brown and our mouthwash green. They know how potent our loyalty and our nostalgia can be. The difference is that those of us who use Facebook to see our friends' vacation photos and debate which "Firefly" character we'd be often forget that it's a product. It's a product that happens to rely on the word "friend" for its success, a product whose content we help create with every post, every friending, every "like."

Ultimately, Kramer says that "Our goal was never to upset anyone. I can understand why some people have concerns about it, and my coauthors and I are very sorry for the way the paper described the research and any anxiety it caused. In hindsight, the research benefits of the paper may not have justified all of this anxiety." It's a sincere-sounding acknowledgment of yet another awkward moment in Facebook's unclear boundaries with its membership. But it's also a reminder to all who engage on Facebook that this grand social experiment is sometimes literally a grand social experiment. And if Facebook is seeking emotional responses from its users, it need look no further than how people react when they learn they've been its unwitting test subjects.


By Mary Elizabeth Williams

Mary Elizabeth Williams is a senior writer for Salon and author of "A Series of Catastrophes & Miracles."
