Why the "everybody does it" defense of Facebook's emotional manipulation experiment is bogus

Yes, every Internet company is attempting to modify our behavior. That doesn't make it right

Published June 30, 2014 4:59PM (EDT)

(Reuters/Stephen Lam)

For one week in 2012, Facebook manipulated the emotional content of the news feeds of more than 600,000 of the social media network's users to see how they would respond. For one group of users, the frequency of posts that contained words like "love" or "nice" decreased. For another, the number of posts containing words like "nasty" or "hurt" declined.

The researchers discovered that the filtering resulted in a very small -- but observable -- effect. Moods are contagious! Users who experienced more negativity in their news feeds tended to express themselves more negatively in their own posts. As a zillion Facebook posts reported over the weekend: Facebook made some of its users sad, on purpose.

The publication of the research paper reporting the findings sparked enormous debate and criticism, for obvious reasons. But for at least one prominent Silicon Valley agenda setter, the uproar was unfounded. Big deal, tweeted venture capitalist -- and Facebook investor -- Marc Andreessen. Everybody needs to chill out!

[embedtweet id="483020875776540672"]

Is there a phrase that reeks more powerfully of entitled condescension than the words "helpful hint" at the beginning of a tweet? The more Marc Andreessen dismisses criticism of Silicon Valley business behavior, the more he embodies exactly the fortress of self-satisfied complacency that requires storming.

Bringing books and TV into this argument is just silly. When we turn on "Game of Thrones" or pick up "The Goldfinch" we are actively and knowingly seeking out emotional manipulation. That's a bargain we willingly make, and it's a very different thing from having our news feed filter altered without our knowledge.

But sure, it's hard to argue with Andreessen's larger point, which seems to be, hey, everybody is manipulating us, all the time, so quit your bitching. Andreessen also linked to a post in defense of Facebook by Tal Yarkoni, the director of the Psychoinformatics Lab at the University of Texas. Yarkoni eloquently made the point that the Internet is a nonstop festival of manipulation.

The reality is that Facebook – and virtually every other large company with a major web presence – is constantly conducting large controlled experiments on user behavior. Data scientists and user experience researchers at Facebook, Twitter, Google, etc. routinely run dozens, hundreds, or thousands of experiments a day, all of which involve random assignment of users to different conditions. Typically, these manipulations aren’t conducted in order to test basic questions about emotional contagion; they’re conducted with the explicit goal of helping to increase revenue. In other words, if the idea that Facebook would actively try to manipulate your behavior bothers you, you should probably stop reading this right now and go close your account. You also should definitely not read this paper suggesting that a single social message on Facebook prior to the last US presidential election may have single-handedly increased national voter turn-out by as much as 0.6%. Oh, and you should probably also stop using Google, YouTube, Yahoo, Twitter, Amazon, and pretty much every other major website – because I can assure you that, in every single case, there are people out there who get paid a good salary to… yes, manipulate your emotions and behavior! For better or worse, this is the world we live in. If you don’t like it, you can abandon the Internet, or at the very least close all of your social media accounts.
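
For readers who want the mechanics made concrete: what Yarkoni is describing is ordinary A/B testing. Here's a minimal sketch in Python of how such a random-assignment experiment works -- every name, metric, and number below is a hypothetical illustration, not Facebook's actual code or data.

```python
# Minimal sketch of a randomized A/B-style experiment of the kind Yarkoni
# describes. All identifiers and numbers are hypothetical illustrations.
import hashlib
import random
from statistics import mean

def assign_condition(user_id: str, experiment: str) -> str:
    """Deterministically bucket a user into 'control' or 'treatment'."""
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return "treatment" if int(digest, 16) % 2 else "control"

def run_experiment(user_ids, measure_outcome):
    """Assign users to conditions, collect an outcome metric, compare means."""
    outcomes = {"control": [], "treatment": []}
    for uid in user_ids:
        condition = assign_condition(uid, "feed_ranking_v2")  # hypothetical experiment name
        outcomes[condition].append(measure_outcome(uid, condition))
    return {cond: mean(vals) for cond, vals in outcomes.items() if vals}

if __name__ == "__main__":
    # Simulated users and a simulated outcome (say, posts written per day),
    # purely to show the shape of the comparison.
    random.seed(0)
    users = [f"user_{i}" for i in range(10_000)]
    fake_outcome = lambda uid, cond: random.gauss(
        1.0 + (0.02 if cond == "treatment" else 0.0), 0.5
    )
    print(run_experiment(users, fake_outcome))
```

The point of the sketch is simply that the "experiment" is nothing exotic: bucket users, change what one bucket sees, and measure whether their behavior shifts.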

So -- we're all lab rats, all the time, and everything we see or experience is a product of manipulation designed to modify our behavior. All true -- every word of it.

But the implicit argument here -- that we shouldn't get so worked up about how we are being manipulated because manipulation is just a fact of life -- is unsatisfactory. We should always be striving to understand the forces working upon us. Media literacy is no joke -- understanding how advertisers and politicians and, yes, social media networks, play on our emotions is important. Before the publication of the Facebook experiment, many Facebook users probably didn't realize that this kind of manipulation was even possible, much less regularly practiced. Now they do, and that's a good thing for us, though not necessarily for Facebook.

Because here's the thing that's not getting enough attention from Facebook's defenders. There's a reason why this study struck a hot nerve. People already don't trust Facebook, for a wealth of historically grounded reasons having to do with privacy and shady advertising techniques. We didn't like it when we found our defaults changed to make more of our information public. We weren't enthused when we liked a company's Facebook page and found our "endorsement" transformed into a "sponsored story" advertisement. Now we discover that the mere fact of agreeing to Facebook's terms of service means researchers can play games with our emotions? We wouldn't like learning that about any company, but we especially don't like learning it about Facebook, because we don't trust Facebook to do right by us.

Helpful hint, Marc Andreessen: When the company that is conducting emotional contagion experiments with users has already been smacked around by the FTC for actively deceiving consumers about privacy, it doesn't really excuse your behavior to say, hey, everybody does this. Because a) everybody isn't like Facebook, and b) the more aware we are of how everyone is manipulating us, the healthier we'll be as a society.


By Andrew Leonard

Andrew Leonard is a staff writer at Salon. On Twitter, @koxinga21.
