Let's give OkCupid some credit. The online dating service's reaction to the uproar over Facebook's emotional manipulation experiment is about as forthright as one could expect: Basically, we do the same thing all the time, and it's awesome!
Facebook tinkered with changing the flow of "positive" and "negative" items in a news feed to see if it would change the emotional tone of posts made by the manipulated users. In a blog post published Monday titled "We Experiment on Human Beings!" OkCupid founder Christian Rudder analyzed a series of similar studies undertaken by the dating site. The third such "experiment," which Rudder titled "the power of suggestion," is the most likely to raise eyebrows.
OkCupid relies on an algorithm to tell its users how much of a "match" they are with other site users.
By all our internal measures, the “match percentage” we calculate for users is very good at predicting relationships. It correlates with message success, conversation length, whether people actually exchange contact information, and so on. But in the back of our minds, there’s always been the possibility: maybe it works just because we tell people it does. Maybe people just like each other because they think they’re supposed to? Like how Jay-Z still sells albums?
So OkCupid started skewing the data, telling some users they were a bad match when they were actually compatible, and telling others they were compatible when the algorithm determined otherwise.
Guess what? "When we tell people they are a good match, they act as if they are. Even when they should be wrong for each other."
A footnote to Rudder's article says that users were informed of their "true" match percentages after the experiment. But I'm not sure that absolves OkCupid of any potential wrongdoing. Nor does the "everyone does it" defense:
We noticed recently that people didn’t like it when Facebook “experimented” with their news feed. Even the FTC is getting involved. But guess what, everybody: if you use the Internet, you’re the subject of hundreds of experiments at any given time, on every site. That’s how websites work.
Rudder and the technologists who have defended Facebook are missing something important. There's a big difference between straightforward A/B testing -- presenting two different versions of a site to different groups of users in order to see which works better -- and consciously presenting false information or otherwise skewing emotionally laden data. One is completely acceptable tinkering designed to improve usability; the other is irresponsible behavior that treats human beings like lab rats and their emotions as playthings.
Sure, the line that separates one from the other is murky and confusing. But there is a line. Silicon Valley needs to start asking itself the question: How would our users feel if they knew we were feeding them false information? If they don't understand the natural anger and betrayal that would likely ensue, then maybe they shouldn't be running websites that are designed to promote human connection.
The scary thing is, when you see a title like "We Experiment on Human Beings!" you start to realize that many of these people don't even realize that there is a line that shouldn't be crossed in the first place.