"I get really affected by bestiality": Meet the moderators who watch horrific videos so you don't have to

For eight hours a day, these Internet garbagemen stare into the dark underbelly of humanity. The job takes its toll

Published October 31, 2014 10:45AM (EDT)

This article originally appeared on AlterNet.

When you scroll through Facebook or Twitter, you expect to see photos of friends’ recent vacations, birthday well-wishes and BuzzFeed articles about porcupines eating pumpkins. What you don’t expect are graphic images of child pornography or videos of people having their heads cut off. Though many of us don't know it, a vast labor force of content moderators, largely based overseas, works day and night to keep our social media feeds free of this sort of offensive material.

In an investigation for Wired, Adrian Chen travels to the Philippines to get the behind-the-scenes stories of the men and women responsible for sanitizing the experience of American Internet users. As Chen learns, there are well over 100,000 of these laborers, many of whom work in run-down urban areas outside of Manila. Their wages range from $300 to $500 per month, and the work they do is draining. Moderators sit in front of computer monitors for hours at a time, sifting through a constantly updated stream of content that has been flagged as inappropriate: sexual solicitations, dick pics, savage street fights, suicide bombings. They repeat this cycle day in and day out, struggling to keep up with the oppressive volume of offensive content that never stops accumulating.

Needless to say, being exposed to the dark underbelly of humanity for eight hours each day takes a toll. The contractors Chen interviews describe struggling with insomnia, substance abuse, depression, and even post-traumatic stress disorder. Though some companies offer free counseling to their workers, many do not, and moderators are left to heal their psychological wounds on their own. A woman named Maria, who works as a quality assurance representative for a contracting firm, tells Chen about her coping strategies.

“I get really affected by bestiality with children,” she says. “I have to stop. I have to stop for a moment and loosen up, maybe go to Starbucks and have a coffee.” She laughs at the absurd juxtaposition of a horrific sex crime and an overpriced latte.

Though the bulk of this labor is outsourced overseas, the more specialized content screening is done here in the U.S., mostly by recent college graduates who are between permanent gigs. As Chen points out, U.S.-based moderators earn much more than those in other countries, and “a brand-new American moderator for a large tech company in the US can make more in an hour than a veteran Filipino moderator makes in a day.” But the psychological damage is the same, and the young grads screening videos for Bay Area startups are just as haunted by the images scrolling across their screens.

Rob, who used to work as a moderator at YouTube, recalls the job as an exercise in cognitive dissonance. Those who do it have to become desensitized to the graphic images they process, so they can’t allow themselves to acknowledge the emotional and psychological burden they are carrying. Rob gained weight and turned to alcohol as a release. As he put it:

“If someone was uploading animal abuse, a lot of the time it was the person who did it. He was proud of that. And seeing it from the eyes of someone who was proud to do the fucked-up thing, rather than news reporting on the fucked-up thing—it just hurts you so much harder, for some reason. It just gives you a much darker view of humanity.”

The removal of offensive content is one way our Internet experience is curated and polished without our knowledge. Another is the monitoring of the thousands of comments users post on social media networks and online publications. Anyone who has spent any time poking around the comments sections on CNN or Reddit—or AlterNet, for that matter—knows that they are often cesspools of racism, bigotry, sexism and name-calling. But while we can close our laptops in disgust, comment moderators, like content moderators, must absorb the full brunt of the burbling mass of hatred and raw human id that is the Internet.

As recent articles in the New Inquiry and Jacobin noted, the bulk of comment moderation is done by young, poorly compensated women. They are, as the Jacobin piece phrased it, “the human buffers between an outraged public and the publication itself.” Even higher-ups within a publication are rarely aware of what gets said in the comments section; only the female college grads scrubbing the threads bear witness to all of the vitriol directed at the writers, the other commenters, and often the publication itself. In the New Inquiry piece, Jason Wilson argues that “this systemic resilience relies on a precariously employed and female labor force. We must understand how they are deputized to shore up the legitimacy of institutions which have historically excluded and currently exploit them, freeing the powerful to present all of this as a democratic undertaking.”

The same case can be made for the content moderators staring at computer screens in dingy offices thousands of miles across the ocean. Because online companies rely on an invisible, silent, underpaid labor force to do the dirty work of damage control, it is too easy to forget that these gatekeepers exist. Sarah Roberts, a media studies scholar at the University of Western Ontario, tells Adrian Chen that companies and social media users alike would prefer to pretend that they do not. As she says, “It goes to our misunderstandings about the Internet and our view of technology as being somehow magically not human.” Rob, Maria, and the thousands of other moderators around the globe are living proof that it is.


By Allegra Kirkland
