If you hop on to Facebook or Twitter or Instagram (or really any of the major social networking platforms), you’re probably going to see a lot of pictures. You’ll see what your friends just ate, what they looked like as toddlers, what their cat does when it’s finally confronted with its reckless behavior — stuff like that. One thing you probably won’t see? The kind of horrifying images of humanity at its most twisted that used to be far more common on the Internet, back when it wasn’t as much of a mainstream space.
And as former Gawker reporter and current freelancer Adrian Chen detailed in a fascinating recent article for Wired, that softening of the Internet’s most jagged edges did not come about by happenstance or accident. On the contrary, for each of these massive social networks, there are human beings — lots of them — whose job it is to “moderate” its users’ “content.” It sounds like a terrible job, of course; but until someone builds an algorithm capable of differentiating between “Game of Thrones” and ISIS video clips, someone’s got to do it.
Recently, Salon called Chen to discuss the content moderators’ lot, why we don’t more often hear about it and whether it’s a glimpse into the post-industrial “knowledge economy” future proponents of globalization tell us so much about. Our conversation is below, and has been edited for clarity and length.
One of the most striking quotes in the piece comes from one content moderator who says she gets "really affected" by images of "bestiality with children." So I have two questions: 1. What kind of job makes saying this sentence even possible? And 2. Is this the worst job in the world?
The job they’re doing is basically screening content that’s posted to social media or cloud storage, and just making sure that whatever is posted fits the standards of the company that is running the service. They’re really kind of the police of the Internet, and this requires seeing thousands and thousands of images and robotically going through this checklist to make sure they fit with the [company's] guidelines.
The amount of really bad stuff they see varies depending on the job. Sometimes it’s all really graphic pornography, if you’re screening a porn service; or sometimes maybe you only see one violent image every hour or so. I don’t know how horrible it is, really. I think that some people can deal with it better than others. I know that some people I talked to actually prefer it to doing call center work because they said it was less stressful than dealing with angry customers.
But for some of the people who are affected by it, it’s very difficult. People with young children who have to look at child exploitation, child abuse images … It’s not pleasant. I think what makes it really difficult is the pairing of these really dark images with the monotonous, robot-like nature of the work. It really seems to feel like adding insult to injury.
To your point about people responding to it differently, the sense I got from the piece was that workers in the Philippines are a lot happier with their lot than their counterparts in the States.
One of the things I didn’t really get into in the piece was that for the people in the Philippines, this was a career, kind of. When they join an outsourcing firm it’s typically a full-time job; you get benefits, you have a community, and this is a very hyped-up industry in the country and everybody wants to get to be a part of it. In that sense, a lot of people in the Philippines are, at least initially, excited to have found work in this industry and also a steady job.
But in the U.S., it’s way more likely that the job is going to be on contract; the people who I spoke to at Google were hired on a one-year contract. They didn’t really have benefits. They could renew their contract after a year, but it was really just a temporary thing. They were also working at YouTube among all the normal YouTube and Google employees on this beautiful campus and it seems like they felt like second-class citizens, and that definitely added to the stress of the job.
It shows that it’s not just the content, it’s also the situation [that is difficult for American workers]. For the Filipino workers, it seems like all of the added support and permanence of the job helped them deal with it (although they were obviously all being affected by the content, too).
So it sounds like the job's position as a status symbol in the respective cultures had an impact on how much workers were able to handle its indignities.
Yeah, that was kind of surprising to me — that it seemed like the people who were doing it in America were feeling much more disposable than those in the Philippines.
Maybe part of why the U.S. workers feel disposable is because their existence is so hidden? We've had these social networking platforms for a while now, but your story is one of the very few I can think of that even bothered to look at this section of the Internet economy.
It’s kind of crazy how completely hidden it is. There have been some small stories about it … but nobody had really looked at it as a central part of what’s going on in social media, which it is. I think that’s because there’s an idea that this must be automated, that when you hit "report" on Facebook or Twitter or YouTube, they make it seem as machine-like as possible. You press a button and then you click some stuff and then it goes away and you never know what happened to it. There’s really an opacity about it.
Any guess as to why?
I think that the companies obviously don’t want to talk about their worst content; and I also think they're really sensitive about outsourcing. Facebook outsources a lot of this work … I think if people realized their content was going to the Philippines they might be a little nervous about that. These [social media] companies just want to make it seem like their entire product is being created by a couple thousand really rich 20-year-olds in Silicon Valley.
How much do you think that disconnect — between what these tech giants want people to think about how their products work and the reality of life behind the curtain for their workers — bothered the people you spoke to?
I think there are ways that the secrecy surrounding it really impacted them. A guy [who did content moderation for] YouTube I talked to said that whenever he would see a really terrible image and he would send it along, he would never know what happened to it after that. He would report something to the police and he never knew what happened next. And he just felt very disheartened by that, because it often was a really intense thing he’d seen and got upset by and then it just would go away and be totally decontextualized.
In the Philippines, people aren’t allowed to even talk about their job with their co-workers who aren’t in the same division. I know they’ve done studies of online crime investigators who have said they feel isolated because they can’t talk about this stuff, because it’s just really unpleasant and you don’t really want to share it with other people. I think anything that increases the sense of isolation and that you’re contaminated by this stuff will increase the unpleasantness of the experience.
I wonder if these companies are hesitant to be more forthcoming about how the sausage is made because they know some of their most influential customers — people in the middle and professional classes — are starting to see globalization as a threat to their futures, too, and not just a problem for blue-collar folks and the working class.
I talked to Sarah Roberts, a media studies scholar at the University of Western Ontario, and I wasn’t able to include this in the piece but … she asked, [what if] the “knowledge” work that was supposed to be the really good jobs of the future actually turns out to be as dehumanizing and unpleasant as the worst [forms of physical] labor that have always existed?
I think that as we go on you’re going to see more and more jobs like this as being just what people do. I think there’s going to be tons and tons of menial information jobs, even more than there already are, that take this assembly-line approach to content … Because all this big data ultimately rests on a huge amount of labor from people that is kind of invisible.