When Facebook polices the trolls, its workers may pay a price

As Facebook steps up moderation of gruesome content and personal abuse, employees may face new risks

Published May 4, 2017 6:27PM (EDT)

(Getty/PeopleImages)

On Wednesday Facebook CEO and co-founder Mark Zuckerberg announced that the social-media giant would hire 3,000 new employees to help monitor the daily deluge of inappropriate content, hate speech, child exploitation and other digital abuse that streams ceaselessly across the platform.

"Over the next year, we'll be adding 3,000 people to our community operations team around the world — on top of the 4,500 we have today — to review the millions of reports we get every week, and improve the process for doing it quickly," Zuckerberg wrote in a post published to his Facebook page.

The move is a response not only to the vicious culture of trolling the platform inadvertently enables, but also to a series of grim, highly publicized acts shared with the service's audience of as many as 1.86 billion monthly users.

In particular, the Facebook Live video platform has been used to broadcast several killings in real time. In Cleveland, the random shooting of a 74-year-old man was streamed on the service. In Thailand, a man documented the murder of his 11-month-old daughter on the platform; his video remained visible on Facebook for roughly 20 hours.

Increased moderation is, of course, a welcome initiative from the tech giant, even if the task seems daunting not only for the company as a whole but also for the individual moderators who will have to police Facebook's Wild West.

In an interview with Mashable, David Ballard, an expert in creating psychologically healthy workplaces and an assistant executive director for Organizational Excellence at the American Psychological Association, called the job "new territory."

"It’s essentially vicarious traumatization," Ballard said, noting that the workers are "not in the life-threatening situation themselves, but they're viewing a situation that's overwhelming, extreme, and upsetting." He continued, "Finding a good fit for individuals who can function in that environment, exposed to that type of content, is really going to be important."

Ballard suggested that simply bringing on entry-level candidates won't cut it and that prospective employees will need to undergo proper evaluation. Mashable writer Rebecca Ruiz added in the same piece that experts are aware, "from previous reporting . . . that people who moderate online content, including abusive language and still images, are often poorly paid and experience negative psychological consequences."

It's not yet clear whether Facebook will hire contractors or full-time employees who receive benefits. The company certainly has a murky track record when it comes to content moderation.

As The New York Times reported:

Despite Mr. Zuckerberg’s pledge to do a better job in screening content, many Facebook users did not seem to believe that much would change. Hundreds of commenters on Mr. Zuckerberg’s post related personal experiences of reporting inappropriate content to Facebook that the company declined to remove.

Most of the company’s reviewers are low-paid contractors overseas who spend an average of just a few seconds on each post. A National Public Radio investigation last year found that they inconsistently apply Facebook’s standards, echoing previous research by other outlets.

In his Wednesday post, Zuckerberg seemed to be plotting a different course for the work ahead.

"In addition to investing in more people, we're also building better tools to keep our community safe. We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer," he wrote.

"No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need," he added.

By Charlie May
