Google's genius plan to eradicate porn in search results

A lot of innocuous phrases have sexual images attached. Google wants you to find them

Published July 23, 2014 2:23PM (EDT)

(redmal, Asian via iStock/Salon)

It happens from time to time: a seemingly innocuous Google image search turns into a NSFW Internet escapade, with X-rated photos loading endlessly as you scroll through the results. These are exactly the kinds of searches Google is trying to prevent with its porn eradication policy, first announced in March. But, as Motherboard writer Brian Merchant recently discovered by chance, "clean" search terms -- and user complaints about the results they return -- may be exactly how Google is finding the porn to get rid of. After all, there's a lot of porn out there to find.

When Merchant searched for the term "big ol'" -- what kind of bad results could that possibly dredge up? -- he immediately encountered "aggressively repulsive results," including several images of bestiality. He contacted Google to ask how the search related to the company's porn eradication policies, and found that his "complaint" was exactly what set the removal process in motion -- ostensibly for all the porn:

I asked about their image removal policies, and whether Google proactively sought out offensive imagery to remove, or waited to receive a complaint. I didn't get a response to that query, or much clarity about what Google's internal system for removing offensive content currently is.

After a multiple-day delay, a Google rep, Jason Freidenfelds, told me that "No filter is perfect, and we're always improving our systems; we appreciate the feedback."

"We've updated things so within a few days the inappropriate results shouldn't show up anymore," he added.

Merchant was also directed to Google's removal policies, which characterize removable "offensive material" as pornography (including results for search terms with multiple meanings that might not seem offensive, but could be), bodily functions or fluids, vulgarity, graphic physical injuries, depictions of death and animal cruelty. As he points out, Google is attempting to manage a huge store of information, and it's up to the company whether or not it will allow pornography. When it comes to seeking out porn in order to eradicate it, however, there's room for skepticism about the efficacy of the project -- and about who will determine what constitutes "offensive" material in the first place. There's plenty of it out there, and much of it passes for perfectly acceptable content.


By Jenny Kutner


