Facebook's black market problem revealed

As Facebook continues to rake in cash from brands and publishers, questions arise about the real value of "likes"

Published February 14, 2014 12:43PM (EST)

Mark Zuckerberg (Reuters/Beck Diefenbach)

"Advertising your page on Facebook is a waste of money."

By the time this flat statement arrives in the YouTube video "Facebook Fraud" -- uploaded on Monday, and watched more than a million times by Thursday morning -- viewers have been led through a fascinating tour of some troubling features of Facebook's "Like"-based economy.

The story behind the video goes like this: Its creator, who goes by the handle Veritasium on YouTube but is named Derek Muller in real life, had long suspected that thousands of "likes" on a page he had paid Facebook to promote two years ago were bogus. So he created another Facebook page designed to be so purposely awful that no sentient being would click on it, and then paid the social network to promote that one too. People ended up clicking on it almost immediately, but a close examination convinced Muller that they were unlikely to be genuine fans of his horrible page. Most of them were wildly indiscriminate in their "liking" behavior, clicking the like button on hundreds and hundreds of random pages. To Muller, they appeared to be fakes generated by "click farms" -- sweatshops full of low-paid drones who spend their days "liking" stuff on Facebook.

In other words, they were useless. Fake likes don't represent real "engagement" with your page. They don't translate into comments on your posts, or result in those posts being shared with other people. And that's a big deal, for a subtle reason: Facebook's algorithm decides how prominently to feature your posts in News Feeds by paying attention to how often your friends and followers engage with your content. If you post something new, and all those people who previously liked your page proceed to actively comment on it and share it and like it, Facebook will automatically give that post a boost. But if all those thousands of people who have liked your page never show up, your content gets downgraded, and nobody sees it.
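To make the mechanism concrete, here is a minimal sketch of that dynamic in Python. Facebook's actual ranking code is secret, so the functions, names and numbers below are invented purely for illustration; the point is only the arithmetic of dilution.

```python
# Hypothetical sketch of engagement-based feed ranking. Facebook's real
# News Feed algorithm is not public; these names and numbers are invented
# to illustrate how fake likes dilute a page's engagement rate.

def engagement_rate(interactions: int, followers: int) -> float:
    """Fraction of a page's followers who interact with a new post."""
    return interactions / followers if followers else 0.0

def real_reach(real_followers: int, rate: float) -> int:
    """Assume the feed shows a post to a share of the page's genuine
    fans proportional to its overall engagement rate."""
    return int(real_followers * rate)

REAL_FANS = 2_000
INTERACTIONS = 200  # likes, comments, shares from genuine fans

# A page with only genuine fans: 200 of 2,000 engage -> 10% rate.
clean_rate = engagement_rate(INTERACTIONS, REAL_FANS)
print(real_reach(REAL_FANS, clean_rate))    # 200 real fans see the next post

# The same page after 8,000 click-farm likes. The fakes never engage,
# so the same 200 interactions are spread across 10,000 followers.
diluted_rate = engagement_rate(INTERACTIONS, REAL_FANS + 8_000)
print(real_reach(REAL_FANS, diluted_rate))  # 40 -- fakes crowd out real reach
```

Under those (invented) numbers, the same 200 genuine interactions that once earned a 10 percent engagement rate now register as 2 percent, and the post reaches a fifth as many real fans.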

So here's the deal: Muller makes a living from his YouTube videos. He pays Facebook to promote his Facebook Page featuring those videos. The problem is that he ends up with lots of fake followers from bogus likes. Not only does this deprive him of the advertising dollars he would receive from YouTube for video views -- because those fake followers never actually watch his videos -- but it also hurts his Facebook "EdgeRank," the secret sauce that determines how prominently his videos get featured in the News Feed without any paid promotion at all.

Muller's video arrived at a fortuitous moment, right in the middle of a raging debate about how tweaks in Facebook's News Feed algorithm have affected the economic viability of major online publishing sites like Upworthy and BuzzFeed. Business Insider even went so far as to accuse BuzzFeed of buying traffic from Facebook, and attributed a sharp drop in Upworthy's circulation to changes at Facebook designed to make Upworthy's content less popular. (But more on that later.) As traffic to content publishers becomes more and more dominated by social media sharing, everybody wants to understand what's inside the black box that is Facebook's News Feed algorithm. The stakes here are nothing less than the future of online publishing.

And it's scary -- or downright ridiculous -- that something as evanescent as a "like" button clicked by a poorly paid worker in Dhaka, Bangladesh, is one of the fundamental building blocks of this new economy.

* * *

Click-farm fraud isn't news. The existence of sweatshops devoted to virtual labor goes back at least as far as the popular online video game "World of Warcraft": Thousands of people were put to work accumulating virtual gold by playing the game, solely so that the gold could then be sold to real "Warcraft" players who wanted to buy high-priced in-game weapons and armor.

An AP investigation published in January uncovered scores of click farms, mostly in developing nations like Bangladesh and Indonesia. Other investigations have found similar practices. A cursory Google search reveals hundreds of firms advertising the sale of Facebook likes, as well as Twitter and Instagram followers. At the aptly named WeSellLikes.com, for example, $54.99 will buy you 5,000 Facebook likes -- a bit over a penny apiece. Anything that can be monetized will be.

This week, I briefly interviewed the proprietor of WeSellLikes (who asked to remain anonymous). He told me that he generates the likes by paying "real people" with Facebook accounts to click on the pages his customers want to boost. A veteran of the search engine optimization industry, he said that thousands of people type the keywords "buy Facebook likes" into Google every single day. The market opportunity is obvious: Italian researchers cited by the AP estimate that fake Facebook activity adds up to about $200 million a year in revenue.

Facebook has long considered paying for likes a violation of its terms of service. A spokesman for Facebook assured me that the company is always working hard to shut such operations down. And just as Google constantly tweaks its PageRank algorithm to keep up with sketchy search engine optimization techniques, Facebook will always have to upgrade its safeguards against new abuses. That's never going to change -- the techno-dialectic bot-vs.-bot game is a permanent feature of our digital landscape.

When I asked Facebook about the "Facebook Fraud" YouTube video, I was told that "Fake likes don’t help us. For the last two years, we have focused on proving that our ads drive business results and we have even updated our ads to focus more on driving business objectives. Those kinds of real-world results would not be possible with fake likes. In addition, we are continually improving the systems we have to monitor and remove fake likes from the system."

Translation: Fake likes are bad for business, because they hurt Facebook's ability to prove that its advertising actually works.

It's true that the most obvious click-farm abuses reported by "Facebook Fraud" (80,000 likes from Bangladesh, Egypt, Indonesia and Sri Lanka) occurred two years ago. And because Muller restricted the promotion of his purposely terrible page to four Western countries -- so as to exclude the obvious homes of click farms -- there is no longer the kind of smoking gun that 10,000 likes from Dhaka or Jakarta might provide.

So who knows? Maybe there really are lots of people out there in the world who happen to enjoy clicking "like" on thousands of random pages. Muller, however, remains skeptical. The people who liked his fake page showed zero signs of engagement with it. In his view, a like that "never" results in engagement "is a bogus like and should be deleted."

As for the assertion that Facebook has significantly upgraded its system?

"This is perhaps most worrying of all," Muller told me. "What they are saying is in essence: In the old days, sure, fake likes could happen, but not now. What troubles me most about this admission is they have done nothing to correct the problem. If they're aware those 80,000 likes are dead weight they should have eliminated them. And they have since benefited from those 80,000 likes ... I paid to boost posts out to these useless likes. That is a problem!"

"Here is the big problem with fake likes on Facebook," he continued. "Unlike a fraudulent click on Google, these fakes stay with you forever (even two years later when Facebook's fraud detection has moved on). They weigh on your engagement and EdgeRank because the accounts never intended to engage with you. And then you end up paying again to boost the post out to them -- and they were never real in the first place!"

* * *

On Feb. 10, Business Insider's Nick Carlson reported that Web traffic to the viral good-news sharing site Upworthy had plummeted by 46 percent since November. He theorized that changes made to Facebook's News Feed algorithm in December were responsible. It wasn't an entirely unreasonable assumption. Facebook executives have made it clear that they see the News Feed as a home for high-quality content -- they'd rather not see it overrun by memes of the day, cat pictures and algorithm-engineered headlines like "This giraffe fell down an escalator; what happened next will make you weep with joy."

Problem is, Carlson didn't have any actual evidence other than Upworthy's traffic figures. Facebook's News Feed algorithm is a black box -- only Facebook's engineers have any real clarity on how it works. And when Carlson went further and alleged that BuzzFeed's continuing traffic growth was likely the result of direct financial compensation from Facebook, he had to back down, embarrassingly, after BuzzFeed protested.

On Feb. 12, Slate's Will Oremus jumped into the ongoing conversation about Facebook traffic with the provocative suggestion that reporters had become so willing to latch on to any conspiracy theory that painted Facebook in a negative light that they had become out-and-out "Facebook truthers." He cited Carlson's Business Insider article as Exhibit A, and the excitement with which reporters seized upon Derek Muller's "Facebook Fraud" video as Exhibit B.

Wrote Oremus:

"Is it theoretically possible that Facebook could rig its news feed in this way? I suppose so. The company considers its algorithms a secret sauce, so we can’t know for sure what’s in them. But rigging the news feed would not only be unethical -- it would also be terrible for Facebook’s own business. Ads on Facebook are displayed in a few discreet slots within the news feed, and the ads you see are determined by different algorithms than the organic posts in your feed. Buying ads on Facebook, then, ought to get you one thing: ads. The news feed, meanwhile, is the core product by which Facebook attracts, retains, and engages its users, and the company has huge teams of engineers and machine-learning experts working constantly to fine-tune its algorithms to show people the posts they most want to see. If they had to go in and muck with their code every time the ad team struck a deal with the likes of BuzzFeed, they’d rebel. Meanwhile, the news feed would suffer, and users would flee."

But Oremus doesn't directly take on the dynamic highlighted by "Facebook Fraud" -- the way that ad placement and promotion influence exactly those engagement characteristics that the News Feed uses to make its own rankings. Facebook effectively allows users to pay to influence Facebook's rankings -- and if that isn't a conflict of interest, I'm not sure what is.

As for issues of ethics, it was only 18 months ago that Facebook settled Federal Trade Commission charges that the company had deceived users on privacy issues. Oremus also didn't mention the ruckus that broke out just one year ago, when changes to the News Feed had a dramatic effect on the user engagement that everyone from Mark Cuban to George Takei to New York Times tech columnist Nick Bilton was seeing from their followers. The fact that the changes occurred at the exact same time that Facebook introduced a new paid promotion service to ensure that your followers saw all your posts struck many observers as more than a little fishy.

In response, Facebook said the same thing then that it is saying now: It wants to ensure that viewers see what is most relevant and interesting. But rolling out a paid promotion service at the very moment that News Feed changes were causing startling shifts in how often your own followers and friends saw your posts still looked a lot like a shakedown to a lot of people.

It is hardly Facebook "trutherism" to observe that a publicly traded company has a profound incentive to boost revenue in order to keep its stock price high and its shareholders happy. Perhaps if we had more transparency into how Facebook works, we could reassure ourselves that the company is immune to the kind of economic pressure that might tempt it to tweak its placement algorithms to maximize revenue.

But that's the nub of the issue. That's why Facebook's "black box" deserves so much scrutiny. In this social-media house of cards, Facebook holds all the cards. Its algorithms determine how many people see your posts, and how many ads you see promoting other people's posts. By making "engagement" determine how highly the News Feed ranks your posts, and by actively selling the process of engagement, Facebook has not only defined the rules of the game; it's also the only player that knows the rules. Like the U.S. government, it is minting its own currency, regulating its use and setting interest rates, but all the while the underlying legislation and constitution defining the rules of the game are locked away in a vault.

Facebook is far from alone in behaving this way. One way or another, every social media platform and search engine writes its own rules and enforces them. The more we learn about this world, the more it looks like a remarkably unstable ecology. A tweak to a few lines of code here, and the traffic fire hose suddenly cuts off or expands a hundredfold there. Meanwhile, all over the world, dodgy virtual-sweatshop operators and search engine optimization scammers are constantly seeking ways to game the system. It's unpredictable, fragile and prone to abuse.

And it's the future! Yay!


By Andrew Leonard

Andrew Leonard is a staff writer at Salon. On Twitter, @koxinga21.
