"Alt-right" vs. YouTube: Hitting white supremacists where it hurts

Forget Facebook and Twitter. YouTube is the alt-right's lifeblood, and now the video platform is fighting back

By Amanda Marcotte

Senior Writer

Published March 8, 2018 5:00AM (EST)

(Shutterstock/Salon)

The rat's nest of right-wing conspiracy theorists and white nationalists that makes up the "alt-right" community online now appears to be in a full-blown crisis. After every high-profile mass shooting, "alt-right" thought leaders like Alex Jones of Infowars get to work spreading conspiracy theories: They claim the whole thing was a "false flag" operation and that crying survivors on TV are "crisis actors." But this time, when the targets of these outrageous accusations were the teenage survivors of the Parkland, Florida, school shooting, the "alt-right" drew significant, and negative, coverage from the mainstream media.

The online troll army has weathered its share of bad press before, but this time the blowback is hitting them where it hurts: YouTube. The Google-owned video platform has threatened to remove the accounts of prominent "alt-right" vloggers and has suspended the live-streaming privileges of others. Advertisers are pulling out of Infowars after a CNN report showed that mainstream companies like Nike and 20th Century Fox had ads running on the show's YouTube clips. Now the reactions of the "alt-right," and of its slightly more mainstream allies in conservative media, are starting to sound shrill with panic.

YouTube is probably the "alt-right's" "most important social network for growth," said Melissa Ryan, a visiting fellow at Media Matters. "They’re freaking out, and with good reason. They’re freaking out differently than when they’re banned from Twitter."

“This isn’t the usual, ‘We’re under an attack, thank you for the support, share links.’ No, no, no. This is — we’re done, we’re off the internet, they’re banning us,” complained conspiracy theorist Mike Cernovich on a recent episode of Infowars, during which he and Jones vowed to organize a protest of Google at this year's SXSW conference in Austin, Texas.

Cernovich added, "They want us in gulags, Alex," and darkly warned that "death camps are coming."

Later in the same broadcast, Jones claimed he was being targeted because he had "busted over 30 pedophiles killing children who were being held as slaves." (He hasn't.)

Gavin McInnes of CRTV offered a more pedestrian xenophobic theory, suggesting that "everyone at these social media companies has an accent" and holds "H1-B visas with an axe to grind." (In other words, blame the immigrants.)

More mainstream conservative voices, running interference for white nationalists as they typically do, blamed YouTube's "trusted flagger" program. The Daily Caller, the Heritage Foundation and Tucker Carlson have all claimed the Southern Poverty Law Center was a trusted flagger and denounced YouTube for supposedly letting the venerable anti-racist organization target white supremacist content for removal.

In reality, YouTube isn't taking down conspiracy-theory videos and white supremacist rants nearly fast enough. But the panicked reaction speaks volumes about why YouTube, far more than Facebook or Twitter, has become the go-to online platform for spreading the far-right message.

Much of the success of YouTube's "conspiracy community," said Rebecca Watson, a progressive YouTube personality, stems from the platform's character as "a great, unmoderated echo chamber." She often creates videos meant to counter the conspiracy theories that frequently go viral on YouTube.

"There’s not an easy way to know what is in a video without watching a video," explained Jared Holt, a researcher for Right Wing Watch, a project of People for the American Way. "Alt-right" YouTubers understand that this makes it impossible for the service to devote adequate manpower to moderating content, given the enormous volume of video uploaded every day. 

There's also "not a lot of pushback against these people" from progressives on the service, Holt added. In a sense, YouTube allows white nationalists to hide in plain sight. Few liberals have the stomach or the time to actually watch and respond to right-wing videos that spread fabricated theories or outrageous falsehoods. Text-based social media posts, as on Twitter or Facebook, attract far more criticism; video can transmit far-right ideas to a much wider audience while encountering relatively little pushback.

Rebecca Watson also noted that despite recent attempts to crack down on "fake news," YouTube and similar services rely on "algorithms that are based on popularity and 'upvotes,' and are easily gamed." "Alt-right" users have become experts in exploiting such algorithms to get more eyes on their videos.

Last week, Kelly Weill of the Daily Beast exposed the ways "alt-right" trolls push their content on YouTube, often by creating "fake accounts in order to boost visibility of their preferred videos and bury videos they don’t like."

This matters, said Ryan of Media Matters, because the "alt-right" is focused primarily on recruiting young people and there's "a readymade audience of young people" who "think of YouTube the way [older generations] think of our cable programs on TV." If a young person watches a right-wing video out of curiosity, she noted, the autoplay feature on the site will immediately queue up a related video. Before you know it, that viewer has consumed "five or six videos with those same points being repeated over and over again," making the arguments seem more acceptable, or at least less outlandish, than they otherwise would. 

Recruitment matters. So does money — perhaps even more. As Ryan put it, "many of these guys have built their profiles largely on video," and generating ad revenue on YouTube is a primary source of income. 

“These people that make YouTube content full-time with a white nationalist or conspiracy-theory flavor to it," Holt explained, now face "a threat to the way of living they’ve created for themselves.”

Ad revenue on pre-recorded clips isn't the only way the "alt-right" makes money on YouTube, as Holt explained.

"People sympathetic to the 'alt-right' or white nationalist identities have been using live-streaming on YouTube to subvert community guidelines, because it’s hard to moderate something in real time," he explained. "During live-streaming, content creators have access to a feature called superchats where live viewers can donate money to have questions or comments pinned to the top of the live chatroom next to the video. It’s a way for people who are hosting extremist content to turn a profit.”

Holt believes the biggest blow to the "alt-right" so far has been the suspension of live-streaming privileges for a number of prominent "alt-right" figures, including Baked Alaska, Andy Warski and Jerome Corsi. That move sparked a semi-literate but fully enraged Twitter response from Corsi.


Ultimately, YouTube's disciplinary actions fall well short of what's actually needed to combat the "alt-right" problem on its platform. While the company has issued a number of strikes, taken down a few videos and, perhaps most important, cut off the "superchat" funding stream for a handful of figures, most "alt-right" vloggers are still free to post videos full of far-right ravings and outright lies.

Ryan believes that the focus on individual accounts is part of the problem, and suggested a more systematic approach, starting with disabling autoplay, at least for users who are minors. She also proposed that YouTube build tools to detect and take down the "botnets and sock-puppet accounts" that the "alt-right" creates by the hundreds to game YouTube's algorithms and raise the movement's collective profile.

Ryan further noted that "we knew ahead of time that conspiracy theories were going to spread" after the Parkland shooting, because that's how "alt-right" trolls respond to every major mass shooting. When major news events like that occur, Ryan would like to see proactive efforts to keep conspiracy theories off YouTube in the first place, rather than a piecemeal campaign to take them down after they've gained an audience. 

But while YouTube could certainly do more to slow the "alt-right" invasion, this entire debacle should also be a warning sign to journalists and activists. While the political class has been chattering about the impact of Facebook and Twitter, services those people are more likely to use on a regular basis, the "alt-right" has been quietly using YouTube to recruit, raise money and grow. The service provides a robust and rapt audience of young people while remaining nearly invisible to those best positioned to debunk the lies and offer counter-arguments. It's a space that liberal and progressive commentators could and should enter more aggressively to disrupt the "alt-right's" hegemony, because the best remedy for lies is always truth.



Amanda Marcotte is a senior politics writer at Salon and the author of "Troll Nation: How The Right Became Trump-Worshipping Monsters Set On Rat-F*cking Liberals, America, and Truth Itself." Follow her on Twitter @AmandaMarcotte and sign up for her biweekly politics newsletter, Standing Room Only.
