How YouTube became a powerful far-right propaganda organ

A new report reveals how immensely effective YouTube is at spreading far-right propaganda

By Nicole Karlis

Senior Writer

Published September 18, 2018 6:00PM (EDT)

(Shutterstock)

YouTube, long under fire for its role in disseminating far-right propaganda, was recently lauded for its decision to remove fringe conspiracy theorist Alex Jones’ page from its platform. Yet a new report reveals just how adept the Google-owned online video platform remains at propagating far-right content, running the gamut from white supremacy to racist alt-right ideologies.

A new report published on Tuesday by the Data & Society Research Institute, an independent nonprofit, brings to light what it calls the Alternative Influence Network (AIN): a network of 65 political influencers across 81 channels who profit from broadcasting their views. Many of these influencers openly promote racism, misogyny, and white nationalism on the platform. Researcher Rebecca Lewis of Data & Society penned the report.

“The platform’s motto, ‘Broadcast Yourself,’ encourages individuals to build audiences and promote themselves outside of the confines of legacy media outlets,” Lewis explains in the white paper. “YouTube also provides financial incentives for individuals to broadcast and build audiences.”

Specifically, Lewis points to the platform’s partner program, the YouTube Partner Program (YPP), which is accessible to content creators whose channels have accrued more than 4,000 “watch hours” in the past year and have at least 1,000 subscribers.

“YouTube gives these content creators a small proportion of advertising revenue for the videos they post (YouTube keeps the rest),” Lewis explains. “Content creators can also relay their popularity on YouTube into monetary gains on other platforms.”

As a result, and with the help of outside platforms like Patreon, which allows YouTubers to solicit donations, content creators can turn their YouTube channels into lucrative careers. While YouTube and the YPP were not designed explicitly to fund fringe ideologues like Alex Jones and his wannabes, they have inadvertently helped their cause. As I have previously written, YouTube’s incentivized creator programs likely enable sensationalist and oft-controversial YouTube stars like Logan Paul, too.

Lewis manually collected data between January 1, 2017 and April 1, 2018, and discovered influencers via what she described as a “snowball approach.”

“For each guest on an influencer’s channel, I would visit their own channel (if one existed) to see who they, in turn, hosted,” she explains, noting that “the boundaries of this network are loose and constantly changing.”
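Lewis’s “snowball approach” is, in effect, a breadth-first crawl of the guest-appearance graph: start from a set of seed channels, visit each guest’s own channel, and repeat. A minimal sketch of that procedure (the function names and toy appearance data below are illustrative assumptions, not drawn from the report):

```python
from collections import deque

def snowball_crawl(seed_channels, get_guests, max_channels=100):
    """Snowball sampling over guest appearances: starting from seed
    channels, visit each guest's own channel (if one exists) to see
    who they, in turn, hosted. `get_guests` is a caller-supplied
    function mapping a channel to the channels of its on-air guests."""
    seen = set(seed_channels)
    queue = deque(seed_channels)
    while queue and len(seen) < max_channels:
        channel = queue.popleft()
        for guest_channel in get_guests(channel):
            if guest_channel not in seen:
                seen.add(guest_channel)
                queue.append(guest_channel)
    return seen

# Toy appearance graph standing in for real channel data
appearances = {
    "A": ["B", "C"],
    "B": ["C", "D"],
    "C": [],
    "D": ["A"],
}
network = snowball_crawl(["A"], lambda c: appearances.get(c, []))
```

As the quote notes, the boundaries such a crawl discovers are loose: the resulting network depends on the seeds chosen and on when the guest lists were sampled.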


Some notable right-wing figures who have made the rounds on the informal "Alternative Influence Network" include Richard Spencer, Milo Yiannopoulos, and Blaire White.

The report is alarming given YouTube’s audience and ability to reach large swaths of the American population. According to a 2018 Pew Research Center report, 73 percent of American adults use YouTube, including 94 percent of 18- to 24-year-olds. As Lewis notes, while extremist alt-right content is often thought to be found in “dark corners of the internet,” the fact is that much of it is happening on mainstream platforms like YouTube.

“This report has shown how these attempts at objectivity are being exploited by users who fundamentally reject objectivity as a valid stance,” the report concludes. “As a result, platforms like YouTube have an imperative to govern content and behavior for explicit values, such as the rejection of content that promotes white supremacy, regardless of whether it includes slurs.”

YouTube said in a statement that its users are subject to its “Community Guidelines,” which prohibit nudity and violent or graphic content, and which the company says it “enforce[s] rigorously.” Lewis, for her part, concluded in the report that it is indeed YouTube’s responsibility to govern its platform.

“Additionally, we’ve made updates over the past year to tighten our monetization policies and improve our enforcement against hate speech,” a YouTube spokesperson told Salon in a statement. “Since this research concluded in April 2018, we’ve made more updates to which channels have access to monetization features and deployed advanced machine learning technology to tackle more hate speech in comment features."




Nicole Karlis is a senior writer at Salon, specializing in health and science. Tweet her @nicolekarlis.
