Despite Parler backlash, Facebook played huge role in fueling Capitol riot, watchdogs say

Deadly Capitol raid would have still happened without Parler, says Media Matters head — but not without Facebook

By Igor Derysh

Managing Editor

Published January 16, 2021 6:00AM (EST)

Mark Zuckerberg | Capitol riot (Photo illustration by Salon/Getty Images)

The far-right social media platform Parler has shouldered much of the blame for last week's Capitol riot, and may have been knocked offline for good as a result. But watchdog groups say much larger companies like Facebook bear more of the responsibility for the lead-up to the pro-Trump siege.

Amazon Web Services, which hosted Parler, took the platform offline last week after Apple and Google removed it from their app stores, arguing Parler was not doing enough to moderate content that could incite violence. Amazon in court documents detailed extensive violent threats on Parler that the company "systemically failed" to remove. Hacked GPS metadata analyzed by Gizmodo shows that "at least several" Parler users managed to penetrate deep inside the Capitol.

"From what I've seen, people were actually coordinating on Parler, logistics and tactics and things like that," Kurt Braddock, an extremism expert at American University and the author of "Weaponized Words," said in an interview with Salon. "That's a step beyond the pale. So Parler, in terms of planning and coordination, probably was the biggest player in terms of the social media environment."

Parler, which billed itself as a free-speech alternative to social networks that moderate posts and claimed more than 12 million users, no doubt helped fuel last week's violence. But its role pales in comparison to that of social media behemoths like Facebook, which is used by nearly 70% of American adults, said Angelo Carusone, president and CEO of the watchdog group Media Matters.

"If you took Parler out of the equation, you would still almost certainly have what happened at the Capitol," he told Salon. "If you took Facebook out of the equation before that, you would not. To me, when Apple and Google sent their letter to Parler, I was a little bit confused why Facebook didn't get one."

Larger companies were eager to single out Parler to avoid the "potential legal implications" from "associating yourself with an app or platform that is encouraging and inviting actions that will lead to violence," said Yosef Getachew, director of the media and democracy program at the watchdog group Common Cause.

Parler played a role in the "organizing" of the siege and amplified calls to violence, but "it wasn't just Parler, it was social media platforms across the board," Getachew said. Facebook in particular has "done a poor job of consistently enforcing their content moderation policies," he added.

This isn't just a case of "one platform is a bad actor," Getachew said. "All platforms have not done what they need to do to prohibit this type of disinformation and incitement of violence."

Sheryl Sandberg, Facebook's chief operating officer, has sought to deflect blame to other social networks following last week's siege.

"We again took down QAnon, Proud Boys, Stop the Steal, anything that was talking about possible violence last week," Sandberg said in an interview with Reuters on Monday. "Our enforcement is never perfect, so I'm sure there were still things on Facebook. I think these events were largely organized on platforms that don't have our abilities to stop hate, don't have our standards and don't have our transparency."

But available data suggests that Facebook played a much larger role than Sandberg acknowledged. As many as 128,000 people were still using the #StoptheSteal hashtag promoted by Trump and his allies as of Monday, Eric Feinberg, a vice president with the Coalition for a Safer Web, told The Washington Post. At least two dozen Republican officials and organizations in at least a dozen states used the social network to plan bus trips to the rally that preceded the riot, according to a Media Matters analysis. Media Matters also identified at least 70 active Facebook groups related to "Stop the Steal," against which the platform could have acted long before the riot. Days after the siege, Facebook's algorithm was still suggesting events hosted by some of the same groups that organized the Stop the Steal rally.

These groups didn't just spread misinformation but actively "encouraged people to attend the riot last week and to potentially arm themselves and to potentially engage in other violent acts," Getachew said. "These are the types of things from a public interest side that make it harder to monitor because the groups are closed, right? You need permission to enter and Facebook isn't doing a good enough job of actually facilitating or moderating these groups to prohibit this type of content, or to ban these groups altogether."

"To date, we've banned over 250 white supremacist groups and have been enforcing our rules that prohibit QAnon and militia groups from organizing on our platform," a Facebook spokesperson said in a statement to Salon. "We work with experts in global terrorism and cyber intelligence to identify calls for violence and remove harmful content that could lead to further violence. We are continuing all of these efforts and working with law enforcement to prevent direct threats to public safety."

Conservatives have repeatedly accused Facebook of censorship even though leaked materials obtained by NBC News show that the company has gone out of its way to ease its false information policy for conservative pages over concerns about "bias" claims. An analysis by The Washington Post found that about 40% of the 10 top-performing Facebook posts on any given day between the November election and the Jan. 6 riot were from right-wing personalities and media, and another 15% were from Trump, his campaign or his administration. National and local media outlets made up about a quarter of the top posts — and left-wing accounts barely made a blip.

Facebook's algorithm has also placed ads for body armor, gun holsters and other military equipment next to content promoting election misinformation and the Capitol riot, according to BuzzFeed News.

Facebook previously came under fire for failing to crack down on extremist content ahead of the deadly 2017 Charlottesville white nationalist rally. It was also used to organize numerous protests against coronavirus restrictions last year, including an armed incursion into the Michigan state Capitol. Facebook later removed certain pages linked to the Charlottesville rally and announced plans to remove thousands of QAnon-related accounts. These actions have all been "too little, too late," Getachew said.

Braddock believes Parler's role differs from Facebook's, however, because "it went beyond just rhetoric."

"The other social networks … have groups where people can go and discuss topics related to Trump and the election and things like that, but from what I've seen Parler was the key player in not only perpetuating the rhetoric … and serving as an amplifier for it but even planning the attack itself," he said. "So if we're developing a hierarchy of culpability for this, I think Parler is at the top of that list."

Carusone argued that Facebook "had a much bigger role" in the riot, noting that Media Matters and others "brought to their attention" numerous "red flags" they spotted in the lead-up to the riot, but Facebook managers "still didn't do anything about it."

"Apple and Google were being extraordinarily myopic and, frankly, hypocritical in singling out Parler," he said. "Not because I want to defend Parler, but the math is the math. Facebook was worse."

Numerous social networks, including Twitter, have permanently banned President Trump in the wake of the riot. Facebook CEO Mark Zuckerberg said the company would suspend the president at least until President-elect Joe Biden's inauguration next week.

Carusone called on Facebook to extend the ban permanently.

"Facebook has done all these performative things," he said. "We're giving Facebook far too much credit. We're letting them play sleight of hand. Their ban for Trump wasn't even a ban. They came out and issued a two-week suspension. … There's still this open question of, if the temperature dials back, do they let Trump back on? I think that fight and that conversation is going to be very different when we're three or six months removed from this event."

Sandberg told Reuters that the network has "no plans to lift" Trump's ban.

"This showed that even a president is not above the policies we have," she said.

Carusone predicted that Facebook will likely "backslide" because "they've done it every time … when the heat is off." He added that Facebook needs to expand its policies on moderating closed groups and broaden its threat detection beyond content on its own platform.

Getachew said that Facebook and others need to more consistently enforce their policies, and also expand them to more effectively combat disinformation and online voter suppression.

Braddock agreed that larger social networks like Facebook need to be better at "getting rid of disinformation on the platforms, because that's kind of the tie that binds all these groups together."

"The central theme in all this was 'the election was stolen,' and there's no evidence for it. But you can go on any social media platform right now and find any amount of information on that," he said. "So de-platforming is one thing … but I do think social media companies need to be better and faster at getting rid of disinformation that can have the kinds of effects we saw the other day."

Twitter, which served as a megaphone of hate for the president for years, has also faced blame for helping Trump and his allies spread misinformation. But as with Parler, its user base is a fraction of Facebook's or YouTube's. While YouTube is used by more than 70% of American adults, just 22% use Twitter, a smaller share than use social networks like Snapchat and Pinterest, according to Pew Research.

Advocates have criticized Apple and Google, which owns YouTube, for their own roles in fueling misinformation. Media Matters reported on Wednesday that Apple Podcasts and Google Podcasts have failed to crack down on QAnon-related podcasts that celebrated the Capitol siege. And YouTube has long been criticized as a "radicalization engine" over its recommendation algorithm's propensity to push users toward increasingly extreme content.

"Google's role in all of this is … significant," Carusone said. Even more than Facebook, he said, "YouTube had the worst election disinformation policy."

A Media Matters analysis found that 47 of the top 100 YouTube videos about mail-in voting contained "misinformation" and "straight-up lies."

YouTube's management "basically let it be a free for all," Carusone said. "They were very limited in terms of what they would enforce. They would demonetize some things, but their biggest problem was that they decided they were going to boost 'authoritative' content — but one of the sources they put in there as authoritative was Fox News."

Despite officially recognizing Biden's victory, Fox News has aired content more than "600 times" suggesting that the election was stolen, undermined or the product of a conspiracy, Carusone noted.

Ivy Choi, a spokesperson for YouTube, said in a statement to Salon that the company has cracked down on election misinformation.

"Over the last month, we've removed thousands of videos claiming that widespread voter fraud changed the result of the 2020 election," Choi said. "In fact, many figures that were related to or participated in the violent attack on the U.S. Capitol had their channels terminated months prior, for violating our policies. Additionally, we're continuing to raise up authoritative news sources on our home page, in search results and in recommendations, and saw that the most viewed and recommended election-related channels and videos are from news channels like NBC and CBS."

Carusone pointed to misinformation from the ardently pro-Trump propaganda shop One America News Network, which has repeatedly gone far beyond even Fox News in pushing Trump's baseless election-fraud narrative.

"They didn't take any action to neutralize the effect of the virality of One America News' videos during that time period," he added. "Because of the nature of the content, you were falling into these rabbit holes where ... before long, you were getting the Lin Wood kind of crazy stuff." (Wood is an Atlanta attorney who has consistently echoed or amplified the most far-fetched, delusional and conspiratorial claims of Trump and his supporters.)

Researchers at Cornell University published a study last year examining YouTube's "right-wing echo chambers."

The study found "evidence for a small but growing 'echo chamber' of far-right content consumption," the researchers wrote. "Users in this community show higher engagement and greater 'stickiness' than users who consume any other category of content. Moreover, YouTube accounts for an increasing fraction of these users' overall online news consumption. Finally, while the size, intensity, and growth of this echo chamber present real concerns, we find no evidence that they are caused by YouTube recommendations. Rather, consumption of radical content on YouTube appears to reflect broader patterns of news consumption across the web."

YouTube says it has consistently removed videos from OAN that violate its policies, and OAN does not currently feature prominently in its recommendations, nor does it appear in searches related to the election. All videos about the election now include a message noting that President-elect Joe Biden was the winner, along with a link to the Cybersecurity and Infrastructure Security Agency's "Rumor Control" page.

YouTube also removed more than 1.8 million channels in the third quarter of last year for violating policies regarding hate speech, harassment, incitement to violence, harmful conspiracy theories and presidential election integrity, the company reports, as well as tens of thousands of videos and hundreds of channels related to the QAnon conspiracy theory.

Despite YouTube's more proactive approach to dangerous material in recent months, it still needs greater "algorithmic transparency," Getachew said.

"These are systems that are being developed in a black box. Oftentimes the individuals who are developing these algorithms are homogeneous in that they are white men," he said. "They aren't even diverse in terms of other perspectives, to actually create algorithms where they won't lead you down these rabbit holes. We need diversity in developing these algorithms, but also we need transparency in how these algorithms are being developed, audits and other tests. ... The company shouldn't be looking for ways to maximize engagement by sending you more and more extreme content through algorithms."

Braddock said that YouTube employees have told him they are "aware" of this problem and are trying "to counter that as best they can."

"Something about YouTube that the other platforms don't have is that organizations in the counter-radicalization space have kind of taken advantage of that algorithm," he noted. "So if someone is looking at, say, ISIS videos, there are certain organizations that can embed videos that are counter-ISIS, that kind of hack the algorithm. So one benefit of the YouTube algorithm is that it can be used for the benefit of counter-radicalization. You don't really have that on something like Parler."

Carusone said it was striking that YouTube employees "themselves acknowledge" both the power and deficiencies of the recommendation engine, "because they felt the need to short-circuit it."

"Don't short-circuit it now. Fix it," he said. "YouTube [is] the one platform that probably needs to do the least amount of active enforcement by comparison to others. When YouTube makes changes to how things are monetized, and they start demonetizing stuff or cracking down on channels a little bit, creators understand that. They may complain, they may gripe, they may tear it apart. But the one thing they do is to ensure that the next video they put out doesn't fall victim to the new changes."

The social network crackdowns and the takedown of Parler have led to an explosion of new users on encrypted messaging apps like Signal and Telegram, sparking concern that extremists will now be able to hatch plots out of sight.

"Encrypted apps have their purpose in terms of protecting the privacy of users," Getachew said. "But that should not absolve companies from taking steps that prohibit the spread of disinformation, or at the very least taking steps so their platforms aren't being used to facilitate disinformation and other content that could lead to offline violence." 

"Other terrorist groups from around the world have gone to these encrypted apps," said Braddock. "None of this is good, but if there's a good thing that comes from moving to Telegram it's that it's much more difficult to coordinate large-scale events like Jan. 6 on an app like that than on a domain where many thousands of people can discuss in the same thread. So it becomes more difficult logistically, but it's problematic that there's a way for individuals like this to be able to plan in any capacity."



Igor Derysh is Salon's managing editor. His work has also appeared in the Los Angeles Times, Chicago Tribune, Boston Herald and Baltimore Sun.
