"Virtual terrorism": Far-right trolls are targeting marginalized groups on Zoom calls

An increase in group videoconferencing has given white nationalists a new avenue to harass and spread hate

By Nicole Karlis

Senior Writer

Published May 29, 2020 6:33PM (EDT)

Woman in front of a device screen in video conference (Getty Images)

On May 14, thirty-one residents of an East Oakland neighborhood joined a videoconference call with their neighborhood services coordinator to hear updates about upcoming community events and resources available to residents. The meetings, which took place regularly in person before the pandemic, had recently moved to the videoconferencing app Zoom. Then, five minutes into the call, the number of attendees jumped to 72.

The uninvited guests quickly overtook the meeting — first by chanting the n-word, then by taking control of the screen. The trolls drew swastikas and displayed pornographic images for all to see.

"Then we restarted and it happened again — and it was just within less than 10 seconds that we'd hung up," Marla Williams, who is the chairperson of Oakland Neighborhood Crime Prevention Council's Havenscourt neighborhood, told Salon. "I ended up with about 14 or 15 attendees, once we got it all under control."

This wasn't the first time Williams witnessed this kind of virtual attack.

"It happened while I was on a call with the mayor before," Williams said. "It's very unfortunate, and it's very important that there's still this type of this type of mentality out there."

There have been hundreds of reports of far-right and white nationalist trolls, organized or acting alone, virtually attacking Zoom meetings across the country — chanting slurs, posting hate symbols, even displaying child pornography. In many cases, African Americans were specifically targeted. Because of the design of Zoom's user interface, shutting out or kicking out trolls is neither intuitive nor easy, particularly in public meetings; often, an expelled participant can simply rejoin immediately under a different alias and resume the harassment.

At the University of South Carolina, the African American Association of Students' annual cookout on Zoom was interrupted by entrants chanting racial slurs and raising a Nazi flag. A congregation in Massachusetts was interrupted by footage of a Ku Klux Klan meeting that showed a cross burning. Academic institutions, left-wing political parties, and a Holocaust memorial meeting have all been virtually attacked, too.

While this nascent phenomenon has been dubbed "Zoom bombing," some question whether the term detracts from the gravity of the situation, making it sound like a prank rather than digital terrorism. An April investigation by the New York Times found several active message boards on Reddit and 4chan where people were gathering to organize Zoom attacks.

"What we've learned ... is that this is not 'oh, kids being bored,' and deciding to be destructive," Jade Magnus-Ogunnaike, campaign director of progressive nonprofit civil rights advocacy organization Color of Change, told Salon. "This is something that's very typically happening to marginalized groups in the United States, and it's organized." Magnus-Ogunnaike noted that Color of Change had specifically studied white nationalist and hate groups' online behavior.

Indeed, evidence suggests these attacks are being conducted by organized white supremacists. On March 24, 2020, a webinar hosted by a Jewish student group was interrupted by a man who pulled down his collar to reveal a swastika tattoo on his chest. A day later, a similar incident occurred during an online class hosted by a Jewish Community Center. The Anti-Defamation League (ADL) Center on Extremism studied screenshots of both and believes the infamous white supremacist hacker Andrew Alan Escher Auernheimer, also known as "weev," was behind the attacks.

"We know of some actual known extremists who have engaged in Zoom bombing," Oren Segal, vice president of the Anti-Defamation League Center on Extremism, told Salon. "It's something that we think law enforcement should be made aware of and we've recommended to anybody that they need to let local law enforcement know what the problem is." Segal explained that "there are some limits to what law enforcement can do" in the case of Zoom terrorism, often because the perpetrators do not show their faces nor use their real names. 

Segal added that there is a difference between organized extremists and trolls, but for the recipients of this hate, it doesn't matter.

"The impact of creating fear and anxiety and disrupting, and terrorizing people is the same," Segal said.

When asked if Zoom terrorist attacks are rising, Segal said hate online is nothing new, "but what we're seeing with the Zoom bombings is an extension of the exploitation of technology to spread hatred and harassment."

Salon spoke with another person, who asked to remain anonymous for fear that her political organization would be targeted again, whose meetings were disrupted on two separate occasions. The first was in early spring.

"It was definitely white nationalists," she told Salon. "They first came on with sound and it was just the n-word to a beat, the word over and over again, that happened first then we muted them and they took themselves off of mute."

"It really was, you know, super disturbing to the meeting and threw it off and, and just really upset people and upset everybody," she added.

In the second instance, during a call with 700 participants, an interloper took over their screens to broadcast child pornography. The group reported the incident to Zoom and to the authorities via the National Center for Missing and Exploited Children, but it has not heard back from either.

On March 30, the Federal Bureau of Investigation issued a warning about these attacks, stating that the agency had received "multiple reports of conferences being disrupted by pornographic and/or hate images and threatening language." On May 20, the FBI warned that it had received more than 195 reports of incidents throughout the United States and in other countries in which Zoom meetings were disrupted with child sexual abuse material.

Color of Change has been leading a campaign directed at Zoom since March, when one of the organization's members was attacked while presenting his Ed.D. dissertation.

"A moment I had worked my whole life for turned into a nightmare because Zoom didn't prioritize my security," Dr. Dennis Johnson said in a press statement. "I want Zoom to know that my experience wasn't an outlier, but part of a broader pattern of racist and misogynist hate Black users experience on Zoom."

Dr. Johnson created a petition demanding that Zoom develop solutions to protect marginalized groups on its platform. One of the demands, hiring a chief diversity officer, was met this week. However, Color of Change is calling for more safeguards, including community agreements and warning messages shown before users enter a meeting.

"Color of Change not only drove the fight for them to create a Chief Diversity Officer position —and we're happy that that happened — but Zoom has a very long way to go in order to be a safe and usable space for Black communities and other marginalized communities," Magnus-Ogunnaike, Color of Change's campaign director, told Salon.

Magnus-Ogunnaike said she suspects the anonymous nature of the web is emboldening white nationalists to terrorize marginalized communities online.

A spokesperson for Zoom said the company is "deeply upset" by these "types of incidents" and "strongly condemns such behavior."

"We take meeting disruptions extremely seriously and where appropriate, we work closely with law enforcement authorities," the statement said. "We encourage users to report any incidents of this kind to Zoom and law enforcement authorities so the appropriate action can be taken against offenders."

Segal agreed that companies like Zoom need to think in advance about how their platforms can be weaponized for hate and harassment.

"I think what is particularly damaging about it is that is at a time where there's extra fear and anxiety and uncertainty, and the need to connect with people, many who are isolated is more important than ever before," Segal says. "It's that element of like 'Are there no safe spaces at all anymore?' By the way, that's exactly what extremists do, right? They never miss an opportunity to double down and leverage a crisis."



Nicole Karlis is a senior writer at Salon, specializing in health and science. Tweet her @nicolekarlis.
