TikTok's algorithm is pushing out extremist and violent content to 13-year-olds

"There is this kind of an opportunity for those incel manosphere influencers to tap into, to young boys especially"

By Areeba Shah

Staff Writer

Published March 25, 2023 6:00AM (EDT)

Teenage boy is using smartphone at home in bed (Getty Images/EMS-FORSTER-PRODUCTIONS)

TikTok's algorithm is flooding teens with extremist content, including videos explicitly promoting suicide and violence, according to new research by corporate accountability group Ekō.

Even 10 minutes of low-level engagement with harmful content can lead accounts to encounter videos promoting suicide and violence, the report found. Such content can often influence users to take action, resulting in real-world violence.

Ekō set up nine different accounts listing their age as 13, the youngest age at which users can join the platform. After the accounts engaged with videos promoting or alluding to suicide, their For You pages surfaced videos of guns being loaded and text suggesting suicide.

"Toxic algorithms force users down a deadly spiral and keep them there by feeding them harmful content," said Ekō campaigner Maen Hammad. "This is especially concerning when we think about kids, who require important safeguards while navigating an online life. Even more alarming is how easy it is for kids to fall into this spiral."

Hashtags carrying suicide-related content, readily available to children on the platform, amassed more than 1.43 million posts and 8.8 billion views, the researchers found.

Even though much of the content being pushed to teenagers breaches TikTok's own community guidelines, little has been done to stop it from spreading.

According to TikTok's community guidelines, the platform prohibits users from sharing content that depicts suicide or involves suicidal ideation or behavior, and it bars hateful ideologies that praise, promote, glorify or support misogyny.

And yet, this type of content continues to circulate on TikTok, where two-thirds of American teenagers congregate.

"The experiment this report is based on does not reflect genuine behaviour that we see on TikTok," a TikTok spokesperson said in a statement to Salon in response to the Ekō report. "The safety of our community is a top priority, and we work hard to prevent the surfacing of harmful content on our platform by removing violations, publishing regular transparency reports,  adding permanent public service announcements on certain hashtags, and surfacing helpful resources with guidance from experts. We also balance these efforts by supporting people who come to TikTok to share their personal experiences in a safe way, raise awareness and find community support."

"We have to stop treating some of this content like it's free speech," said Jill Garvey, chief of staff at Western States Center. "They should be following their own guidelines for removing harmful content and they should be putting up those guardrails so that people who are so young and vulnerable can't access that."

Ekō's researchers identified a network of harmful suicide, incel and drug content that was easily accessible to a 13-year-old's account, some of it reachable in as few as three clicks.

"We know that when people are exposed to this content online, younger folks are more likely to translate that to some sort of real-life harm," Garvey said.

In several instances, extremist or violent content has spread quickly on social media platforms, racking up thousands of views before being taken down.

More than 25 videos and accounts were taken down from the platform after Ekō flagged them for breaking the community guidelines, Hammad said. Some of these videos had been up for months and amassed millions of views, "highlighting real failures to protect TikTok's users."

Because automated content moderation is difficult for platforms to implement effectively, much of the work falls to human moderators, whose time and bandwidth are limited, pointed out Gianluca Stringhini, a professor at Boston University who studies cybersecurity and online safety.

"Recommendation algorithms are designed to maximize the time that users will spend on the platform and their engagement, and unfortunately that type of content fits the bill," Stringhini said. 


Even as platforms are doing their best to remove and demote content that does not comply with their terms of service, some of the most dangerous types of content are still spreading online.

TikTok is filled with content related to the incel and "manosphere" communities that are easily accessible for a 13-year-old account, Ekō found. 

Incels, or "involuntary celibates," are an online community comprised of men who blame women and society for their lack of romantic and sexual success. A subset of the online misogynist manosphere, incels believe that women have too much power and ruin their lives by rejecting them.

"Incels are the most violent sector of the manosphere, and have perpetrated a range of deadly attacks against women," according to the Anti-Defamation League.

Ekō found that TikTok was full of videos celebrating figures popular in these communities, like Elliot Rodger, who killed six people in a stabbing and shooting spree in Isla Vista, California, in May 2014, and Andrew Tate, a retired kickboxer and online influencer who was arrested on allegations of human trafficking, rape and organized crime.

"There's a whole economy built up around this," said one researcher who spoke with Salon on background. "There are a lot of influencers who have just kind of made their bones by speaking to male resentment."

This became especially damaging during the coronavirus pandemic, when younger users were cut off from in-person social interaction, the researcher pointed out.

"There is this kind of an opportunity for those incel manosphere influencers to tap into, to young boys especially because [younger users] are not hearing any kind of counter-arguments from people in their spheres," the researcher said. "They're seeing what they see online and it's very easy to find more of it."

TikTok offers an ecosystem of dark and depressing content around themes of death, toxic relationships, sexual abuse, mental health, and misogyny, Hammad said.

The platform's algorithm provided "highly misogynistic" content that amplified hateful messages and rhetoric suggesting "women are evil" and that "modern women are cowards," Ekō found. 

"What our research has shown is that the platform's algorithm does not differentiate between a cute cat video or a video offering at-home suicide concoctions," Hammad said. "So if a young user has even minimal engagement with a potentially harmful video, the algorithm will be triggered to serve up more of this content – no matter how dangerous it is."

By introducing some of these extremist ideas slowly, like suggesting women have more power than men in society, influencers like Tate are able to push users down the rabbit hole toward content that promotes violence.

Ekō found a network of videos that targeted young boys with highly misogynistic and often violent content. While other platforms such as Reddit have recognized the toxicity of the incel movement and banned its communities, incel content remains widespread on TikTok.

But this is not a TikTok problem; it is a Big Tech problem, Hammad argued. Banning the app won't change anything, he added, but comprehensive legislation holding all platforms, including Instagram, YouTube and Facebook, accountable would.

As remedies, state lawmakers should pass the various Age-Appropriate Design Code bills to keep young people safe, Hammad suggested. He added that Congress should also pass comprehensive laws to rein in Big Tech, using the European Union's Digital Services Act as a blueprint.


By Areeba Shah

Areeba Shah is a staff writer at Salon covering news and politics. Previously, she was a research associate at Citizens for Responsibility and Ethics in Washington and a reporting fellow for the Pulitzer Center, where she covered how COVID-19 impacted migrant farmworkers in the Midwest.
