TikTok pushes ‘toxic’ misogynistic videos like ‘understanding the female narcissist’ to teen boys: study

China-owned TikTok’s algorithms are helping “hateful ideologies and misogynistic tropes” to become “normalized” in schools by flooding teenage boys’ feeds with disturbing negative videos about women, according to the alarming results of a British study.

Researchers found a “fourfold increase in the level of misogynistic content being presented on the ‘For You’ page” after just five days for test accounts set up to mimic the habits of disaffected young males.

The analysis was conducted by professors at University College London and the University of Kent.

Troubling videos fed to the accounts included posts on “how to deal with disrespectful women” and “understanding the female narcissist,” negative takes on “the truth about female nature,” and one urging viewers, “don’t chase women, chase money,” according to examples cited in the study.

“In this way, toxic, hateful or misogynistic material is pushed to young people, exploiting adolescents’ existing vulnerabilities,” the researchers said. “Boys who are suffering from poor mental health, bullying, or anxieties about their future are at heightened risk.”

Videos classified as “misogynistic content” — such as those that objectified or discredited women — jumped to 56% from 13% of recommended videos over the five-day period, the study said.

Researchers watched more than 1,000 videos over a seven-day period.

TikTok’s recommendation feed bombarded test accounts with misogynistic content, according to researchers. SOPA Images/LightRocket via Getty Images

The experts said their findings suggest an issue across all social media, not just TikTok, and highlight the need to cultivate a “healthy digital diet” that nudges young people to think critically about the “toxic online material” they see.  

They also call for Big Tech firms to be held “accountable” for “harmful algorithmic processes.”

TikTok, owned by Beijing-based ByteDance, pushed back on the study’s findings, arguing the report relied on a limited sample size and that examples of misogynistic content were not shared with its safety teams for review.

“Misogyny has long been prohibited on TikTok and we proactively detect 93% of content we remove for breaking our rules on hate,” a TikTok spokesperson said in a statement.

“The methodology used in this report does not reflect how real people experience TikTok and we work to ensure our community can enjoy a wide range of content and has the tools to create the right TikTok experience for them,” the spokesperson added.

TikTok CEO Shou Zi Chew was recently grilled by a Senate panel. Getty Images

The study mentions the influence of figures such as controversial social media personality Andrew Tate, who built a massive following on TikTok and other platforms while peddling “toxic” views toward women.

Tate was permanently banned from TikTok and other social media platforms in 2022.

The study cited one young person who commented that “men are oppressed” and “isolated” and said he finds “some sort of solace in guys like Andrew Tate.”

In 2022, Tate was arrested in Romania and charged by local authorities with rape and human trafficking.

Tate, who awaits trial after being released from house arrest last year, has denied the charges.

The researchers created four account “archetypes” based in part on long-form interviews with young people found to be “engaging in radical online misogyny,” who were recruited via Discord, an online discussion platform.

Using a factory-reset iPad, the researchers watched TikTok videos for seven straight days, posing as individuals who fit one of the four archetypes.

Videos that wouldn’t “interest” the user were skipped.

TikTok said it moves quickly to remove videos that violate its policies. Getty Images

The four “archetypes” were TikTok users experiencing loneliness; users focused on “development of mental health knowledge and neurodiversity”; users focused on masculinity and dating advice; and users who are “more aware of some generalized men’s rights content.”

“Algorithmic processes on TikTok and other social media sites target people’s vulnerabilities—such as loneliness or feelings of loss of control—and gamify harmful content,” said UCL’s Kaitlyn Regehr, the study’s principal investigator. “As young people microdose on topics like self-harm, or extremism, to them, it feels like entertainment.”

Critics have long accused TikTok of pushing disturbing content through its murky recommendation algorithm – with one report last year alleging some teens are served a constant stream of videos related to suicide, anxiety, and depression.

The app’s failure to crack down on disturbing content also surfaced during TikTok’s recent high-profile spat with Universal Music Group — which yanked access to its library of some 4 million songs from stars such as Taylor Swift and boygenius from the app after talks on a new licensing deal collapsed.

Universal said in an open letter that TikTok had failed to crack down on copyright infringement, “let alone the tidal wave of hate speech, bigotry, bullying and harassment on the platform,” and accused the company of trying to “bully” its way to a below-market-value deal.

TikTok blasted Universal’s claims as a “false narrative.”

Elsewhere, Grammy Awards host Trevor Noah blasted TikTok during the show for “ripping off all of these artists.” Universal posted a clip of Noah’s remarks on TikTok, where it has been viewed nearly 900,000 times.

In March 2023, TikTok CEO Shou Zi Chew was pilloried on Capitol Hill over the effects of harmful content – including the death of 16-year-old Chase Nasca, who died by suicide after allegedly being bombarded with videos related to depression and self-harm.

Nasca’s parents have since filed a wrongful death suit against TikTok parent ByteDance.

Chew was also targeted for criticism during his appearance at last week’s tense Senate hearing on online child exploitation and sex abuse.