Chinese-owned TikTok’s algorithm flooded teenage boys’ feeds with disturbing and negative videos about women, helping to “normalize” hateful ideologies and misogynistic tropes at school, according to a British study.
Researchers set up test accounts to mimic the habits of disaffected young men and found that after just five days, the level of misogynistic content appearing on “For You” pages had quadrupled.
The analysis was carried out by professors at University College London and the University of Kent.
According to examples cited in the study, the offending videos served to the accounts included posts about “how to deal with rude women,” “understanding female narcissists,” negative commentary on “the truth about women’s nature,” and posts promoting the view, “Don’t chase women, chase money.”
“In this way, toxic, hateful or misogynistic content is pushed onto young people, exploiting their existing vulnerabilities,” the researchers said. “Boys who suffer from poor mental health, bullying and uncertainty about their futures are at increased risk.”
The share of videos categorized as “misogynistic content,” such as videos that target or discredit women, jumped from 13% to 56% in five days.
Researchers watched more than 1,000 videos over a seven-day period.
Experts say the findings point to problems common to all social media, not just TikTok, and stressed the need to foster a healthy “digital diet” by encouraging young people to think critically about the “harmful online content” they are exposed to.
They also want Big Tech companies to be held “accountable” for their “harmful algorithmic processes.”
TikTok, owned by Beijing-based ByteDance, disputed the findings, saying the report relied on a limited sample size and that no examples of the misogynistic content were shared with its safety team for review.
“Misogyny has long been prohibited on TikTok, and we proactively detect 93% of the content we remove for violating our rules on hate,” a TikTok spokesperson said in a statement.
“The methodology used in this report does not reflect how real people experience TikTok. We work to ensure our community can enjoy a wide range of content and has the tools to create a TikTok experience that’s right for them,” the spokesperson added.
The study cites the influence of figures such as controversial social media personality Andrew Tate, who amassed huge followings on TikTok and other platforms while promoting “harmful” views about women.
Tate was permanently banned from TikTok and other social media platforms in 2022.
The study quoted one young man who described men as “oppressed” and “isolated,” saying: “I find a certain comfort in men like Andrew Tate.”
In 2022, Tate was arrested in Romania and charged by local authorities with rape and human trafficking.
Tate, who was released from house arrest last year and is awaiting further investigation, denies the charges.
The researchers created four account “archetypes” of young men who had “engaged in extreme misogyny online,” based in part on lengthy interviews with young people recruited on Discord, an online discussion hub.
Using factory-reset iPads, the researchers watched TikTok videos over seven consecutive days as if they were individuals who fit one of the four archetypes.
Videos that the user was “not interested in” were skipped.
The four “archetypes” were: TikTok users experiencing loneliness; users focused on “mental health knowledge and neurodiversity”; users focused on masculinity and dating advice; and users already “aware of generalized men’s rights content.”
“The algorithmic processes on TikTok and other social media sites target people’s vulnerabilities, such as loneliness and feelings of loss of control, and gamify harmful content,” said the study’s lead researcher, Kaitlyn Regehr of UCL. “To young people subtly exposed to topics like self-harm and extremism, it feels like entertainment.”
Critics have long accused TikTok of pushing disturbing content through opaque recommendation algorithms, and reports last year alleged that some teens were constantly served videos linked to suicide, anxiety and depression.
TikTok’s failure to police disturbing content also surfaced in a recent high-profile dispute with Universal Music Group. After negotiations over a new licensing agreement fell through, the app was stripped of access to a library of around 4 million songs by stars including Taylor Swift and boygenius.
In an open letter, Universal accused TikTok of failing to crack down on copyright infringement, “not to mention the wave of hate speech, bigotry, bullying and harassment on its platform,” and of trying to “bully” the label into a below-market contract.
TikTok denounced Universal’s claims as a “false narrative.”
Elsewhere, Grammys host Trevor Noah accused TikTok of “ripping off all artists” during the show. Universal posted a clip of Noah’s remarks on TikTok, where it has been viewed approximately 900,000 times.
In March 2023, TikTok CEO Shou Zi Chew was confronted on Capitol Hill over the impact of harmful content, including the death of 16-year-old Chase Nasca, who was allegedly bombarded with videos related to depression and self-harm.
Nasca’s parents later filed a wrongful death lawsuit against TikTok’s parent company, ByteDance.
Mr Chew also came under criticism during last week’s tense Senate hearing on online child exploitation and sexual abuse.
