Social media algorithms ‘amplifying misogynistic content’

Algorithms used by social media platforms are rapidly amplifying extreme misogynistic content, spreading it from teenagers’ screens into school playgrounds and normalising it, according to a new report.

Researchers found that the level of misogynistic content suggested by TikTok quadrupled over a five-day monitoring period, as the algorithm served increasingly extreme videos focused on anger and accusations directed at women.

Although this particular study focused on TikTok, the researchers said their findings are likely to apply to other social media platforms. Rather than an outright ban on mobile phones and social media, which they warned could prove “ineffective” and even counterproductive, they called for a “healthy digital diet” approach to addressing the problem.

The study, by a team from University College London and the University of Kent, comes amid renewed concern about the impact of social media on young people. A survey last week found that young Gen Z men, many of whom look up to the social media influencer Andrew Tate, are more likely than baby boomers to believe that feminism has done more harm than good.

Meanwhile, the mother of the murdered teenager Brianna Ghey called for social media apps to be banned on smartphones for under-16s after hearing evidence about her daughter’s killers’ online activity.

The UCL/Kent study, called ‘Safer Scrolling’, argues that social media’s algorithmic processes present harmful content as entertainment. It said toxic, hateful or misogynistic content was being “pushed” on young people, increasing the risk of boys suffering from anxiety and poor mental health.

Lead researcher Dr Kaitlyn Regehr (UCL Information Studies) said: “Harmful views and tropes are becoming normalised among young people. Online consumption is influencing young people’s offline behaviour, and we are seeing these ideologies seep from screens into schoolyards.”

The researchers interviewed young people who engage with and produce radical online content in order to build a set of archetypes of teenage boys who are potentially vulnerable to radicalisation. They then set up TikTok accounts reflecting each archetype’s specific interests, seeking content related to masculinity and loneliness, and monitored the videos the platform suggested over a seven-day period.

Initially, the suggested content matched each archetype’s stated interests. But after five days, the researchers found that the TikTok algorithm was recommending four times as many videos containing misogynistic content, including material objectifying, sexually harassing or disparaging women, which rose from 13% to 56% of recommended videos.

“Algorithmic processes on TikTok and other social media sites target people’s vulnerabilities, such as loneliness or feelings of loss of control, and gamify harmful content,” Regehr said. “As young people are drip-fed topics like self-harm and extremism, to them it feels like entertainment.”

The researchers also interviewed young people and school leaders about the impact of social media, finding that hateful ideologies and misogynistic tropes have moved from screens into schools and become embedded in mainstream youth culture.

Geoff Barton, general secretary of the Association of School and College Leaders, which collaborated on the research, said: “The UCL findings show that algorithms, about which most of us know little, can have a snowballing effect in which young people are served ever more extreme content, often in the form of entertainment.

“While this is deeply concerning in general, it is particularly worrying in respect of the amplification of messages around toxic masculinity and its impact on young people, who need to be able to grow up and develop their understanding of the world without being influenced by such appalling content.

“We call on TikTok in particular and social media platforms in general to urgently review their algorithms and strengthen safeguards to prevent this type of content, and on the government and Ofcom to consider the implications of this issue under the auspices of the new Online Safety Act.”

Andy Burrows, an adviser at the Molly Rose Foundation, set up in memory of Molly Russell, who took her own life after falling into a spiral of despair on social media, said: “This evidence confirms how TikTok’s algorithms relentlessly target young people and bombard them with harmful content, which within days can result in a near constant stream of videos that are unhealthy and even dangerous.

“That is why it could not be clearer that the regulator Ofcom needs to take bold and decisive action to tackle high-risk algorithms that prioritise the profits of social media companies over the safety and wellbeing of teenagers.”

Prime Minister Rishi Sunak, speaking on a visit to Northern Ireland, said: “This is why I’m glad we passed the Online Safety Act last year. It means the regulator now has powerful new powers to control what children are exposed to online.

“And if the big social media companies don’t comply, the regulator can impose very significant fines on them. Our priority now is to ensure this law is enforced.”

A TikTok spokesperson said: “Misogyny has long been prohibited on TikTok, and we proactively detect 93% of the content we remove for violating our rules on hate. The methodology used in this report does not reflect how real people experience TikTok.”

A spokesperson for Ofcom said: “Tackling violence against women and girls online is a priority for us. Our research has found that women are less confident about their personal safety online and are more likely to be affected by harmful content such as trolling.

“Under the Online Safety Act, online services such as social media and search services have a duty to protect the safety and rights of their users, and this issue is at the centre of that work.”