
Violent online content ‘unavoidable’ for UK children, Ofcom finds

Violent online content is now ‘unavoidable’ for UK children, with many first exposed to it while still in primary school, the media watchdog has found.

Every British child interviewed in the Ofcom research said they had watched violent content on the internet, ranging from videos of local school and street fights shared in group chats to explicit and extreme graphic violence, including gang-related content.

Children knew that more extreme material existed deep within the web, but they did not seek it out on their own, the report concluded.

Following the findings, the NSPCC accused tech platforms of being lax and “ignoring their duty of care to young users”.

Rani Govender, the charity’s senior policy officer for online child safety, said: “Children are increasingly telling us that unintentional exposure to violent content has become part of their online lives. That’s very worrying.

“It is unacceptable that algorithms continue to push out harmful content that we know can have a devastating mental and emotional impact on young people.”

The study, based on interviews with families, children and young people, is part of Ofcom’s preparations for its new responsibilities under the Online Safety Act, passed last year, which gave the regulator powers to crack down on social networks that fail to protect users, particularly children.

Gill Whitehead, director of Ofcom’s Online Safety Group, said: “Children should not feel that seriously harmful content, including depictions of violence or the promotion of self-harm, is an inevitable or unavoidable part of their online lives.

“Today’s research sends a powerful message to technology companies that now is the time to act to ensure they meet their child protection obligations under the new online safety laws. We will be consulting on how we expect the industry to make sure children can enjoy a safer, age-appropriate experience online.”

Children and young people interviewed by Ofcom mentioned almost every major technology company, but Snapchat and Meta’s apps, Instagram and WhatsApp, were mentioned most often.

“Children described how private, often anonymous accounts exist for the sole purpose of sharing violent content, most commonly local school and street fights,” the report states. “Nearly all children who interacted with these accounts in this study reported finding them on Instagram or Snapchat.”

“There’s peer pressure to act funny,” said an 11-year-old girl. “Inside they feel uncomfortable, but on the outside they pretend to be funny.” Another girl, aged 12, said she felt “mildly traumatized” after being shown videos of animal abuse. “Everyone was joking about it,” she said.

Many of the older children in the study “appeared to have become desensitized to the violent content they encountered”. Experts expressed particular concern that such content was normalizing violence offline, reporting that children were more likely to laugh and joke about serious violent incidents.

On some social networks, exposure to graphic violence has come from the very top. Twitter, now known as X after its acquisition by Elon Musk, on Thursday removed a graphic clip purporting to show acts of sexual mutilation and cannibalism in Haiti after it went viral on the network. The clip had been reposted by Musk himself, in a tweet responding to a report by the news channel NBC that accused him and other rightwing influencers of spreading unverified claims about the turmoil in the country.

Other social platforms offer tools to help children avoid violent content, but the children found them of little help. Many, some as young as eight, told researchers they knew they could report content they did not want to see, but lacked trust that the system would work.

For violent content shared in private chats, children did not believe platforms would impose significant repercussions on those who posted it, and feared that reporting it would see them labelled a “snitch”, risking embarrassment or punishment from their peers.

The rise of powerful algorithmic timelines, such as those on TikTok and Instagram, has added another twist: children feared that spending even a little time on violent content, for example while reporting it, would lead to more of it being recommended to them.

Experts involved in the research expressed concern that the violent content was affecting children’s mental health. In a separate report published on Thursday, the children’s commissioner for England revealed that more than 250,000 children and young people were waiting for mental health support after being referred to NHS services, meaning one in 50 children in the UK is on a waiting list. The average wait for children who accessed the services was 35 days, but last year nearly 40,000 children experienced waits of more than two years.

A Snapchat spokesperson said: “There is no place for violent content or threatening behaviour on Snapchat. When we see this type of content, we move quickly to remove it and take appropriate action against the offending account.

“We have easy-to-use, confidential in-app reporting tools and work with law enforcement to support their investigations. We support the law and continue to engage constructively with Ofcom on its enforcement.”

We have reached out to Meta for comment. X declined to comment.
