Meta to Hide Content from Teens about Self-Harm, Eating Disorders

Meta, the parent company of Instagram and Facebook, announced this week that it will begin hiding content about suicide, self-harm, and eating disorders on teens' accounts, and will automatically place teens in its "most restrictive content moderation settings."

The company will be "restricting teens from seeing certain types of content on Facebook and Instagram, even if it's from friends or people they follow," Meta said in a blog post. Examples of such content include posts that discuss self-harm or struggles with eating disorders, or that "contain restricted goods or nudity," Meta said.

Additionally, Meta said it “automatically places teens in the most restrictive content moderation settings on Instagram and Facebook.”

The changes will apply automatically unless teens lie about their age when signing up.

"We already apply this [restrictive] setting for new teens joining Instagram and Facebook, and are now expanding it to teens who are already using these apps," Meta said. "Our content recommendation controls, known as 'Sensitive Content Control' on Instagram and 'Reduce' on Facebook, make it more difficult for people to come across potentially sensitive content or accounts in places like Search and Explore."

Meta is also making changes to how it handles self-harm and eating disorder content for all users, teens and adults alike.

"While we allow people to share content discussing their struggles with suicide, self-harm, and eating disorders, our policy is not to recommend this content, and we have been focused on making it harder to find," Meta said. "Going forward, when people search for terms related to suicide, self-harm, or eating disorders, we will hide these related results and direct them to expert resources for help. We already hide results for suicide and self-harm search terms that inherently break our rules, and we plan to extend this protection to include more terms. This update will roll out to all users in the coming weeks."

But some critics said the change came too late.

"Today's announcement by Meta is yet another desperate attempt to circumvent regulation and an incredible slap in the face to parents who have lost children to online abuse on Instagram," Josh Golin, executive director of Fairplay, an online children's advocacy group, told ABC News. "If the company could hide content that promotes suicide and eating disorders, why did it wait until 2024 to announce these changes?"

In October, a bipartisan coalition of 33 attorneys general filed a federal lawsuit against Meta, alleging that the company "intentionally designed and introduced harmful features on Instagram, Facebook, and other social media platforms" that were intentionally addictive to children and teens.

Photo credit: ©iStock/Getty Images Plus/kitzcorner


Michael Faust has covered the intersection of faith and news for 20 years. His stories have appeared in Baptist Press, Christianity Today, The Christian Post, the Leaf-Chronicle, the Toronto Star, and the Knoxville News Sentinel.
