
Meta adds new age-based restrictions to Facebook, Messenger to curb inappropriate content

Parents are going to "love" this.

Meta has added "teen accounts" to Facebook and Messenger to limit who can contact minors and to screen the content they are exposed to.

On Tuesday, the tech giant announced that users under the age of 18 will automatically be enrolled in these accounts to give parents more peace of mind across Meta's apps and to reduce teens' exposure to inappropriate content.


Meta told TechCrunch that teens will only receive messages from people they follow or have messaged before, and that only their friends will be able to see and reply to their stories, tags, mentions and comments.

Teens will also receive notifications prompting them to close the app after an hour of screen time, and the app will be placed in "quiet mode" at night.

Users under the age of 16 will need parental permission to change these settings.

These protections will roll out in the US, UK, Australia and Canada before expanding elsewhere.

Similar safety features were added to Instagram last year as watchdogs and lawmakers cracked down on social media companies' lack of protections for children amid concerns about a rise in mental health issues related to the apps.


In addition to the features recently added to Facebook and Messenger, Instagram's protections let parents see which accounts their teen has recently messaged, set daily time limits and block teens from using the app during certain periods.

In the latest update, released Tuesday, Meta added protections that block teens under the age of 16 from going "live" and from disabling filters that blur unwanted images suspected of containing nudity.

Meta says that since these built-in restrictions were first added last year, 94% of parents have found them "helpful."


Since these changes began, however, many online safety and parenting groups have argued that the safety upgrades are inadequate.

Last summer, US Surgeon General Vivek Murthy warned of potential mental health risks such as depression and anxiety, and called for tobacco-style "warning labels" on social media apps.


Last fall, a coalition of state attorneys general sued Meta, claiming the company relied on addictive features to hook children at the expense of their mental health and to boost profits.

Critics argue that no matter how many features Meta rolls out that focus on "children" or "teens," it does not change the fact that the company's core business model is built on driving and encouraging children and teens to become hooked on its products.

Another watchdog, the Tech Transparency Project, claimed that Meta has spent years announcing versions of the features detailed in this latest move without actually implementing them.

For example, Meta had announced plans to make teen accounts private by default and to limit interactions with strangers as early as 2021, according to a previous blog post.
