Meta finally restricts DMs from unknown adults to underage users

On Thursday, Meta finally enacted restrictions to prevent unidentified adults from directly contacting underage users. Critics say the move is long overdue amid growing concerns about child safety on Facebook and Instagram.

The new rules come as Meta faces sweeping legal challenges from 33 states accusing the company of fueling a youth mental health crisis, as well as an explosive lawsuit from the state of New Mexico alleging that Meta exposes underage users to sexual predators.

The new default settings will block teenage Instagram users from receiving direct messages or being added to group chats from accounts they haven't already connected to or followed, Meta said in a blog post.

This change applies to all U.S. users under 16 and to users under 18 in some other countries.

Meta is introducing a similar feature for teens in its Messenger app that blocks them from receiving messages unless they are already connected to the sender as a Facebook friend or through a phone contact.

“We want to give teens a safe and age-appropriate experience with our apps,” Meta said in a blog post.

Meta now has more direct message restrictions for teens. Getty Images

In addition, parents will be able to approve or reject attempts to change the account safety settings of their children under 16 years of age. Previous versions of Instagram's parental monitoring feature only notified parents if their teen changed their safety settings.

Meta also said it is developing a feature to “protect teens from seeing unwanted and potentially inappropriate images in messages from people they're already connected to” and to discourage teens from sending such images themselves. Meta said it “may have more to share” about the feature later this year.

Meta executives, including Zuckerberg and Instagram head Adam Mosseri, have reportedly vetoed or watered down proposed features aimed at protecting teens, according to news reports and lawsuits that preceded the company's new safety push.

In one case, Zuckerberg reportedly vetoed an effort to ban filters that mimic the effects of plastic surgery, despite concerns that they promote body dysmorphia among teens.

The safety update is the latest of several changes made by Meta. Meta

Josh Golin, executive director of the child safety advocacy group Fairplay, said Meta's announcement was long overdue.

“Today's announcement shows that Meta can indeed change Instagram's design to make it safer by default,” Golin said. “But it shouldn't have taken more than a decade of harms on Instagram, whistleblower revelations, lawsuits, outraged parents, and Mark Zuckerberg being hauled before Congress to make this change.”

New Mexico Attorney General Raul Torrez said parents have “good reason to be skeptical” about whether Meta's latest policy changes are meaningful and whether the company will faithfully implement them.

“Evidence shows that Meta consistently does the bare minimum when it comes to child safety on Facebook and Instagram,” Torrez said in a statement. “While any progress is welcome, litigation should not have been necessary to ultimately prompt action to protect children on Meta's platform.”

Meta said it wanted its app to provide an “age-appropriate” experience. Getty Images

Critics of Mark Zuckerberg's social media platforms claim they use addictive features such as rampant notifications and “like” buttons to keep young users hooked, even as disturbing content on the platforms promotes negative outcomes such as anxiety, depression, body image issues, and self-harm.

The New Mexico lawsuit revealed that an unnamed Apple executive once complained to Meta that his 12-year-old child had been “solicited” on Instagram.

The complaint, which cited various company documents and communications, also detailed a 2021 internal presentation showing that “100,000 children per day” were subjected to online sexual harassment, including photos of adult genitalia.

The separate lawsuit filed by 33 state attorneys general also includes an extensive account of Meta's response to the growing child safety crisis.

Meta is facing multiple lawsuits over the safety of its platform. Reuters

The states argued that Meta, contrary to its own internal research, significantly downplayed the prevalence of self-harm content shown to teens on Instagram.

Earlier this month, Meta announced that it would place further restrictions on content settings for young users.

Those measures included restricting certain search terms to prevent teenagers from being exposed to sensitive topics such as eating disorders and suicide.

Zuckerberg, X CEO Linda Yaccarino, TikTok CEO Shou Chew and other Big Tech leaders are scheduled to testify before a Senate panel next Wednesday at a hearing on the “online child sexual exploitation crisis.”
