Instagram will test a feature that blurs messages containing nudity to protect teens and deter potential scammers, parent company Meta announced Thursday, as it seeks to allay concerns about harmful content on the app.
The tech giant is under increasing pressure in the United States and Europe over allegations that its apps are addictive and fuel mental health problems among young people.
Meta said Instagram’s direct-message protection feature uses on-device machine learning to analyze whether images sent through the service contain nudity.
The feature will be turned on by default for users under 18, and Meta will show adults a notification encouraging them to enable it.
“Nudity protection also works in end-to-end encrypted chats, because images are analyzed on the device itself. Meta cannot access these images unless someone chooses to report them to us,” the company said.
Unlike Meta’s Messenger and WhatsApp apps, direct messages on Instagram are not encrypted, but the company said it plans to roll out encryption for the service.
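The gating logic described above can be illustrated with a minimal sketch. Note that this is entirely hypothetical: the function names, the confidence threshold, and the age/opt-in rule details are assumptions for illustration, not Meta's actual implementation. The key point the sketch captures is that the nudity score is produced on the device, so the check works even in end-to-end encrypted chats.

```python
NUDITY_THRESHOLD = 0.8  # assumed confidence cutoff; Meta's actual value is not public


def protection_enabled(user_age: int, opted_in: bool) -> bool:
    """On by default for users under 18; adults must opt in (per the announcement)."""
    return user_age < 18 or opted_in


def should_blur(nudity_score: float, user_age: int, opted_in: bool) -> bool:
    # nudity_score would come from an on-device ML model, so the image never
    # leaves the phone -- which is why this can run inside encrypted chats.
    return protection_enabled(user_age, opted_in) and nudity_score >= NUDITY_THRESHOLD


# Example: a high-confidence detection is blurred for a teen by default,
# but only for an adult who has opted in.
print(should_blur(0.95, user_age=16, opted_in=False))  # teen, default-on
print(should_blur(0.95, user_age=30, opted_in=False))  # adult, not opted in
```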

Meta said it is also developing technology to help identify accounts that may be involved in sextortion scams, and is testing new pop-up messages for users who may have interacted with such accounts.
The social media giant announced in January that it would hide more content from teens on Facebook and Instagram, making them less likely to encounter sensitive topics such as suicide, self-harm and eating disorders.
Attorneys general from 33 US states, including California and New York, sued the company in October, accusing it of repeatedly misleading the public about the dangers of its platforms.
In Europe, the European Commission has requested information on how Meta protects children from illegal and harmful content.
