FTC Calls Instagram a Safe Haven for Groomers at Meta Antitrust Trial

Internal Instagram documents, presented by the FTC in its antitrust case against Meta, show that the platform’s automated recommendation algorithm suggested connections between children and potential predators. The 2019 report, titled “Inappropriate Interactions with Children on Instagram,” describes how the recommendation system helped groomers find minors.

According to the FTC, 27% of the follow recommendations Instagram served to accounts exhibiting predatory behavior toward minors were for children’s accounts. Over a three-month period, roughly 2 million minor accounts were recommended to these groomers. Minors also made up about 7% of all recommendations shown to adult users on the platform.

In addition, the FTC disclosed findings from an analysis of 3.7 million user reports of inappropriate comments, a third of which came from minors. Notably, 54% of the reports filed by minors flagged comments left by adult accounts.

This evidence supports the FTC’s claim that Meta’s acquisition of Instagram was anticompetitive and harmful to consumers. Emails and testimony from Instagram co-founder Kevin Systrom suggest that CEO Mark Zuckerberg deliberately limited resources for safety work out of concern that Instagram’s growth would come at Facebook’s expense.

Meta executives, including Chief Information Security Officer Guy Rosen, have acknowledged that Instagram lagged behind Facebook in addressing problems like child exploitation until mid-2018. Internal documents, however, indicate that Instagram’s safety teams remained under-resourced and struggled to tackle serious threats such as harassment, violence, and child exploitation.

A Meta spokesperson, responding to questions about Instagram’s protections for young users, said that teen accounts come with built-in safeguards limiting who can contact them: they are private by default, so only approved followers can view their content, and the strictest messaging settings are applied. The spokesperson also noted that teen accounts are not recommended to adults they are not connected to.

The spokesperson emphasized that the company has long prioritized child safety. Since 2018, Meta has worked to restrict recommendations involving potentially suspicious adults and has removed large numbers of accounts that violate its guidelines. It has also expanded its cooperation with the National Center for Missing and Exploited Children to address grooming cases that previously went unreported.

For further details, please refer to Bloomberg.
