A startling seven-month analysis released on Thursday found that Instagram’s algorithms regularly serve sexually provocative videos featuring adult sex content creators to teen users as young as 13 years old.
Researchers from The Wall Street Journal and Northeastern University tested the Mark Zuckerberg-led app’s filters by creating accounts posing as fictitious 13-year-olds and scrolling through Instagram’s Reels video feed, which reportedly began serving up recommended content almost immediately.
Initially, the racy videos featured women dancing provocatively and exposing their breasts, the report said.
As the account watched these videos, skipping past others, the content became more explicit, culminating in less than 20 minutes in a video featuring an online sex worker promising to send viewers nude images, according to The Wall Street Journal.
In a series of tests conducted in June, Instagram began showing a “rush of videos about anal sex” to a fictional 13-year-old girl who had previously watched videos about women in her Reels feed, according to The Wall Street Journal.
In other cases, the recommendation algorithms showed videos of women caressing themselves, mimicking sexual acts, or even exposing their genitals to the camera, according to The Wall Street Journal.
The report said the risqué videos were sometimes shown alongside advertisements for major corporate brands.
Meta disputed the report’s findings, with spokesman Andy Stone saying it was “an artificial experiment that doesn’t match the reality of teenagers using Instagram.”
“As part of our long-standing work on youth issues, we have begun efforts to further reduce the amount of sensitive content teens may encounter on Instagram, and have made significant reductions in the past few months,” Stone added.
The Post has reached out for comment.
The analysis was conducted by The Wall Street Journal over a seven-month period from January to June, according to the article, and the results were replicated by Laura Edelson, a computer science professor at Northeastern University.
The test accounts didn’t follow any other accounts or like any posts. To test how quickly Instagram could ramp up illicit recommendations, participants scrolled through Reels and watched sexually charged videos while skipping others.
The WSJ reported that it ran similar tests with Snapchat and TikTok, and found that neither app recommended sexually explicit content to the test accounts under similar conditions.
Meanwhile, current and former Meta employees told the outlet that internal testing had uncovered issues dating back to 2021 that allowed inappropriate content to be served to underage users.
According to one internal report in 2022, Meta found that teenage users viewed three times as many nudity-themed posts as adults.
Meta has repeatedly said it takes steps to ensure teen users have an “age-appropriate experience” on its apps.
The Journal surfaced the explicit content even though Meta introduced stricter content controls in January intended to prevent teen users from being exposed to inappropriate material.
Under the new restrictions, users under the age of 16 are no longer supposed to be shown sexually explicit material in their feeds.
The damning report is yet another headache for Meta, which is currently facing a major federal lawsuit from dozens of states alleging that its apps are contributing to a mental health crisis among young people.
Meta is also being sued in a separate lawsuit from the state of New Mexico, which alleges the company failed to protect underage users from sexual predators operating on the app.
As reported by The Post, documents from the lawsuit revealed that executives from Walmart and Tinder’s parent company, Match Group, complained to Meta after learning its ads were appearing next to content that sexualized minors.
In January, Meta CEO Mark Zuckerberg offered a shocking apology to the families of victims of online child sexual abuse during a high-profile congressional hearing.
“No one should have to go through what your family has gone through,” Zuckerberg said at the time, “which is why we’re investing so much, and will continue to work across our industry, to make sure no one goes through what your family has gone through.”