Researchers created fake accounts posing as 13-year-olds and found that the algorithm of TikTok, the Chinese-owned video app, promotes pornographic and other sexual content to users that young.
The app’s algorithms do not merely allow children to stumble upon explicit material; they actively guide them towards it. The findings come from a report by the non-profit organization Global Witness.
To reach their conclusions, the researchers set up TikTok accounts registered to 13-year-olds, starting fresh without any search history and using devices they considered clean. They even turned on Restricted Mode, a setting that TikTok says is intended to help parents safeguard their kids from inappropriate content.
With this protection feature activated, TikTok warned users: “You shouldn’t look at mature or complex themes, such as sexually suggestive content.”
The first tests occurred in early 2025, with follow-ups during the summer.
“As a 13-year-old user, I clicked on the search bar and looked at the suggestions that popped up,” a researcher explained.
They repeated this process multiple times, revisiting the search bar to see which terms were being suggested and what content those searches returned.
On all three test accounts, sexualized terms were frequently auto-filled in the TikTok search bar.
The suggestions served to the children’s accounts included phrases like “very rude and revealing attire” and “a woman kissing a man…”, which led users to sexually explicit videos.
In the worst cases, the suggested terms led to outright pornographic material, which the researchers found hidden behind seemingly innocent images.
“One user encountered porn content twice after logging into the app,” said Global Witness, describing a case in which a user reached explicit material by clicking on the search bar and then on one of the suggested searches.
Global Witness’s report includes a detailed list of the search suggestions shown to the children’s accounts.
The organization noted that other TikTok users have reported similar issues with sexual search suggestions, signaling that this problem may not be isolated.
“I found examples of users sharing screenshots showing inappropriate search recommendations,” said Global Witness, relaying anecdotes from users expressing confusion and concern.
Comments from users included remarks like, “I thought I was the only one experiencing this,” and “What’s wrong with this app?”
Global Witness ran another round of tests in January to gauge whether TikTok had made any changes since the previous experiment; the results were similar.
After their initial findings earlier in the year, Global Witness reported the results to TikTok, which stated: “We took action to review the reported content and removed some search recommendations globally.”
“Nevertheless, our later research showed that the issue hadn’t been fully resolved,” the organization remarked. “We gave TikTok an opportunity to comment on our findings; the company said it had taken action on over 90 pieces of content and removed certain suggested searches.”
TikTok informed Global Witness that it is currently evaluating a youth safety strategy.