TikTok Algorithms Expose Minors to Adult Content, Report Finds
A recent report highlights that TikTok’s algorithms not only make it easy for children to stumble upon pornography and sexual content but actively direct them toward such material.
Researchers set up an account posing as a 13-year-old user. Even with safety settings enabled, the account was quickly shown recommendations for explicit search terms, according to Global Witness, a non-profit organization that investigates digital threats.
The suggested search terms for the account included phrases like “hardcore pawn clips” and “very rude and revealing outfits.” Following those suggestions surfaced content of women simulating masturbation and exposing underwear and breasts.
In some instances, the research found, the app led users to full pornographic videos. The explicit footage had been spliced into otherwise innocuous clips, apparently to evade TikTok’s moderation.
One test account encountered such content twice shortly after logging in, once through the search bar and once through recommended searches, which raised serious concerns about TikTok’s practices. Global Witness emphasized that the issue goes beyond simply having porn on the platform—it’s about how TikTok’s algorithm pushes minors toward it.
The organization promptly reported its findings to TikTok. In response, a TikTok spokesperson said the company took immediate steps to investigate and remove content that violated its policies. The spokesperson also pointed to ongoing improvements to the platform’s search suggestion feature, claiming TikTok removes the majority of violating videos before users ever see them.
This report comes shortly after President Trump signed an order allowing TikTok’s U.S. operations to be overseen by a group of American investors.
Henry Peck, who leads the digital threats campaign at Global Witness, said the findings came as a surprise: the organization does not typically focus on children’s digital safety, and he stumbled upon the explicit content while conducting unrelated research in April.
Despite TikTok’s assurances of improved measures, Global Witness found explicit content recommendations again during follow-up experiments in July and August. Similar inappropriate search suggestions appeared on accounts created in the UK, even on clean devices with no search history.
Teenagers are heavy users of video apps like TikTok; by some estimates, around 60% of users engage with the platform daily. Many TikTok users have also voiced concerns about explicit search suggestions, posting screenshots to share their discomfort.
Comments on these posts include thoughts like, “I thought I was the only one,” and inquiries on how to resolve such issues, reflecting a growing unease among users about the platform’s content curation.
“While TikTok claims to have safety measures in place for children, our investigation found that minors are indeed being served pornography,” Peck stated. “It’s clear now that regulatory action is necessary.”