A new investigation claims TikTok recommends pornography and sexualised content to minors. Researchers created fake child accounts, activated safety settings, and still received explicit search suggestions. These led to clips showing simulated masturbation and explicit pornographic material. TikTok says it acted quickly after being alerted and insists it remains committed to safe, age-appropriate experiences for young users.
Fake accounts reveal hidden dangers
In July and August, Global Witness researchers set up four TikTok profiles. They posed as 13-year-olds using false birth dates. The platform did not request further identification. Investigators activated TikTok’s “restricted mode”. The company promotes this feature as a filter against sexual or mature material. Despite this, accounts received sexualised search suggestions in the “you may like” section. These led to videos of women flashing underwear, exposing breasts, and simulating masturbation. At its most extreme, pornography appeared hidden in seemingly ordinary clips to bypass moderation.
Campaign group issues warning
Ava Lee from Global Witness described the findings as a “huge shock”. She said TikTok not only fails to protect children but actively recommends harmful content. Global Witness usually studies how technology affects democracy, human rights, and climate change. The organisation first encountered TikTok’s explicit material during unrelated research in April.
TikTok defends safety measures
Researchers reported the issue earlier this year. TikTok said it removed the flagged content and implemented fixes. But when Global Witness repeated the test in late July, sexual videos appeared again. TikTok says it has more than 50 safety features for teenagers. It claims nine out of ten violating clips are deleted before anyone views them. After the latest report, the company said it upgraded search tools and removed additional harmful content.
New regulations increase responsibility
On 25 July, the Children’s Codes under the Online Safety Act came into force. Platforms must enforce strict age checks and prevent minors from accessing pornography. Algorithms must also block content linked to self-harm, suicide, or eating disorders. Global Witness conducted its second study after the rules took effect. Ava Lee urged regulators to step in, stressing that the new protections for children online must now be enforced.
Users express concern
During the investigation, researchers observed TikTok users’ reactions. Some questioned why sexualised search suggestions appeared. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”
