A report by a campaign group claims TikTok promotes pornography and sexualised videos to minors. Researchers created fake child accounts, enabled the app's safety settings, and were still shown explicit search prompts. These led to clips of simulated masturbation and, in the worst cases, explicit sex acts. TikTok says it acted swiftly after being alerted and stresses its focus on keeping children safe.
Fake child accounts reveal explicit material
In July and August, Global Witness researchers set up four TikTok profiles, posing as 13-year-olds by entering false birth dates; the app requested no further identity checks. Investigators then enabled the platform's "restricted mode", which TikTok advertises as a filter for sexual or mature themes. Despite that, the accounts received sexualised search suggestions under "you may like", linking to videos of women flashing underwear, exposing breasts and simulating masturbation. At the most extreme, investigators found pornography hidden inside innocent-looking clips to evade moderation.
Global Witness issues sharp warning
Ava Lee from Global Witness called the results a "huge shock". She said TikTok not only fails to protect children but actively promotes dangerous material to them. Global Witness usually examines how technology affects democracy, human rights and climate change; the group first stumbled on explicit material on TikTok while researching a different subject in April.
TikTok defends its measures
After Global Witness first reported its findings earlier this year, TikTok said it had removed the material and fixed the problem. Yet when the group repeated its test in late July, sexual content appeared again. TikTok insists it offers more than 50 protective tools for teenagers and claims nine out of ten violating clips are deleted before anyone views them. After the latest warnings, the company announced improvements to its search system and further removals of harmful content.
Children’s Codes increase legal pressure
On 25 July, the Children's Codes under the Online Safety Act took effect. Platforms must now use robust age checks and prevent minors from seeing pornography, and their algorithms must block harmful material linked to self-harm, eating disorders or suicide. Global Witness repeated its research after the new rules came into force. Ava Lee urged regulators to act, stressing that children's online safety must become a priority.
Users question the platform
During the study, researchers also monitored user reactions and found some expressing confusion about the sexualised recommendations. One asked: "can someone explain to me what is up with my search recs pls?" Another commented: "what's wrong with this app?"
