TikTok deleted over 590,000 videos from Kenya in the three months to June for breaches of its community guidelines, such as the glorification of sexual violence and hate speech.
In its Quarter 2 2025 Community Guidelines Enforcement Report (CGER), the Chinese company said it took down a total of 592,037 videos in the quarter, 92.9 percent of them before they were even viewed.
“Notably, 92.9 percent of these were removed before they were viewed and 96.3 percent removed within 24 hours of being posted,” TikTok said in a statement.
TikTok’s community guidelines prohibit any content that promotes violence, crime, hate speech, harassment, or abuse.
Users are barred from sharing material that “promotes violent acts,” threatens others, or supports hate groups, extremists, or criminal organisations.
The platform also bans content that involves sexual exploitation, human trafficking, or the abuse of children and adults. Harassment, bullying, and doxing are not allowed, and while political commentary is permitted, TikTok removes posts that “cross into severe harm.”
Content depicting suicide, self-harm, dangerous challenges, or disordered eating is also restricted to protect users’ mental health. The video-sharing platform also forbids explicit sexual material, graphic violence, and animal cruelty. TikTok removes misinformation, especially about elections, public health, and civic processes, and requires clear labelling of AI-generated or heavily edited media.
Similarly, users are not allowed to share plagiarised or unoriginal content, manipulate engagement through fake activity, or promote fraudulent schemes.
While the company did not provide further information on which of the guidelines Kenyan users most violated, the platform has previously been criticised for the proliferation of explicit content on its Live feature.
In 2023, President William Ruto engaged TikTok CEO Shou Zi Chew over the need for stronger moderation amid a petition in Parliament seeking the app’s ban in Kenya.
The recently removed videos are part of the 189 million taken down worldwide during the same quarter, which TikTok said represents 0.7 percent of all content uploaded.
“In the second quarter of 2025, we took action, including warnings and demonetisation, on 2,321,813 Live sessions and 1,040,356 Live creators for violating our Live monetisation guidelines,” said the company.
Some 76,991,660 fake accounts were also taken down, along with an additional 25,904,708 accounts suspected to belong to users under the age of 13, the minimum age to use the platform.