March 22, 2026

Content Moderation Trends: TikTok Deletes 590,000+ Videos in Kenya

In its latest transparency report, TikTok says it removed more than half a million videos in Kenya between April and June 2025 for violating its community guidelines.

According to the Q2 Community Guidelines Enforcement Report, 592,037 videos were taken down over the three-month period. Of these, 92.9% were removed before receiving any views, and 96.3% were removed within 24 hours of being uploaded.

The figures represent a sharp increase over previous quarters, with roughly 450,000 videos removed in early 2025 and 360,000 over the same period in 2024. TikTok attributed the growth to enhanced global content moderation aimed at creating a safer online environment.

The company added that the move is part of TikTok's broader global moderation effort.

The platform has faced mounting pressure in Kenya since a BBC investigation in March 2025 found that underage children were being exposed to sexualised livestreams on the platform. The report raised widespread alarm among parents, educators and regulators over children's safety online.

Following the exposé, the Communications Authority of Kenya (CA) directed TikTok to delete the explicit livestreams, re-evaluate its content moderation mechanisms, and step up its digital literacy efforts to help minors avoid inappropriate content.

The increased number of takedowns appears to be, in part, TikTok's response to those regulatory measures.

The company also reported taking down more than 189 million videos globally over the same period, roughly 0.7% of all content uploaded. Of these, 99.1% were identified proactively and 94.4% were removed within a day. TikTok further stated that 163.9 million videos were detected and removed automatically by its moderation technology.

Beyond video removals, TikTok also acted against fake and underage accounts. It reported removing 76.9 million fake profiles, along with 25.9 million accounts it believed were run by underage users.

The report said that by combining advanced automated technologies with thousands of trust and safety professionals, TikTok is able to remove harmful content quickly and consistently.

The company also disclosed enforcement under its live monetisation guidelines, which govern paid livestreams. Between April and June, TikTok warned or demonetised 2.3 million live sessions and more than one million creators. It said these actions were intended to notify users whose content may have breached its monetisation policies.

TikTok also encouraged users to play an active role in keeping the platform safe by reporting illegal content, comments, or accounts.

The company stated that by working with its community, TikTok is building a safe online space where creators and positivity can flourish.

TikTok's latest report signals intensified efforts to meet the demands of Kenyan regulators and regain public trust after months of negative publicity.
