TikTok has published its Q3 2025 Community Guidelines Enforcement Report, highlighting major strides in proactive moderation, artificial intelligence-powered safety systems, and new digital well-being tools. The report, covering July to September 2025, provides fresh insight into how the platform is strengthening trust and safety across its global and local communities.
Over 580,000 Videos Removed in Kenya in Three Months
The latest report reveals that TikTok removed more than 580,000 videos in Kenya during the third quarter of 2025 for violating Community Guidelines.
Key highlights from the Kenya data include:
99.7 percent of violating videos were removed proactively before users reported them
94.6 percent were removed within 24 hours of being posted
Around 90,000 LIVE sessions were interrupted for breaching content rules
These interrupted sessions represented 1 percent of all LIVE streams during the quarter
The figures signal growing investment in automated detection and rapid enforcement across the region.
More Than 204 Million Videos Removed Globally
At the global level, TikTok removed 204,534,932 videos during Q3 2025, representing roughly 0.7 percent of all uploads on the platform.
Global enforcement statistics include:
99.3 percent of violating content removed before user reporting
94.8 percent removed within 24 hours
91 percent detected and removed using automated technologies
118 million fake accounts removed to protect platform integrity
22 million accounts removed for suspected underage use
These are among the highest proactive removal rates ever recorded by the platform.
AI and Human Moderation Working Together
TikTok says its enforcement approach combines advanced artificial intelligence systems with thousands of trust and safety professionals worldwide. This hybrid model aims to detect and remove harmful content quickly and consistently, including misinformation, hate speech, dangerous or harmful behaviour, and other policy violations. The company says continued investment in automation has been critical in scaling content moderation across its growing global community.
New Digital Well-being Features Launched
Beyond enforcement, TikTok also announced new Time and Well-being tools designed to help users build healthier digital habits.
New initiatives include:
A dedicated Time and Well-being space launched in November
Four new Well-being Missions, short tasks encouraging mindful technology use
A strong focus on helping teenagers use technology with confidence and purpose
These tools reflect broader industry efforts to address screen time and digital wellness concerns.
Transparency Remains a Key Priority
TikTok publishes its Community Guidelines Enforcement Report quarterly to provide transparency into how the platform manages safety, content moderation, and account integrity. The company says regular reporting helps users, regulators, and stakeholders understand the scale of enforcement actions and its ongoing commitment to maintaining a safer digital environment.