The short-form video platform TikTok has reported removing 25,448,992 videos in Pakistan during the second quarter of 2025, citing violations of its Community Guidelines.
The figures for the April–June 2025 period show a very high rate of proactive enforcement: 99.7 % of the removed videos in Pakistan were flagged and deleted by TikTok’s own systems before any user report was filed, and 96.2 % of those videos were taken down within 24 hours of being posted.
This local figure sits within a much larger global moderation operation. Worldwide in Q2 2025, TikTok removed 189,578,228 videos, representing approximately 0.7 % of all uploads on the platform. Of those, 163,962,241 were taken down using automated detection technologies, and 7,457,309 removed videos were later reinstated following human review.
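Those two headline figures also give a rough sense of overall upload volume: if roughly 190 million removals amount to about 0.7 % of uploads, that works out to on the order of 27 billion videos uploaded to TikTok during the quarter (189,578,228 ÷ 0.007 ≈ 27 billion). This is a back-of-the-envelope estimate derived from the figures above, not a number reported directly.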
In addition to video removals, TikTok also acted on accounts. The platform deleted 76,991,660 fake accounts and 25,904,708 accounts suspected of belonging to users under the age of 13.
Breakdown of Global Violations
Globally, the types of violations prompting removal were varied:
- 30.6 % of removed videos contained sensitive or mature themes.
- 14.0 % violated the platform’s safety and civility standards.
- 6.1 % breached privacy and security guidelines.
- 45.0 % were flagged for misinformation.
- 23.8 % contained edited or AI-generated media.
These numbers indicate that the moderation challenge is multifaceted: mature content, safety issues, privacy violations, misinformation and deepfakes or edited media all contribute significantly. The percentages sum to more than 100 %, which suggests the categories overlap, with a single video able to fall under more than one policy.
Why Pakistan’s Numbers Matter
Pakistan represents a key market for TikTok, both in terms of user base and regulatory scrutiny. The high rate of proactive removal (99.7 %) and the rapid pace of takedowns (96.2 % within a day) suggest that TikTok has stepped up moderation efforts significantly in Pakistan.
Moreover, Pakistan has a track record of imposing bans on the platform over complaints of “immoral, obscene and vulgar” content, with restrictions first imposed in October 2020. These enforcement figures may be part of TikTok’s effort to maintain access and regulatory compliance in the country.
Implications for Users, Creators and Regulators
For users, the high removal numbers mean that content deemed to breach guidelines is unlikely to persist, which could improve the overall experience on the platform in Pakistan. For creators, the message is clear: compliance with TikTok’s Community Guidelines is essential, and violating videos may be removed quickly, often before any user reports them. For regulators, the transparency in publishing enforcement data can help frame discussions around platform responsibility, youth safety and content governance.
What This Means Moving Forward
TikTok’s regular “Community Guidelines Enforcement Reports” aim to deliver transparency and accountability around content moderation. By continuing to publish detailed data, the platform signals a willingness to be held accountable for how it manages content risks at scale.
On the Pakistan front, the large volume of takedowns suggests that TikTok may be intensifying its moderation infrastructure locally — whether through automated systems, regional moderation teams or updated policies attuned to regional sensitivities.
However, it also raises questions: what types of content are being flagged in Pakistan disproportionately compared to other markets? Is the threshold for removal similar globally or adjusted regionally? And how are creators being educated or supported to stay within the rules?
In Q2 2025, TikTok removed over 25 million videos in Pakistan alone for guideline violations, with the vast majority identified proactively and removed within 24 hours. Globally, nearly 190 million videos were purged, alongside tens of millions of fake and suspected underage accounts. These numbers reflect the scale of the moderation challenges facing major social platforms today and underscore how seriously TikTok is treating content governance, especially in regions with heightened regulatory expectations.

