By Advocate Mahnoor Nawaz
Introduction
Artificial intelligence (AI) has become the backbone of digital change in modern society, promoting innovation in government, healthcare, commerce, and education. It has significantly changed the way personal data is collected, processed, and applied across various fields. Because many AI systems depend on large-scale processing of personal data, including location data, fingerprints, communications metadata, and behavioral profiles, these technological advancements raise substantial concerns about data protection and personal privacy. Pakistan must balance the benefits of AI-driven policing with explicit legal measures that protect individual privacy, prevent abuse, and preserve democratic freedoms. The following article sets out the present legal position and the operational challenges Pakistan faces.
The current legal landscape in Pakistan
Pakistan currently lacks a robust and comprehensive regulatory legal framework and public accountability mechanisms, raising concerns about civil liberties and democratic oversight. Pakistan's regulatory system has not kept pace with rapidly advancing AI technology: innovation is evolving quickly, while regulation remains incremental. There are substantial policy gaps in data privacy, ethical use of AI, transparency, and accountability, which leave room for exploitation and may result in the misuse of personal data, algorithmic bias, surveillance overreach, and other harms to individual rights and public trust. Pakistan relies on the Prevention of Electronic Crimes Act 2016 (PECA), whose provisions are largely focused on investigating and prosecuting cybercrimes rather than establishing a robust, rights-based framework for personal data protection. Moreover, the Ministry of Information Technology and Telecommunications has published a draft Personal Data Protection Bill 2023 ("PDPB"), which, if enacted, would establish a statutory framework for consent, data-controller requirements, breach reporting, cross-border transfers, and a data protection authority. However, the Bill remains at the draft stage and has yet to be passed. Comparing the two, PECA focuses on penalising cybercrimes, whereas the PDPB focuses on regulating data use and preserving privacy.
PECA 2016 and recent social-media control
Social media has democratised information distribution in Pakistan, allowing individuals to freely exchange news and ideas, and it has emerged as an important tool for social campaigns and grassroots movements. Yet Pakistan's social media growth has not been without drawbacks. PECA 2016 authorises the Pakistan Telecommunication Authority (PTA) to block or remove online content that is blasphemous, anti-state, hateful, or obscene. It also criminalises a variety of social media offences, including online defamation, character assassination, cyber harassment, and identity theft. However, PECA and related digital legislation have received widespread condemnation from legal scholars, human rights organisations, journalists, and technology specialists, and have been deemed among the worst and most controversial pieces of legislation with respect to freedom of expression. PECA 2016 is said to be punitive rather than protective.
AI policy and AI-specific legislation
Pakistan’s Federal Cabinet adopted a landmark policy, the Artificial Intelligence Policy 2025, a strategic framework aimed at fostering the responsible development and use of AI technology in the country. To ensure Pakistan’s global competitiveness in this rapidly emerging industry, the policy emphasizes stimulating innovation, supporting research and development, and developing AI capabilities among the workforce. Despite its promise of transformative growth and innovation, the policy faces deep criticism. Serious concerns have been raised that the approval process for the AI Policy 2025 failed to uphold the values of meaningful public participation, transparency, and inclusivity. Unlike UNESCO’s 2021 Recommendation on the Ethics of Artificial Intelligence, the Council of Europe’s (CoE) 2024 AI Framework Convention, and other global frameworks, Pakistan’s AI Policy 2025 does not place human rights at its core.
Surveillance, privacy, and AI risks in Pakistan
According to recent investigations and human rights studies, Pakistan already has technologies in place that enable the government to closely monitor internet activity, phone conversations, and text messages. Although these systems are legally permitted (referred to as “lawful intercept”), they allow authorities to monitor the online and telephone activities of large numbers of individuals, which raises concerns over privacy and individual liberties. International reports have flagged allegedly unlawful surveillance in Pakistan, including claims that a Chinese-built internet firewall is in use. Such surveillance increases the likelihood of widespread profiling, monitoring, and targeted suppression of dissent. There is a clear need for a stronger legal and regulatory framework that protects human rights and privacy, rather than enabling practices that breach them. To ensure that AI and digital monitoring are used without violating fundamental rights, while still allowing for innovation and legitimate government functions, effective safeguards, transparency, and independent oversight are essential.
Comparison of Pakistan and the United States in terms of AI, data protection, and surveillance frameworks
Pakistan lacks a fully implemented personal data protection law, relying on PECA 2016, which focuses on cybercrime; draft laws and AI policies exist, but implementation and oversight remain limited. The United States, unlike the EU with its GDPR, has no comprehensive federal data protection law either, relying instead on sectoral regulations and state laws such as the CCPA/CPRA, with an emphasis on consumer consent and transparency.
Comparison of Pakistan and the United Kingdom in terms of AI, data protection, and surveillance frameworks
In terms of privacy and AI use, the UK has a strong framework anchored in the UK GDPR and the Data Protection Act 2018, which provide explicit regulations, more stringent enforcement, and independent supervision. Furthermore, the UK’s Computer Misuse Act 1990 strengthens legal clarity and enforcement by providing precise and comprehensive definitions of unauthorised access and other cybercrimes. The UK’s Online Safety Act 2023 and Data Protection Act 2018 address AI-related harms, whereas the vague definitions in Pakistan’s PECA leave room for misuse. Finally, Pakistan lacks a systematic framework for regulating AI, while the UK’s National AI Strategy provides a defined structure to encourage responsible AI use and prevent misuse.
A balanced regulatory approach for Pakistan
To balance innovation and privacy, Pakistan should enact a strong, independent Personal Data Protection Act that ensures both public and private actors respect privacy and human rights. It should embed AI-specific obligations within the PDPA framework and provide for algorithmic transparency and avenues for redress. Pakistan should also adopt provisions that guard against mass surveillance and function creep, and should impose strict penalties for unauthorised access, misuse, or abuse of personal data, ensuring that both public and private actors are held accountable for violations.
Conclusion
Pakistan’s technological development is approaching a critical turning point. The country’s expanding digital economy and growing interest in artificial intelligence present vast opportunities for innovation and progress. However, these opportunities must be accompanied by robust legal frameworks and ethical standards to ensure that technological advancement aligns with societal values and safeguards individuals’ personal data. This can be achieved by promoting collaboration between policymakers, technologists, and civil society, which in turn will enhance Pakistan’s credibility in the global digital economy.