DIGITAL PURDAH: AI CENSORSHIP, WOMEN’S EXPRESSION, AND THE CRISIS OF LEGAL PROTECTION IN THE AGE OF SURVEILLANCE CAPITALISM
DOI: https://doi.org/10.63075/ed2gbp28

Abstract
This research investigates the intersection of artificial intelligence (AI), gendered censorship, and legal vulnerability within the framework of surveillance capitalism. As Shoshana Zuboff (2019) asserts, “surveillance capitalism unilaterally claims human experience as free raw material for translation into behavioral data.” This insight underscores how algorithmic systems commodify women’s online behavior to reinforce structural silencing. Drawing on Zuboff’s theory of surveillance capitalism and critical legal theory, the study conceptualizes “Digital Purdah” as a modern mechanism for silencing and disciplining women’s digital expression. In societies with strong patriarchal traditions—such as Pakistan—AI-driven content moderation systems, often trained on biased data and developed without transparency, disproportionately suppress women’s speech, particularly when it challenges cultural or political norms. This article employs feminist jurisprudence, case study analysis, and empirical digital rights data to uncover how algorithmic governance reproduces patriarchal silencing under the guise of technological neutrality. The concept of Digital Purdah reframes algorithmic bias as a gendered and culturally specific tool of suppression rather than a generic flaw in data science. It positions AI censorship not merely as a technical problem but as an urgent legal and feminist issue requiring systemic redress. The study argues for a feminist re-imagination of legal protections and algorithmic transparency to dismantle the structural “purdah” imposed by AI on women in digital spheres. Ultimately, this research contributes to the broader discourse on digital human rights by centering marginalized gendered voices and advocating for the democratization of algorithmic accountability.