Social Sorting as a Tool for Surveillance

The female body is constantly under surveillance, in private spaces as well as in public. Surveillance is about power: it is not just a violation of privacy but also an issue of social sorting.

Female bodies are constantly under surveillance in the various spaces that they inhabit, from the “privacy” of the home to public spaces. The male gaze is constantly categorizing and policing the female form to ensure its compliance with larger patriarchal norms. Surveillance, and the datafication of the everyday, is not simply a violation of privacy but, as David Lyon posits, a form of social sorting.[1] The process of social sorting has the effect of creating and intensifying differences:

“For surveillance today sorts people into categories, assigning worth or risk, in ways that have real effects on their life-chances. Deep discrimination occurs, thus making surveillance not merely a matter of personal privacy but of social justice.”[2]

With the rise of surveillance capitalism,[3] the everyday, the previously mundane, has been monetized. In other words, the self has become datafied in unprecedented ways and thus the subject of social sorting. Building on the work of Gandy[4] and Lyon, it becomes important to see surveillance beyond the libertarian criticism found in mainstream discourse, which is couched in individual privacy harms, and to situate it instead within feminist discourse that takes into account the differentiated impact of surveillance and social sorting on bodies, with gender, class and race serving as categories of discipline and discrimination. Feminist scholarship on surveillance has introduced the concept of intersectionality to underscore that surveillance is not experienced uniformly and is often directed disproportionately towards deviant and othered bodies. Simone Browne, in her seminal work Dark Matters, invites us to unpack the very concept of surveillance, not as “a form of watching that is neutral”, but as a practice in which “the inequities between those who were watched over and those who did the watching are revealed”.[5] Furthermore, feminists have likened surveillance to the “male gaze”, which has the effect of dehumanizing and policing the object of its gaze, a useful concept that accounts for the control implicit in systems of surveillance.

Social sorting presupposes difference

Social sorting presupposes difference; in that sense it reifies pre-existing differences in society and preserves the status quo. The recent example of Amazon’s recruiting algorithm reproducing gendered hiring patterns involved training on a pre-existing dataset, i.e. a workplace where women were already underrepresented, and creating “favourable” candidate profiles on that flawed basis. Gender thus became a marker for employability, and the social sorting reflected a status quo that was itself the result of systemic sexism and patriarchal barriers in STEM fields.

Surveillance is about power

Surveillance is about power: who has the power to collect information, create categories and sort social realities. Law enforcement bodies engage in surveillance and sorting in the form of ethnic and racial profiling. The power to declare particular areas insecure, and thus in need of surveillance, is often crucial. The Pashtun Tahafuz Movement (PTM), a grassroots civil rights movement emerging from Pakistan’s Federally Administered Tribal Areas (FATA), has included among its major demands the restructuring of “security checkpoints”, identifying them as sites of the everyday exercise of arbitrary and discriminatory power by the state. Pashtun bodies and identities are routinely subjected to humiliating and intrusive stop-and-search practices on the basis of ethnic and racial profiling. Worryingly, these everyday experiences of surveillance have been reified with the introduction of “safe cities” projects across the country, which equip urban spaces with CCTV cameras to collect data and reproduce patterns of discrimination.[6] Bodies that resist categorization along gender and class lines are actively marked out as deviant and dangerous through these urban surveillance technologies. Homeless people and prostitutes have been forced to avoid surveillance points under the safe cities project or to confine themselves to camera blind spots - security defined at the expense of “deviant” bodies.

Transgender bodies resist the rigid binaries imposed by the male gaze, leading to heightened surveillance. In Pakistan, the khwajasara community (roughly mapping onto the Western understanding of transgender) has come under the gaze of the state in recent years.[7] The turning point came with a series of rulings by the Supreme Court of Pakistan in 2009, when it held that individuals identifying as khwajasara would be granted computerised national identity cards (CNICs) through the addition of a third column in the gender category. This move to grant the community more rights also brought it within the surveillance framework of the state in a more formalized manner. The datafication of transgender bodies resulted in their identities being reified as “other” - the third column. While a CNIC bestows several benefits on citizens (the ability to vote, apply for a passport, access government jobs, obtain a mobile SIM and receive governmental financial support), it also brings citizens into the categories and databases constructed not only to monitor their activities but also to constrain their identities, stripping away their existing fluidity. In several jurisdictions, recognition of transgender identities has not necessarily translated into an accurate reflection of their lived experience in official records. The system still treats gender identities as fixed: in Pakistan, when a transgender person seeks to change the gender recorded on their official documents, they are met with requirements that assume connection to a traditional family structure, such as the requirement to bring a male family member, a condition most transgender individuals cannot meet because their gender identity often severs them from their families.

Social sorting has the effect of confining gender to pre-existing categories and binaries, while at the same time casting aside the uncategorized and uncategorizable as “deviant” and “other”. These processes of control through othering are not merely technological; rather, they are grounded in histories of violent oppression. Bernard S. Cohn details in The Census, Social Structure and Objectification in South Asia how the colonial census was central to the “process of classifying and making objective to Indians themselves their culture and society”.[8] This sorting was accompanied by the creation of “castes” and “tribes” in colonial India, classified either as martial races or as criminal tribes. The Criminal Tribes Act of 1871 not only transformed different groups into criminalized groups, but also provided for their registration, surveillance and control: registration at the level of the district magistrate, and a pass system that required members of the designated criminal castes and tribes to obtain a pass in order to leave their village.[9] Thus, the surveillance of individuals through technology is not wholly new; the monitoring and surveillance of “deviants” through the systematic collection of “objective” data has long been a powerful tool for exerting control.

In conclusion, surveillance marks the lived experience of women both inside and outside the home. In the South Asian context, Shilpa Phadke, Sameera Khan and Shilpa Ranade describe in Why Loiter? the surveillance that female bodies experience in urban spaces.[10] Female and queer bodies are subjected to surveillance in public spaces, a visibility that forces them to constantly justify their presence and precludes their access to these spaces as sites of pleasure. Even within the home, women are under surveillance at the behest of the family, which polices their behaviour and exercises control over their bodies to ensure the performance of gender norms. Technology and the datafication of the self have opened up new sites where surveillance is reproduced through the monitoring of personal devices and online accounts, both at an individual level by the family and through more systemic processes by the state and private companies.


[1] David Lyon (ed.), “Surveillance as Social Sorting: Privacy, Risk, and Digital Discrimination”, Routledge, 2003.

[2] Lyon, p. 1.

[3] Shoshana Zuboff, “Big other: surveillance capitalism and the prospects of an information civilization”, Journal of Information Technology, Vol. 30, No. 1, 2015, pp. 75-89.

[4] Oscar Gandy, “The Panoptic Sort: A Political Economy of Personal Information”, Critical Studies in Communication and in the Cultural Industries, Westview Press, 1993.

[5] Simone Browne, “Dark Matters: On the Surveillance of Blackness”, Duke University Press, 2015, pp. 18, 21.

[6] “Punjab Government’s Safe Cities Project: Safer City or Over Policing?”, Digital Rights Foundation, 2018, http://drive.google.com/file/d/1Uc3blQzQB-L7tbFt2IBewCIK3Fc535H2/view?t….

[7] This is not to render invisible the decades of surveillance, policing and harassment by state actors of the community’s informal spaces and activities.

[8] Bernard S. Cohn, “The Census, Social Structure and Objectification in South Asia”, in An Anthropologist among the Historians and Other Essays, Oxford University Press, 1987, p. 250.

[9] Rachel J. Tolen, “Colonizing and Transforming the Criminal Tribesman: The Salvation Army in British India”, American Ethnologist, Vol. 18, No. 1, 1991, p. 107.

[10] Shilpa Phadke, Sameera Khan and Shilpa Ranade, “Why Loiter?: Women and Risk on Mumbai Streets”, Penguin Books, 2011.