- Convenors:
- Luke Stark (Western University)
- Melissa Adler (University of Western Ontario)
- David Nemer (University of Virginia)
- Format:
- Traditional Open Panel
- Location:
- NU-4B43
- Sessions:
- Friday 19 July, -
Time zone: Europe/Amsterdam
Short Abstract:
Numerical techniques and technologies used to inscribe and circulate differences—racial, gendered, class, caste-based, and others—are everywhere in disciplinary and control societies. This panel interrogates the history of these quantitative mechanisms of power, and the ways they have been resisted.
Long Abstract:
Numerous thinkers (including Fanon, Foucault, Deleuze, Crenshaw, Hartman, Freire, and others) have theorized the ways in which the political arithmetic of the capitalist state and its agents has prompted the development of numerical techniques and technologies to inscribe and circulate differences—racial, gendered, class, caste-based, and others—about populations intended for exploitation and dispossession. In this panel, we aim to further interrogate the history of these quantitative, statistical, and inferential mechanisms of power.
From the Enlightenment calculations of early American enslavers to the contemporary deep learning transformations of artificial intelligence (AI) practitioners, and even the human infrastructures that transmit disinformation through messaging apps, numbers have never been neutral, nor free from narrative. This panel will focus on the continuities in thinking and practice around the use of numbers as a tool of oppression across both space and time, and the often-nuanced strategies of resistance and cooption that progressive social movements have deployed in response.
Some of the questions we hope to see addressed by the panel include the following:
• How have historical discourses and ideologies of inequality, oppression, and supremacy prompted, shaped, and refined quantitative or technical metrics and mechanisms of scientific or social differentiation?
• How have the developers of these metrics and techniques drawn on transnational or global circuits of oppression?
• What are examples of such technical metrics and mechanisms being developed in an overtly oppressive context and then laundered into more general scientific practice? How have professional fields such as medicine, statistics, and psychology grappled with or ignored these historical genealogies?
• How have people and communities targeted by such numerical mechanisms resisted, responded, refused, and coopted these technologies? What have been the results of such encounters, and what lessons can contemporary groups dedicated to opposing unfreedom take from these historical examples?
Accepted papers:
Session 1 Friday 19 July, 2024, -
Paper short abstract:
The paper reconsiders discipline, governmentality, and control through the postcolonial context of Pakistan. In view of the East India Company's crime statistics and the census of British India, the failures of data statistics in Pakistan challenge the Western genealogy of power, revealing a mutated amalgam.
Paper long abstract:
This paper considers the modalities of disciplinary societies, governmentality, and control societies—broadly outlined in the work of Michel Foucault and Gilles Deleuze—in their mutated postcolonial forms following refraction in the colonial encounter. Taking the case of colonial India and postcolonial Pakistan, the paper will genealogically chart the quantification techniques informing these modalities of power from the late eighteenth century to the early twenty-first century. The East India Company deployed techniques of quantification to inform its administration through the enumeration and categorization of crimes and offences, corresponding to the disciplinary power of penalization, which came to be tabulated to expand colonial rule over native populations. During the late nineteenth century, the British Raj deployed a different numbering of the Indian population in the technology known as the census, corresponding to the statistical techniques of governmentality that Foucault argues enfolded disciplinary power. Since the independence of Pakistan in 1947, the postcolonial state has repeatedly failed to adequately deploy the census as an effective technology of government. In the twenty-first century, computational technologies have brought about a new era of datafication of the population—roughly corresponding to Deleuze's control societies—in Pakistan's National Database and Registration Authority and draconian cybercrime laws. In terms of this genealogy, the paper will argue that the neat partitioning of these modalities of discipline, governmentality, and control does not line up in the colonial trajectory, given the failures of quantification. The postcolonial state presents a mutated amalgamation of these modalities, requiring enumerative technologies to be scrutinized in their present forms.
Paper short abstract:
This study explores the genealogy of the use of generative AI as an English-language assistance tool by global scholars. Generative AI in this context is a double-edged sword: the diversity of authors who publish in academia may increase, but only by conforming to expected language norms.
Paper long abstract:
Peer review is a technique to quantify differences between what is and is not publishable in the sciences. One increasingly quantified metric is English. Over 90% of indexed journals in the natural sciences, for example, are published in English, and writing which is not assessed as appropriately English will be rejected. As such, generative AI has been celebrated as a boon for inclusion, allowing scientists to instantly "fix" their writing prior to peer review. At the same time, by performing an automated, imperceptibly quantified version of English, these tools reinforce the global hegemony of English in the sciences by masking the language diversity of academic community members and limiting the exposure of readers to writing in different languages and language registers.
This study explores the genealogy of English metrics in the sciences and the use of generative AI by global scholars to resist those metrics. First, we analyze historical peer reviews and rubrics from major scientific publications to explore how reviewers police, negotiate, and uphold English as a publishing requirement. We compare peer review evaluations which critique English use before and after the introduction of ChatGPT in 2023, exploring the implications of widespread use on adjudication practices. Next, we interview scholars to probe motivations and perceptions of the use and users of automated writing tools. The goal of this project is to use this moment of change in historical publishing norms as a lens to de-legitimate, un-hide, and re-question the global quantification infrastructures which determine who participates in global academia.
Paper short abstract:
The paper shows how proponents of 'Big Data' imagine the future of the state and politics as based on 'data-driven,' 'intelligent' technologies. I critically engage their imaginary of the 'normal' citizen subject, usually based on historical categories and often highly problematic assumptions.
Paper long abstract:
The paper provides a genealogy of the present by analysing how proponents of 'Big Data' from within Big Tech, such as Alex Pentland (2009; 2012; 2014), Eric Schmidt (2013; 2021), Peter Norvig (2013), and others imagined politics and the state as based on 'intelligent' and other forms of data-driven technologies from 1999 to 2016. I focus on their conceptions of the normal citizen subject, the assumed needs of this subject, as well as notions of objectivity and neutrality.
These conceptions of normalcy often stand in tension with the promise that future technologies will provide highly individualised welfare and care solutions based on data collected about an individual (Pentland 2012; Schmidt 2013). Analysing the idea of the normal citizen and the needs of subjects in this literature shows, I argue, for whom future technologies and state services are imagined, and who remains excluded and cannot fully be 'datafied.' The paper proceeds in three steps. First, I present some of the solutions proposed by the authors for political and welfare problems. Second, I point to the role of data and quantification in those solutions and the experimental designs proposed to test them. I then turn to the categorisations involved and how the authors narrate their practices of data production, showing how they rely on historical, often problematic, categorisations and practices of quantification. Third, I show how these narratives produce an 'other' while assuming that the solutions provided are both perfectly tailored to the individual and 'neutral.'
Paper short abstract:
This paper historicizes calculations of freedom. I show that early calculators assumed that free societies were made by free citizens, while later calculators assumed that free citizens emerged from an existing free society. I argue that this shift was due to political fears about decolonization.
Paper long abstract:
Over the twentieth century, scientists, engineers, and philosophers struggled to quantify freedom. Merging moral and mathematical arguments, these metrics were used to justify and define the stakes of international intervention in the Cold War. This talk constructs a genealogy of freedom calculations from 1948 to 1990, beginning with calculations done by statistical physicists in the 1940s and ending with the legacy of the widely used metric devised by Freedom House in the late 1970s. By performing a close reading of the technical formulae for freedom, I connect the changing mathematical assumptions of these calculations to the broader political concerns of Western technocrats. I show that geopolitical fears sparked by decolonization drove a marked revision in the freedom calculations. As Western technocrats became increasingly concerned with the economic and cultural status of newly sovereign nations, calculations shifted from describing individual freedom to describing the institutions of a free society. This change in mathematical form belied a new political assumption: that free citizens were the product rather than the producers of a free society. I conclude by drawing a line between these historical calculations and the quantification of freedom today, showing how midcentury political fears have been laundered into scientific metrics used in museums, public education, and politics.