- Convenors:
- Claudia Aradau (King’s College London), Tobias Blanke (King’s College London), Annalisa Pelizza (University of Bologna and University of Aarhus), Dieuwertje Luitse (University of Amsterdam)
- Chairs:
- Tobias Blanke (King’s College London), Dieuwertje Luitse (University of Amsterdam)
- Discussants:
- Annalisa Pelizza (University of Bologna and University of Aarhus), Claudia Aradau (King’s College London)
- Format:
- Traditional Open Panel
- Location:
- Theater 1, NU building
- Sessions:
- Friday 19 July, -, -
Time zone: Europe/Amsterdam
Short Abstract:
How do digital technologies inscribe lines of difference, and enact figures of the other? How do those rendered as ‘others’ critique, appropriate, repair or resist these technologies? The panel invites contributions that explore the ambivalent connections between digital technology and otherness.
Long Abstract:
The politics of technology is strewn with figures of otherness, which delimit the human from the inhuman, subhuman, infrahuman or nonhuman. Digital technologies have been shown to intensify and solidify lines of difference and alterity. They enact others in new ways and remake normality permanently. They enable new means of identifying others as suspect, risky or dangerous. At the same time, digital technologies encounter practices of critique, as they are variously contested, appropriated, maintained or repaired. They can become tools of creating frictions and contesting distinctions, of enabling refusal and resistance.
This panel invites critical inquiries into the technopolitical production of otherness – human or other-than-human – and explorations of political engagement with technology by those enacted as ‘others’. From AI-inscribed biases to exclusionary welfare, from sensing to the securitized processing of citizens, we propose to map the technopolitics of otherness in the digital age. We aim to understand the ‘scripts of alterity’ that underpin digital technologies and data infrastructures, namely the assumptions about who others are, their capabilities and limitations (Pelizza and Van Rossem, 2023). These scripts have performative effects and can be met with resistance, dis-inscription or transformation.
We encourage contributions that attend to the continuities and discontinuities that digital technologies and data infrastructures bring about in the technopolitical enactment of otherness. What kinds of scripts of alterity do digital technologies and data infrastructures (re)produce and how do these scripts intersect, build on or diverge from broader governmental apparatuses to manage the conduct of others? We welcome contributions that critically address conceptual and methodological challenges, and open towards horizons of freedom and living otherwise with digital technologies. We are especially interested to hear from diverse global contexts that help us understand not just similarities and dissimilarities in the technopolitics of otherness, but also circulations of scripts and technologies.
Accepted papers:
Session 1: Friday 19 July, 2024
Paper short abstract:
This contribution, forming part of my PhD thesis, takes a historical approach to show the (re)production of austerity in the British digital welfare state, predominantly (but not exclusively) its (re)production as intersectional inequality.
Paper long abstract:
Learning from scholars who have focused on the United Kingdom as well as other jurisdictions that innovate in the spirit of new public governance and new public analytics, this contribution takes a historical look at the automation of social security tasks in the United Kingdom, focusing on the proprietary nature of the expertise and the infrastructures that underpin social security administration. Drawing on primary sources from The National Archives, a historical approach to the British digital welfare state uncovers the computational foundation of the shift from welfare to workfare. Another way in which austerity was encoded is shown through examples of automation that replace discretion in administrative practice, and through reforms such as Universal Credit that introduce indebtedness to the existing policy of (exclusionary) deservingness.
In the spirit of risk management, benefit fraud risk assessment and unemployment risk assessment are analysed, noting how the use of machine learning systems in social security helps us see the (re)production of intersectional inequality. The application of segmentation, a technological method often perceived as neutral, is shown in relation to the segregation it breeds. Taking a historical approach to the British digital welfare state helps us see how the (re)production of exclusionary welfare relates to dependency and coloniality. Noting the shift from the British Empire to the British welfare state, my contribution helps us read austerity in its (dis)appearance as intersectional inequality, and through other (re)forms and computational forms of othering (re)produced in the (infra)structures of the British welfare state.
Paper short abstract:
This paper conceptually explores the construction of ‘data bodies’ as technopolitical enactments of constituted medical datasets, machine-learning models and infrastructures, and their role in producing categories of ‘others’ on which (biased) AI-based medical decisions may be made.
Paper long abstract:
The growing development and implementation of Artificial Intelligence (AI) technologies in “high-stake” domains such as medicine has increasingly raised concerns over the bias and social inequalities these systems risk reproducing. To better understand and mitigate these issues, much critical work in AI ethics has primarily focused on scrutinising single components such as the medical datasets used to train and test machine-learning models in the field. However, while such elements are vital, AI-inscribed bias may emerge through a series of entangled processes that underpin medical AI development pipelines: the construction and annotation of datasets as well as model production and evaluation. It is therefore critical to further investigate these situated practices and how they produce categories of ‘others’ on which biased AI-based medical decisions may be made.
Following these observations, this paper takes a combined Critical AI and STS research approach to conceptually investigate the construction of ‘data bodies’ as technopolitical enactments of constituted medical datasets, machine-learning models and their underlying infrastructures for AI-based medical decision-making. Drawing on the notion of the ‘body multiple’ (Mol, 2002), data bodies are constituted and always ‘oscillate between multiplicity and singularity’ (Bucher, 2018). Exploring their construction, I argue that medical dataset construction and annotation protocols can be understood as ‘scripts of alterity’ (Pelizza & Van Rossem, 2023) that reveal various ideas and assumptions about medical conditions and patients. As such, they participate in the continuous AI-induced reconfiguration of the lines between self and the various others that are potentially targeted for medical treatment.
Paper short abstract:
This paper explores how older (non)users of digital technologies remain the epitome of the “other” in Western digitized societies. We show how discourses and practices of digital inclusion in France reinforce the otherness of this public, which is both omnipresent and invisibilized.
Paper long abstract:
This paper proposes to study the relationships between aging and digital technologies by showing how, despite increasing Internet usage rates among older adults, they remain, in discourse and in practice, the “others” par excellence of digitalized societies. Based on a study of online violence experienced by older adults in France, we explore the way in which the figure of the elderly user is both invisibilized and omnipresent in the making of digital inclusion. Just like those of the youngest generations, the digital uses of the oldest are thought of as needing to be supervised through a paternalistic prism, according to standards prescribed by the “non-others”, that is to say, middle-aged adults. While young people are an explicitly identified public whose digital education is carried out in a structured manner within schools, older people are less often the subject of specific measures. The French digital inclusion policies implemented in recent years are aimed at all people concerned by the digital divide, but the majority of the users of these services are elderly people, and particularly isolated senior women. However, professionals are not always trained to cater to the specific needs of this public, which remains “unthought of”. These negative representations and practices surrounding older people's digital uses have performative effects: a poor self-image and a fear of tools can slow down the development of uses and reinforce exclusion, particularly in a context where elders, unlike other socially marginalized groups, are rarely mobilized politically on these issues.
Paper short abstract:
This research aims to critically analyze how the data of pregnant women is utilized and commodified differently from the data of their male partners, and how these women are enacted as the other in this process, through an extended case study of a Chinese parenting app.
Paper long abstract:
In 2023, a few users of Douyin (the Chinese version of TikTok) reported an unsettling phenomenon: when a heterosexual couple registers together on a Chinese parenting app, they are assigned a “mom in pregnancy” tag and a “dad in pregnancy” tag respectively. Based on these tags, more baby products are shown to the mother, while sexually suggestive content and advertisements for prostitution are shown to the father. This paper aims to critically analyze this case within the framework of critical data studies, with a particular focus on how the data of pregnant women is utilized and commodified differently from the data of their male partners and how the women are enacted as the other in this process. The commodification of “pregnant” data in dyadic relationships is a historical phenomenon in China, as some netizens have reported that one common spot for the distribution of prostitution flyers is the gate of obstetrics and gynecology hospitals. However, it remains understudied how big data has reshaped and exacerbated this process. Drawing on previous literature in critical data studies, Marxist feminism, and digital capitalism, this research aims to discuss a) how data collected from women in a dyadic relationship is commodified and what the follow-up actions are; b) how digital capitalism, if not all capitalism, plays a role in perpetuating the otherness of women; and c) how digital capitalism otherizes pregnant women from discourses about sex, sex workers from the family, and all women and men from the digital capitalist system.
Paper short abstract:
With the case of Rohingya refugees in Malaysia, this paper critically examines the constructions of alterity and biopolitics inscribed into assemblages of governing technologies, like the compulsory MySejahtera contact tracing app during the COVID-19 pandemic.
Paper long abstract:
Akin to other governments around the globe, Malaysia responded to the COVID-19 pandemic with an assemblage of emergency public health policies, movement restrictions, and digital technologies. Besides decidedly analogue measures like barbed wire in the streets, this also included the country's compulsory contact tracing app MySejahtera, which the government touted as a form of digital ‘participatory surveillance’ that would allow citizens to help contain the spread of the virus.
MySejahtera’s Bluetooth tracing and QR-code check-ins in public spaces made the majority of the population digitally visible, and this visible majority was thus gradually allowed more mobility after the initial lockdowns. The same technology, however, made populations like the unrecognised Rohingya refugees less visible and hence exposed them to increased movement restrictions, accompanied by online vitriol casting refugees as contagion clusters.
Drawing on our analysis of interviews with Rohingya refugees, government statements, and documentation materials of the MySejahtera app, we argue that the socio-technical imaginary underpinning MySejahtera constructs its intervention target in epidemiological terms as a population under viral threat, spatially (co-)present within Malaysia's borders. However, the (pre-pandemic) construction of refugees as a potentially dangerous, more contagious ‘other’ is equally inscribed in both the contact tracing app and complementary public health measures (like the enhanced movement restriction zones that targeted areas with large refugee populations).
We thus propose to understand apps like MySejahtera as technologies of power that co-produce new subjectivities of populations under viral threat, and, simultaneously, re-enact and enforce alterity for those not granted the ‘privilege’ of digital ‘participatory surveillance’.
Paper short abstract:
The paper discusses data sharing practices on human mobility in and from West Africa. Drawing on extensive interviews, it offers empirical insights into national and regional data sharing systems. Emphasis is placed on the influence of risk, potentially leading to exclusionary practices.
Paper long abstract:
While the datafication of human mobility is a globally pervasive trend, it remains deeply influenced by contextual nuances that offer valuable insights into the (dis)continuities and ambivalences of this process. Starting from an infrastructural understanding of data circulation, this paper aims to present a cartography detailing the actors, data flows, and infrastructural entanglements shaping the datafication of human mobility in and from West Africa.
Drawing upon a hundred interviews conducted across Senegal, Mauritania, and Ivory Coast, this study seeks to provide an empirical contribution to the understanding of data sharing systems pertaining to mobility at both national and regional levels. By tracing the fragmented yet interconnected threads of this evolving landscape, it sheds light on the rapidly changing dynamics therein.
The analysis unfolds along three key axes. Firstly, it delves into the circulation of data concerning human mobility, spanning civil registries, bordering practices, and humanitarian assistance, discussing the ambiguities of this continuum. Here, a comprehensive mapping of involved actors, as well as the programs, agreements, and digital infrastructures mediating their interactions, is proposed.
Secondly, the study explores the diverse and overlapping purposes driving data circulation (statistical predictions, situational awareness, prevention strategies, identification processes). Emphasis is placed on how the notion of risk shapes such circulation, potentially resulting in marginalising, exclusionary and de-humanising practices.
Finally, situated within a region marked by porous borders, this analysis aims to reflect on the inherent frictions of the coexistence of datafied and alternative forms of mobility governance, often resulting in simultaneous data voids and overflows.
Paper short abstract:
The paper asks how the adoption of electronic IDs combined with biometric data enacts, translates, reproduces or obscures struggles over identity, deliberate ambiguity, belonging and othering in Libya along cleavages of ethnicity, mobility, and loyalty, based on qualitative evidence.
Paper long abstract:
The identity of Libyans has long been a thorny issue, linked to (post-)colonial legacies as well as territorial and distributive access rights. Under the Gaddafi regime, the politics of identity was shaped by arbitrariness and authoritarianism, which allowed for flexibility and resulted in a constant legal grey area. Since the 2011 revolution, the new Libyan authorities, backed by foreign partners, have resolved to address the issue by technocratic means. These included the planned adoption of electronic IDs combined with biometric data, stored in a digital database and connected to border authorities, migration management, health services, as well as electoral and civil registries (ElAswad & Jensen 2016; Abdullah et al. 2019). Amidst a fragile transition to a liberal state order, such a technological and legal infrastructure sits at the intersection of several contentious governance domains, including border controls, migration management, and electoral systems. The paper explores the inherent tensions between the governance of infrastructure and governance by infrastructure (Pelizza 2023) in this unfinished process of state-building. It asks how the infrastructure enacts, mediates, translates, reproduces or obscures struggles over identity and deliberate ambiguity, as well as belonging and othering, most notably along the cleavages of ethnicity, mobility, and loyalty. To this end, the paper discusses qualitative evidence (to be) collected through interviews with international and domestic actors in Italy and in Libya.
Paper short abstract:
Far from being just a political ideal of public consultation, democracy is an infrastructure that travels and transforms through history. In the 1960s, democracy ‘going South’ led to the invention of civil biometrics: the birth of a system of remote surveillance of Africa's new citizens.
Paper long abstract:
It is shared knowledge among ordinary people, academics, and journalists that the use of biometrics to register voters in Africa (the subject of my doctoral dissertation) is ontologically contrary to the ethics of normal democracy. Biometrics seems to be a surveillance technology that has nothing in common with empowerment, voting and democracy. In this presentation, I illustrate the emergence and globalization of biometric technologies as the result of the globalization of democracy in the 1960s. In that decade, African countries achieved independence from colonial powers, began to establish the infrastructure of the modern state, and governed themselves through universal suffrage. The arrival of democracy in Africa also marked the entry of African peoples into the global citizenship regime. Drawing on interviews, ethnographic observations, and institutional literature, I first illustrate elements suggesting that the Africanization of democracy was the driving force behind the rise and reinvention of biometrics as a tool for managing the identities of civilian citizens (and not just criminals, as had been the case until then). Second, I illustrate the rise of the biometrics industry as a tool for managing two sets of tasks related to sovereignty: democracy in Africa (voting and the management of identification documents) and the control of European borders, and thus the management of the circulation of African populations. The presentation contributes to reframing democracy as an industrial power structure that induces the production of technologies of otherness.
Paper short abstract:
In this paper I argue that to fully understand how digital technologies function, we should move away from elite contexts and look at how they function in majority worlds. It is exactly in such contexts that we will see how we can move towards more just tech, for all.
Paper long abstract:
In this paper, I rethink residents of Kibera, a low-income neighborhood in Nairobi often referred to as ‘Africa’s largest slum’, as active agents in a more sustainable development and uptake of new digital technologies. Based on nineteen months of fieldwork in Nairobi, and more specifically in Kibera’s tech environments, I show how Kibera has become fully digital, yet its inhabitants are often still considered passive users, marginal or peripheral (‘Other’) to the tech ecosystem. Nairobi is considered one of Africa’s leading tech ecosystems, and as such an example of innovation and digital ICT development. Yet, at the same time, it is estimated that sixty percent of Nairobi inhabitants reside in low-income areas. In many tech ecosystems, these people are considered ‘in need’, and technology is seen as a way to ‘disrupt’ their predicaments for the better. To counter this narrative, I argue that low-income area residents are not the ‘Other’ at all, but instead the majority. In this way, understanding how digital developments play out in ‘majority worlds’ is crucial to understanding how technology is taken up and appropriated away from elite contexts. Building on the example of how young people employ ‘YouTube University’ as an imagined alternative to formal education, I make the case that ultimately all technology is situated, and as such, leaving out majority world residents and their agentive uptake would be a crucial mistake if we want to move towards more just technologies.
Paper short abstract:
The EU’s AI Act has provoked strong resistance from civil society actors concerned with digital rights, migrant rights, and racial discrimination. We analyze NGOs’ accounts of acting as spokespersons and political representatives for people on the move and of contesting hegemonic scripts of alterity.
Paper long abstract:
The Council of the European Union and the European Parliament’s negotiators reached a provisional agreement on the AI Act on 9 December 2023. The AI Act had provoked strong resistance from civil society actors concerned with digital rights, migrant rights and racial discrimination since the regulatory process started in 2021. NGOs joined forces in the alliance “Protect Not Surveil” and aimed to influence the political negotiations, becoming enrolled as spokespersons for people on the move and fundamentally contesting assumptions about “the other”.
In this paper, we explore controversies about the AI Act, a regulation that is closely entangled with superimposed technopolitical networks of migration and border control regimes and that has provoked the political engagement of civil society actors. Drawing on Akrich (1992) and Lee and Brown (1994), we conduct an analysis through the lens of scripts and scripting, studying both hegemonic scripts of alterity and counter-attempts at “re-description” through civil society engagement. Based on the analysis of documents and interviews, we first study the dominant scripts of alterity in AI Act regulatory documents that define the relations between AI, security, and marginalized populations. Second, we examine NGOs’ attempts and claims to dismantle, diversify and ‘re-describe’ these scripts of alterity and analyze how NGO staff understand their relationship and responsibility to advocate for people on the move. Third, we analyze the ambiguous role of NGOs as representatives of othered non-EU citizens who do not necessarily aspire to be represented or to lend their voice. Finally, we discuss careful approaches to enrollment and spokespersonship.