- Convenors:
- Estrid Sørensen (Ruhr-Universität Bochum)
- Olga Galanova
- Discussant:
- James Maguire (IT University of Copenhagen)
- Format:
- Traditional Open Panel
Short Abstract
Data infrastructures have planetary effects, and they are key to knowledge production. This panel examines worldings around data centres and other infrastructures, with a particular focus on how and when their planetary effects and the knowledge outcomes of data practices are co-enacted.
Description
The harmful planetary effects of data centres and the questionable epistemic effects of AI have both gained considerable attention. This panel thinks epistemic and planetary effects together and inquires into the worldings their co-enactments generate. We attend to how knowledge produced by way of AI, computational analytics, big data, etc. is related to the planetary effects of its infrastructures, among them data centres. Not only scholars, but also organisations, discourses, and politics tend to engage with the climate effects of data infrastructures separately from the question of their epistemic outcomes. What are the empirical moments and spaces where epistemic and planetary effects of data infrastructures are enacted together? Examples include popular claims about the equivalence of the power used to cook tea and to run a GenAI prompt; revelations that AI is both driven by fossil fuels and used to discover how to extract oil; and scientists writing sleek code that will both generate better simulations and run on less power. What characterises the worldings in which planetarity and knowing are constituted together, through data-oriented scientific, technological, infrastructural, or other practices? What worldings and practices prevent or restrain their co-enactment? What alternative modes of worlding open possibilities for thinking and infrastructuring otherwise in the face of planetary challenges?
We invite both empirical studies and conceptual reflections on such co-enactments and their worldings. We welcome contemporary as well as historical research. Rather than assuming a linear trajectory from past to future, we propose approaching the temporalities of planetary co-enactments as layered, recursive, and speculative. Contributions that address the entanglements of epistemic, material, planetary, and temporal dimensions are welcome, as are studies that trace how different sites, scales, and temporalities of data infrastructures produce worldings of knowledge and the planet.
Accepted papers
Session 1
Paper short abstract
This paper enriches our understanding of the powerful co-enactments of AI’s epistemic and planetary effects by offering a (queer-)ecofeminist and new materialist analysis of the techniques with which “sustainable AI” technologies are (e)valuated in order to generate ‘green’ data worldings.
Paper long abstract
Artificial intelligence (AI) and sustainability unfold ambivalent intra-actions. While some promise to fix the climate crisis with the help of AI, others emphasize the ecological costs of developing, maintaining, and using AI. At the same time, there is growing (re-)search for so-called “sustainable AI” in industry and science. Against this background, this paper contributes to enriching our understanding of how ecological sustainability is enacted in current, techno-optimistic turns to AI.
Set at the intersection of (queer-)ecofeminism, new materialism, and (e)valuation studies, it offers a “careful analysis” (Puig de la Bellacasa 2011) of the techniques with which “sustainable AI” technologies are (e)valuated in order to generate ‘green’ data worldings. To do so, it engages with two exemplary cases of “sustainable AI” technologies – an app that allows users to rank generative AI tools according to their sustainability, and a ‘green’ data center – asking:
- How is sustainability (e)valuated with regard to these AI technologies, and what are the techniques and methods used to (e)valuate sustainable AI?
- What counts as sustainable, what factors in and what is left out, and who decides what is considered sustainable?
- Which values, norms, and interests are cared for in these (e)valuation procedures, and which are neglected or marginalized?
- What are the onto-epistemological and political assumptions and effects of these forms of (e)valuating “sustainable AI”?
In engaging with these questions, the paper seeks to “generate care” (Puig de la Bellacasa 2011) for the powerful semiotic-material entanglements of technoscience, materiality, and power in more-than-human worlds.
Paper short abstract
The deployment of digital twins in data centers promises to optimize performance efficiency, yet the recursive, data-driven processes of real-time monitoring, simulation, and visualization require substantial computing power, linking operational gains to market logics.
Paper long abstract
While expanding on a planetary scale to serve the growing digital economy, data centers function as laboratories for continuous improvement of performance efficiency, shaped by economic imperatives and regulatory pressures. The recent rapid escalation of AI workloads has amplified energy consumption and introduced new technical challenges for the design, operation, and management of digital infrastructures. Digital twins, dynamic virtual models of data center facilities that integrate real-time operational data with simulation environments, have emerged as tools for navigating this complexity. The incorporation of machine learning and immersive visualization techniques promises more precise, finely calibrated control over infrastructural processes across the facility life cycle.
Drawing on the development and deployment of digital twins in data centers and interviews with designers and engineers, this paper examines how performance efficiency is recursively produced through ongoing measurement, simulation, and operational adjustment. It situates digital twins within a longer lineage of computational modeling practices in the built environment. Extending the digital twin to encompass the full infrastructure, from servers and cabling to cooling systems, requires substantial computing power to maintain high model granularity and constant information exchange between the physical facility and its virtual representation. Gains in system performance, such as thermal dynamics and energy use, are realized alongside the environmental costs of the data practices themselves. Meanwhile, standardizing equipment, protocols, and data workflows enables providers to unify systems on a common platform, coordinate the supply chain, and leverage operational knowledge as a commercial asset, linking efficiency trade-offs to industrial and market interests.
Paper short abstract
This paper argues that encounters between expansionist imaginaries of limitless scaling and restrained computational approaches attentive to ecological limits create epistemic trading zones in which actors negotiate how the material foundations of AI are understood as part of an ethics of design.
Paper long abstract
Even as methane gas turbines and nuclear power plants come online to power AI data centers, the specific environmental harms potentialized by the expansion of AI’s material infrastructure remain largely sidelined in the race surrounding AI. Against such a backdrop, this paper calls attention to encounters between expansionary visions of AI and communities advocating computational restraint as they attempt to generate epistemic resources with which to understand and intervene in AI’s materiality.
Such epistemic resources are currently being constructed and circulated within communities gathered around such concepts as digital sufficiency, digital sobriety, and permacomputing, among other conceptualizations of computing otherwise. Drawing on semi-structured interviews with transnational networks of scholars and activists, alongside document analysis and ethnographic participation in meetings and conferences, this paper highlights what can be understood as epistemic trading zones (Galison, 1997): spaces in which actors from diverse computational backgrounds and normative commitments exchange calculative methods, affective responses to current AI developments, and manifestos calling for political and personal intervention.
By tracing these exchanges, the knowledge practices emerging through efforts to calculate, reduce, or otherwise reconfigure the consumptive relationship between expansionary computing and its environmental costs become central. In the context of friction-filled encounters with visions of industry-driven AI expansion, this paper explores how new forms of design ethics for computing are imagined and enacted when environmental concerns come to the fore.
This research is part of the ERC Advanced Grant project Innovation Residues: Modes and Infrastructures of Caring for Long-Term Environmental Futures (PI: Ulrike Felt, GA 101054580).
Paper short abstract
How do data infrastructures enact the planet in microbiome research? Drawing on ethnography at a European life sciences institution, this paper traces co-enactments across data centres, genomic databases, and AI-driven microbiome prediction.
Paper long abstract
What does it mean to know the planet through its microbes—and at what cost? As global environmental crises intensify, computational biology increasingly positions microbial communities as both indicators of and responses to planetary challenges. The drive to map and know these communities at scale—through metagenomic sequencing, large genomic databases, and satellite Earth Observation—has created opportunities to dream new ways of knowing and governing the planet. Yet the data infrastructures enabling this knowledge are themselves implicated in the planetary conditions they claim to monitor.
Drawing on twelve months of ethnographic fieldwork at a major European life sciences institution, this paper traces the moments and spaces where planetarity is enacted through microbiome digital databases. I follow three such enactments: in the data centre, where a planetary-scale microbiome database physically runs, its hardware warming as distant users query it, drawing on energy infrastructures whose geography mirrors broader geopolitical inequalities; in the work of colour, where fluorescent dyes, satellite spectral filters, and biodiversity visualisation palettes form a chromatic chain connecting molecules to pixels to predictions, dependent on energy-intensive infrastructure at every step; and in the production of synthetic data, where machine learning models trained on sampled landscapes generate predicted microbiomes for unsampled regions, making epistemic ambition and material costs inseparable.
Following an interlocutor's analogy between AI prediction and dreaming, I suggest that the dream of a digital twin of Earth's microbiome is shaped by particular geographies of infrastructure and power, generative of new forms of environmentality—and dreamed at considerable planetary cost.