- Convenors: Nil Uzun (RWTH Aachen), Jutta Weber (Paderborn University)
- Format: Combined Format Open Panel
Short Abstract
The combined sessions will confront military technoscience and challenge STS to examine algorithmic warfare in terms of autonomy, data-driven targeting, and machine-speed decision-making, as well as its imaginaries and effects, in order to transform critique into action for livable futures.
Description
In recent years, the accelerating development and deployment of algorithmic and computational technologies in military domains, from autonomous weapon systems and drone swarms to network-enabled command and control and data-driven targeting, have brought military technoscience to the forefront of public debate. Meanwhile, defense budgets are rapidly expanding across Europe, and military tech start-ups are gaining global prominence. The result is an expanding zone of “software-based war,” where speed, datafication, and automation are reshaping how violence is programmed and enacted, at times with devastating consequences, and accompanied by ever-increasing societal militarization. Yet the militarization of digital and algorithmic infrastructures, and of society at large, remains undertheorized and often sidelined within the broader STS community.
Therefore, we seek to provoke a critical intervention by asking:
- How do concepts like meaningful human control, accountability, and proportionality travel (or fail) at machine speed?
- In what ways do the logics of speed, data-driven decision making, drone swarms, and experimental warfare challenge STS concepts of agency, responsibility, and non/human relations?
- How do the imaginaries underpinning military AI reproduce or challenge geopolitical and epistemic asymmetries?
- What could the unique contribution of critical (feminist/postcolonial) STS be?
We will address these questions in three sessions, which align with the conference theme of “resilient futures” by insisting that creating livable futures requires confronting military technoscience and its epistemologies, imaginaries, and patriarchal and colonial residues. The two open panels will invite a broad range of empirical, theoretical, and methodological contributions bringing together discussants across STS, critical security/military studies, critical data studies, feminist/postcolonial theory, journalism, and activism. The roundtable will serve as a “call to action,” offering a space to reflect collectively on how STS can engage more systematically with today’s military technoscience through our research agendas and methodological tools, but also considering ways to intervene effectively in the world-making and world-unmaking practices of the military-industrial complex.
Accepted contributions
Session 1
Short abstract
Drawing on theories of the aperture and sociotechnical histories of computer vision, this paper explores whether open source investigations inadvertently reconstruct militaristic modes of seeing and, by extension, reaffirm epistemic asymmetries concerning the recognition of violence.
Long abstract
Situated at the juncture between investigative journalism and human rights work, Open-Source Investigation (OSI) is now well-established as a practice that uses publicly available data – open sources – to investigate disputed events, often in relation to military violence. Highly mediated and technical in its operations, OSI relies on an array of computational tools and infrastructures – which increasingly includes algorithmic technologies such as computer vision – to assemble public data into “anti-hegemonic” narratives that challenge military and state power (Fuller & Weizman, 2021).
Taking a more critical perspective, this paper probes the limits of OSI’s capacity to intervene in the technoscientific imaginaries of software-based war. Our response focuses on how computer vision systems are used to process vast quantities of image data, both in experimental drone targeting systems and recent OSI projects aiming to reveal patterns of violence in conflict zones. Drawing on sociotechnical histories of computer vision (Dobson, 2023) and theories of ‘apertures’ (Amoore, 2020), we explore how such systems function by enabling certain forms of “seeing” while necessarily closing down others.
Taking the Vermeer drone-camera system and the Tech4Tracing OSI toolkit as case studies, we reflect on the technicality of computer vision and the aesthetic forms it produces. In doing so, we contend that it affords a distinctly militarised imaginary of space, bodies, and objects. We suggest that efforts to investigate military operations using such tools can inadvertently reconstruct militaristic modes of seeing and, by extension, reaffirm epistemic asymmetries concerning the recognition and acknowledgement of violence.
Short abstract
Generative AI is transforming military wargaming into a system that produces synthetic futures of war. This paper argues that LLM-enabled wargaming creates a “fiction of control” while embedding algorithmic authority within a new political economy of military AI and software-based warfare.
Long abstract
Contemporary debates on military artificial intelligence largely focus on autonomous weapons and the automation of lethal force. This paper shifts attention to an earlier and insufficiently theorized site of algorithmic warfare: AI-enabled military wargaming. With the integration of Large Language Models (LLMs), wargaming is increasingly transformed into a generative simulation environment that produces not only scenarios but coherent worlds of strategic plausibility, thereby shaping how military futures, threats, and responsibilities are imagined in advance of violence.
The paper conceptualizes this transformation as a fiction of control: a socio-technical formation in which human oversight is rhetorically affirmed, infrastructurally embedded, and visually staged, even as epistemic authority shifts into algorithmic systems. Rather than disappearing, meaningful human control is reconfigured into a relational form that remains symbolically visible while becoming epistemically attenuated. To situate this development within the broader expansion of software-based war, the paper examines the emerging political economy of AI-enabled wargaming. It further analyzes which actors develop LLM-based wargaming systems and how boundaries between technology start-ups, defense contractors, and state actors are increasingly blurred through dual-use narratives and militarized AI infrastructures.
Building on STS scholarship on algorithmic perception and distributed agency and analyses of the technopolitics of war, the paper argues that generative wargaming functions as an epistemic infrastructure of militarized world-making. In this emerging military-AI ecosystem, algorithmic systems do not simply support decisions; they increasingly shape the conditions under which war becomes knowable, imaginable, and governable.
Short abstract
The paper identifies key debates surrounding emerging military technologies and traces counter-mobilization across different organizational forms. It explores how actors frame the problem, envision alternative futures, and articulate notions of justice, accountability, and peace.
Long abstract
This paper presents preliminary findings from the FairComp research project, which explores international initiatives mobilizing against the militarization of computational technologies. As artificial intelligence, computing power, data infrastructures, and autonomous systems increasingly shape contemporary warfare, promising speed and precision, the project investigates how such promises travel and how they are contested across multiple domains. The analysis maps a constellation of actors, including tech workers, whistleblowers, artistic initiatives, advocacy groups, campaigns, and expert networks, who raise concerns about the military applications of emerging technologies, such as autonomous weapons, algorithmic warfare, and so-called killer robots, and it examines the political, discursive, and organizational strategies through which resistance to these developments is articulated.
The paper identifies key debates surrounding military-technological development, particularly in terms of novelties and continuities, and traces the socio-material practices of counter-mobilization across different organizational forms. Additionally, it explores how these actors frame the problem, envision alternative futures, and articulate notions of justice, accountability, and peace. Empirically, the paper draws on qualitative data from participant observation, document analysis, and informative interviews; conceptually, it engages frameworks from social movement theory, global sociology, and STS. The primary goal of this presentation is to expand critical debates on algorithmic warfare beyond regulation of an assumed trajectory, toward understanding the contentious politics that actively question what these systems are for, who sets the terms, and which futures they make possible or foreclose.
Short abstract
CCW deadlock exposes widening fractures between software, hardware, and data in autonomous systems, while semantic battles over “meaningful human control” obscure deeper accountability failures built into their design and deployment.
Long abstract
The stalled progress of the Convention on Certain Conventional Weapons (CCW) negotiations on autonomous weapons exposes a deepening structural gap at the heart of contemporary algorithmic warfare: the divergence between software, hardware, and data. While states debate abstract principles, such as “meaningful human control,” the material infrastructures that enable autonomous targeting—sensor arrays, compute architectures, training datasets, and increasingly automated decision‑cycles—continue to evolve at a pace that outstrips diplomatic deliberation. This gap is not merely technical; it reflects differing ontologies and legal imaginaries. Software logics privilege speed, iteration, and probabilistic decision-making. Hardware development follows geopolitical supply chains and industrial constraints. Data practices shape the epistemic boundaries of what systems “see,” classify, and act upon. Yet CCW debates fold these complexities into semantic disputes over the proper role of the human, obscuring the underlying distribution of agency across computational, material, and environmental systems.
My research highlights how this tripartite gap destabilizes established legal doctrines and risk frameworks. The fragmentation of responsibility across code, components, and datasets erodes accountability, while the ecological and multispecies impacts of sensorized conflict environments remain largely unaddressed. The result is a diplomatic process in which states attempt to regulate a system they do not analytically disentangle, using concepts—control, judgment, intention—that presuppose a human‑centric model already misaligned with planetary‑scale, data‑driven infrastructures.
This panel contribution argues that bridging these gaps requires rethinking legal oversight through ecosystemic, posthuman, and risk‑sensitive approaches capable of addressing how autonomous systems actually operate—rather than how diplomatic language hopes they still do.
Short abstract
The contribution shows how facial recognition evasion rooted in military camouflage reveals hidden military influence on art and social theory. It proposes a framework of secrecy and (in)visibility to examine the interconnectedness of military practices and sociological theory.
Long abstract
Facial recognition evasion tech like Shieldwear (Harvey 2017) artistically disrupts algorithmic control, yet its camouflage roots trace to military origins. This contribution examines such interference moments in digital networks through "a-relationality" as a social mode, foregrounding the military's under-theorized role in sociological inquiry. Shieldwear is just one example of how intertwined military, social, and artistic processes are, and how rarely this is obvious. This contribution addresses precisely that gap. It aims to highlight the significance of the military for sociological theory building by drawing on various imaginaries of secrecy (Caygill 2015; Horn 2011) and (in)visibility (Yamamoto-Masson). A socio-material analysis of military camouflage histories, Shieldwear designs, and subversive art projects reveals mutual causality in networks of agency and patriarchy. Military hegemonies shape disruptions (e.g., evasion as inverted control), while artistic appropriations subvert them into state-eluding resistance. Shedding light on military logics in social theory illuminates actorship beyond human-centered views, challenging patriarchal structures in surveillance societies. This framework urges sociologists to "follow the military" in tech-art entanglements, offering tools for critiquing persistent hegemonies.
References:
Caygill, H. (2015). Arcanum: The Secret Life of State and Civil Society. In D. Dwivedi & S. V. (Eds.), The Public Sphere From Outside the West (pp. 21–40). Bloomsbury Publishing.
Harvey, A. (2017, April 1). Adam Harvey—Stealth Wear. https://adam.harvey.studio/stealth-wear/
Horn, E. (2011). Logics of Political Secrecy. Theory, Culture & Society, 28(7), 103–122. https://doi.org/10.1177/0263276411424583
Yamamoto-Masson, N. (n.d.). On Disappearance – σ and Strategic Withdrawal From Surface Monitoring.