Accepted Contribution
Short abstract
The CCW deadlock exposes widening fractures between software, hardware, and data in autonomous systems, while semantic battles over “meaningful human control” obscure deeper accountability failures built into their design and deployment.
Long abstract
The stalled progress of the Convention on Certain Conventional Weapons (CCW) negotiations on autonomous weapons exposes a deepening structural gap at the heart of contemporary algorithmic warfare: the divergence between software, hardware, and data. While states debate abstract principles such as “meaningful human control,” the material infrastructures that enable autonomous targeting—sensor arrays, compute architectures, training datasets, and increasingly automated decision cycles—continue to evolve at a pace that outstrips diplomatic deliberation. This gap is not merely technical; it reflects differing ontologies and legal imaginaries. Software logics privilege speed, iteration, and probabilistic decision-making. Hardware development follows geopolitical supply chains and industrial constraints. Data practices shape the epistemic boundaries of what systems “see,” classify, and act upon. Yet CCW debates fold these complexities into semantic disputes over the proper role of the human, obscuring the underlying distribution of agency across computational, material, and environmental systems.
My research highlights how this tripartite gap destabilizes established legal doctrines and risk frameworks. The fragmentation of responsibility across code, components, and datasets erodes accountability, while the ecological and multispecies impacts of sensorized conflict environments remain largely unaddressed. The result is a diplomatic process in which states attempt to regulate a system they do not analytically disentangle, using concepts—control, judgment, intention—that presuppose a human‑centric model already misaligned with planetary‑scale, data‑driven infrastructures.
This panel contribution argues that bridging this gap requires rethinking legal oversight through ecosystemic, posthuman, and risk-sensitive approaches capable of addressing how autonomous systems actually operate, rather than how diplomatic language hopes they still do.
Confronting military technoscience: STS, algorithmic warfare and livable futures
Session 1