- Convenors:
  - Zeki Can Seskir (Karlsruhe Institute of Technology)
  - Adrian Schmidt (Institute for Technology Assessment and Systems Analysis (ITAS), KIT)
- Format:
- Traditional Open Panel
Short Abstract
This panel invites contributions that critically and creatively explore how quantum technologies are imagined, developed, and governed, and how they might shape or challenge resilient and responsible futures.
Description
Quantum technologies are increasingly positioned as key enablers of future transformations across computing, communication, and sensing. Yet their societal meanings, expectations, and implications remain fluid and contested. This panel invites reflections on how quantum technologies are being shaped as sociotechnical projects, through research practices, policy discourses, infrastructures, imaginaries, and everyday engagements, and how these processes connect to broader discussions about responsibility, resilience, and desirable futures.
We welcome empirical, theoretical, and methodological contributions that examine quantum technologies as emerging sites for rethinking established STS concerns, including expertise, uncertainty, governance, temporality, and power. Possible entry points include (but are not limited to) studies of how “quantum” is made meaningful in scientific, political, or cultural contexts; how narratives of disruption, sovereignty, or sustainability inform their development; and how diverse actors and publics imagine, contest, or co-produce their trajectories.
By opening space for multiple interpretations of “resilience” and “responsibility,” this panel seeks to connect different strands of STS research interested in how technological futures are envisioned and enacted under conditions of global instability. Our aim is to foster dialogue across disciplinary, regional, and thematic boundaries, contributing to EASST 2026’s broader conversation on More than now: Exploring resilient futures.
Accepted papers
Session 1
Paper short abstract
This paper examines visions of quantum computing: what are they envisioned to be, to do, and to do good for? Through an 'ethical vision assessment' I identify dominant visions of quantum computing and explore the ethical issues they raise.
Paper long abstract
Visions of future technologies shape their development trajectories, making them important objects of ethical analysis. This paper examines contemporary visions of quantum computing and analyses the futures they project. Drawing on hermeneutic approaches to technology assessment, I develop an 'ethical vision assessment' to identify dominant visions of quantum computing and explore the ethical issues they raise.
Based on an extensive analysis of vision statements by key actors—focusing on value-laden terms, envisioned applications, associated domains, and anticipated technical affordances—the paper distils four dominant visions. Each positions the societal relevance of quantum computing differently: as a contributor to the public good, a scientific tool, an economic asset, or a geopolitical advantage. Across these visions, shared discursive patterns emerge that warrant ethical scrutiny: a strong techno-optimist orientation feeding into techno-solutionist logic; the prevalence of promissory language legitimising innovation agendas; and a discourse shaped by the interests of a narrow group of powerful actors. Together, these observations form a figurative heatmap of ethical concerns surrounding how quantum computing is currently envisioned, offering entry points for further ethical inquiry.
Paper short abstract
This paper investigates the bad actor narrative in quantum computing. This narrative attributes harms caused by the technology to malicious users with hostile intentions using the technology in unintended ways, rather than to the design, deployment, or intended uses chosen by the developers themselves.
Paper long abstract
The development of emerging technologies is often accompanied by socially constructed narratives that shape perceptions of the technology as well as of ethical responsibility. This paper investigates one particular narrative prevailing, among other fields, in quantum computing: the bad actor narrative. This narrative attributes potential harms caused by the technology to malicious adversaries or users with hostile intentions using the technology in unintended ways, rather than to the design, deployment, or intended uses chosen by the developers themselves. The paper critically examines the ethical implications of the bad actor narrative, arguing that it produces three major problems: it narrows ethical reflection to concerns about misuse, shifts responsibility along the innovation chain, and overshadows broader debates by focusing attention on one particular problem. Drawing on insights from Science and Technology Studies (STS), technology assessment, responsible innovation, and the sociology of expectations, this paper proposes a way to approach this narrative and to create a shared responsibility that recognizes ethical agency at every stage of technological development. By deliberately embedding values into design processes and reclaiming the ethical imagination, developers and researchers can move beyond reactive threat models and actively shape more inclusive, just, and sustainable narratives of technological futures. Rather than viewing responsibility as a burden, this paper argues that it should be embraced as a source of empowerment, i.e. a critical condition for realizing the transformative potential of quantum technologies in alignment with the common good.
Paper long abstract
Quantum information technologies (QITs) – e.g., quantum computers, quantum communication, quantum sensing – are expected to bring profound disruptions in our societies (WRR 2021), constituting key enabling infrastructures for other devices, services, and applications, including AI (Timmers, 2023). In this regard, it is crucial to explore, already today, anticipatory (de Jong, 2022) and precautionary (Taylor, 2020) approaches to the governance of QITs.
Normative proposals already exist (Perrier, 2022; World Economic Forum, 2022; Kop, 2025). Overall, these proposals rest on top-down path-dependent approaches which risk reinforcing today’s multipolar scenario, triggering the geopoliticization and siloing of QITs (Taddeo et al. 2024; Shelley-Egan & Vermaas 2025).
To avoid such trends, I follow up on Calzati and de Kerckhove’s (2024) ideas of QITs as pharmakological technologies (both poison and antidote; cf. Stiegler, 1998) and of a communitarian model for the governance of QITs. Notably, I outline a polycentric governance for QITs, informed by republican approaches to the digital transformation (Susskind 2022; Hoeksema, 2023; Calzati & van Loenen, 2023; 2025). These approaches, indeed, share the same concern for fostering digital polities within/through which mechanisms for guaranteeing power distribution across actors, mutual accountability via checks and balances, and forms of collegial decision and control are systemically devised. In fact, Calzati and de Kerckhove already mention the concept of a “quantum republic”; however, this remains at a high level of abstraction.
Here, I refine this idea, identifying and discussing interdependent sets of principles, bodies, and processes characterising a quantum republic governance model.
Paper short abstract
Our empirical study examines anticipatory governance for quantum computers by demonstrating how diverse stakeholders imagine quantum-resilient futures and contest (or fail to contest) their distributed responsibilities in the transition to quantum-resilient systems.
Paper long abstract
This work explores how the arrival of quantum computers challenges the resilience of our societies. As future quantum computers pose an urgent threat to global cybersecurity, a transition to quantum-safe systems is essential in order to remain resilient against quantum attacks. Such a transition demands coordinated action across a complex system of stakeholders, revealing broader societal, organisational and institutional challenges in developing resilience. This empirical research investigates the Netherlands as a case study in how nations are anticipating and responding to the quantum threat.
Drawing on a participatory mapping workshop held with industry, government and academic experts, we examine the relevant actors, roles and responsibilities in the response to the quantum threat. Our analysis reveals that diverse stakeholders co-construct the quantum-resilient society and identifies highly distributed, interconnected transition responsibilities, several with unclear ownership. We examine the conditions of uncertainty in which organisations must plan their transition, develop two imagined quantum-resilient futures which emerged from the workshop and reflect on their implications for anticipatory governance for the impact of quantum computers.
Framing this transition as a societal challenge, we show how agency, expertise, and institutional coordination collectively shape societal readiness for the quantum threat, allowing us to examine why the current carrots (incentives) and sticks (penalties) are insufficient to prompt widespread transition. We assign the unclaimed responsibilities by advancing recommendations for several transition actors and conclude that, without a solid governance foundation, carrots will remain irrelevant, sticks will remain absent, and the quantum-resilient future will never arrive.
Paper short abstract
Ethical reflection in QT remains dispersed across policy, academia, industry, and community initiatives. Rather than consolidating into a distinct field, “quantum ethics” operates as a distributed governance practice under conditions of early-stage maturity and geopolitical salience.
Paper long abstract
Quantum technologies (QT) are increasingly framed as strategically significant emerging technologies. At the same time, calls for “responsible quantum” and “quantum ethics” are gaining visibility. Unlike artificial intelligence, however, ethical reflection in QT has not consolidated into a recognisable interdisciplinary field.
This presentation examines how ethical discourse around QT is currently structured and institutionalised. Drawing on bibliometric mapping, policy analysis, and review of community and industry initiatives, ethical engagement is shown to be characterised by structural dispersion rather than disciplinary consolidation.
Three dominant approaches can be identified: pragmatic risk governance (e.g., cryptographic disruption and security concerns), normative principle-building (responsible innovation frameworks), and critical contextual analysis (geopolitics, sovereignty, and innovation systems). Ethical reflection emerges across multiple sites — grassroots networks, academic scholarship, multilateral governance platforms, and corporate responsibility initiatives — without a stable institutional centre.
As a result, governance integration often substitutes for field formation. Ethical concerns are embedded within strategic roadmaps, policy architectures, and industry standards rather than institutionalised as an autonomous domain of inquiry.
Quantum technologies thus provide an empirical case for examining how ethical authority, coordination, and anticipatory governance are organised under conditions of early-stage technological maturity and high geopolitical salience.
Paper short abstract
This talk examines artistic narratives of quantum technologies as resources for imagining responsible futures. Drawing on hermeneutic technology assessment, it explores how artists and artistically engaged scientists articulate alternative quantum imaginaries beyond dominant visions.
Paper long abstract
Quantum technologies (QT) are still in an early stage of development, yet they are already accompanied by far-reaching promises concerning computing, communication, sensing, and broader societal transformation. This creates a well-known challenge for STS and technology assessment: while expectations around quantum technologies are highly visible, their concrete applications and social effects remain uncertain. In light of this tension, and against the background of the Collingridge dilemma, this presentation draws on hermeneutic technology assessment to examine how visions, narratives, and imaginaries can serve as productive entry points for engaging emerging technoscientific futures.
The contribution focuses on cultural contexts that have so far received relatively limited attention in STS debates on quantum technologies. It presents insights from practical work with artists who engage with QT, as well as with scientists who also use artistic practices. These actors inhabit a particularly generative space in which alternative meanings, expectations, and futures of quantum technologies are articulated, negotiated, and made sensible beyond dominant policy, industrial, or media discourses.
By analyzing these narratives and visions, the presentation explores how different futures of QT are imagined in present contexts, including personal, artistic, and societal futures. Particular attention is paid not only to questions of plausibility, but also to questions of preferability and desirability. In this way, the talk asks how culturally situated imaginaries may contribute to more responsible and socially robust trajectories of quantum technologies, and how they can enrich STS discussions of governance, anticipation, and the shaping of resilient futures.
Paper short abstract
Drawing on ethnographic research on large technology firms engaged in quantum computing development, the paper explores how imaginaries, strategies, and everyday innovation practices shape paths toward possible quantum futures, thereby navigating uncertainty and negotiating notions of responsibility.
Paper long abstract
While breakthroughs in quantum computing (QC) could have transformative effects across society, reshaping sectors including finance, healthcare and cybersecurity while raising profound questions about privacy, security, and accessibility, many aspects of its practical viability remain uncertain. Although QC is theoretically well understood, the feasibility of scalable industrial applications that outperform classical computing remains unproven. This coexistence of potential, (over)promise, and ontological indeterminacy makes QC a fruitful domain for examining how organizations pursue emerging technologies when neither technical feasibility nor market viability can be clearly determined.
This paper forms part of an ongoing research project examining how large technology firms navigate uncertainty while creating pathways toward possible quantum futures. The study examines how firms construct strategies, allocate resources, and sustain commitment under these conditions, and how these visions and commitments are translated into everyday innovation practices. Because such pathways actively shape technological futures whose societal consequences remain uncertain, questions of responsibility become particularly salient. The presentation therefore focuses on what role responsibility, often articulated through notions such as “responsible innovation” or “responsible design”, plays in guiding these processes. How does it become embedded in innovation practices—whether as ethical reflection, strategic positioning, or a way of legitimizing technological commitments?
Drawing on ethnographic research in large technology firms engaged in quantum computing development, the study examines innovation as an ongoing socio-technical process (Garud & Turunen 2021). By tracing everyday design practices, strategic negotiations, and decision-making, the study shows how firms construct and stabilize pathways toward quantum futures while navigating uncertainty and competing expectations.
Paper short abstract
We examine how the extended developmental phase of quantum computing affects scientists' perceptions of their anticipatory abilities, and how scientific knowledge is affected by "anticipated disruptions" across three scientific disciplines: weather forecasting, public health, and cryptography.
Paper long abstract
In this paper, we examine how the extended developmental phase of quantum computing affects scientists' perceptions of their anticipatory abilities, and how scientific knowledge is affected by "anticipated disruptions" across three scientific disciplines: weather forecasting, public health, and cryptography.
We stand at a unique moment in technological history: quantum computing's transformative capabilities are theoretically established and widely acknowledged, yet practical implementation remains elusive. This temporal gap between knowledge and operationalization creates a distinct sociotechnical phenomenon worthy of systematic investigation, regarding the nexus between uncertainty, preparedness, scientific knowledge, and future imagination. Unlike previous technological revolutions, in which understanding and deployment emerged concurrently, quantum computing forces scientific communities to navigate an extended period of becoming futures and anticipatory governance, that is, planning for disruption that is both emerging and indefinitely delayed.
We investigate how experts across domains and contexts conceptualize quantum computing's timeline and operational readiness; how they balance current research priorities against future quantum-enabled possibilities; what institutional preparations, if any, they are undertaking despite quantum computing's non-operational status; how they navigate notions of uncertainty about implementation timelines while maintaining research agendas and securing funding; and how those contexts shape their assessment of when and whether quantum disruption demands present action, and the way in which doing scientific "knowledge" itself is disrupted.
Paper short abstract
Drawing on the French case, we suggest that "quantum" has become an "order-word" (Deleuze and Guattari) for the state, some scientists and market players. We highlight three of these implications: an antipolitics irreversibilization process, a nationalist fervor and a depoliticization of knowledge.
Paper long abstract
The quantum computer has become a buzzword that has sparked enthusiasm among certain computer scientists and physicists, start-up founders, industrialists, and administrative officials, who have come together to form a “trading zone” (Galison) around the promise of a revolutionary technology that will soon be available. During an investigation of the French “quantum ecosystem”, we realized that the question of “why” quantum computers should exist was never asked and that the notion of a “second quantum revolution” acts as an “order-word”, a collective assemblage of enunciation belonging to indirect discourse (Deleuze & Guattari). Based on document analysis, observations and interviews conducted in a laboratory, within government agencies, and among market players, we examined how such an order-word emerged without public debate. We will highlight three “matters of concern” (Latour):
1-Making the quantum computer irreversible through state investment: We have analyzed the rhetoric of irreversibility, and the reasoning behind start-up roadmaps and the technology readiness levels method.
2-Technonationalism disarranging Europe: Unlike works that have shown how Europe has been built through infrastructure, investment in digital technologies, and regulation, the global race for the quantum computer reveals efforts to undo Europe. We will show France’s imperialist vision of Europe and the envenomed notion of digital sovereignty.
3-Depoliticization of knowledge: We will illustrate how scientists limit their reflexivity by creating boundaries between their work and the uses to which it could be put.
We aim to redirect our collective capacities towards democratic issues and to set the history of computing on a more reflexive course.
Paper short abstract
This article presents a Constructive Technology Assessment-oriented analysis of researchers' perceptions and assessments of quantum technology at its current stage. The results show continuous assessment through meaning-making, risk-benefit attribution, and perceptions of technological readiness.
Paper long abstract
This article presents a Constructive Technology Assessment (CTA)-oriented analysis of how quantum technology (QT) is perceived, interpreted, and assessed by researchers at its current stage of development. Positioned as a preliminary CTA analysis, the study conceptualizes the research domain itself as a primary site of ongoing assessment, rather than examining societal impacts or developing socio-technical scenarios. Based on qualitative interviews with QT researchers across diverse institutional and national contexts, the article explores how researchers informally assess QT through their everyday research practices. Building on scholarship on technology futures, the analysis shows that researchers engage in continuous, implicit assessment of QT development along three interrelated dimensions: meaning-making (how QT is framed and positioned), risk-benefit attribution (how uncertainties and promises are negotiated), and perceptions of technological readiness (how progress and maturity are interpreted). The results indicate that meaning-making around QT is characterized by a tension between wide promises that generate uncertainty and pragmatic interpretations that restrict QT’s potential. Additionally, risk and benefit assessments are clustered around three overlapping modes of evaluation: objective framings, positional interpretations, and power-sensitive perspectives. Lastly, perceptions of technological readiness reveal competing temporal imaginaries within the research community, ranging from expectations of short-term demonstrators to long-term infrastructural transformation. Therefore, understanding these early-stage assessments provides a necessary foundation for subsequent CTA work that may engage broader societal actors, governance processes, and scenario-building exercises.
Paper short abstract
The paper examines the novel medium of “quantum art” as a superstructural expression of current quantum policies. Wrapped around a case study of artworks by Refik Anadol, Pierre Huyghe and Black Quantum Futurism, the presentation interrogates the risks and potentialities of quantum technologies.
Paper long abstract
“The Second Quantum Revolution” designates the current era of Quantum Technology, marked by the operationalisation of the principles of quantum mechanics to perform complex computational tasks and solve problems intractable for classical computers. Frequently framed as an “uncharted new territory” sparking the “race to win technological advantage,” the technology is, however, still at the NISQ (Noisy Intermediate-Scale Quantum) stage. The chief problem Quantum Technologies are facing is noise, which pushes a quantum system into decoherence, for example through interference from the outside milieu. Despite these concerns, Quantum Technologies are presented as the only possible, inevitable future.
The paper examines what is currently entailed by “quantum (media) art,” which either leverages the principles of quantum mechanics for artistic purposes while remaining squarely within the constraints of current quantum policies, or treats quantum technologies as a conceptual provocation, allowing a disruptive rethinking of the pursuit of technological progress. Wrapped around a case study of digital artworks by Refik Anadol, Pierre Huyghe and Black Quantum Futurism, the presentation is pre-emptively asking: What could be the possible shape of these new communications technologies? Would these be McLuhan’s pull or push media? And with what politics—with a radical edge serving social justice, decolonisation and anti-capitalism? Or an extension of the ruling techno-feudalism and imperial mindset?
Paper short abstract
Quantum communication promises to transfer trust from human organisations to physics. Drawing on ethnographic research conducted across European labs as part of the European Quantum Communication Infrastructure project, I show how photons become trusted actors in secure networks.
Paper long abstract
Communications involve connecting two or more parties through a network of actors mediated, generally, by a telecommunications provider. Since the end of the last century, however, a new quantum communications technology has begun to be developed which, in theory, shifts trust away from external human organizations toward the physical properties of quantum mechanics, particularly the characteristics of light, such as the no-measurement and no-cloning theorems.
Based on ethnographic research conducted in several laboratories across Europe, including interviews, document analysis, participant observation, and participation in conferences and seminars, I analyze how the European Quantum Communication Infrastructure (EuroQCI) is being built. The results show that this technology emerges as a mechanism to ensure that the control and interception of information are restricted only to the agreed actors. To achieve this security, light and photons become the “trusted” actors that guarantee it.
Through this case, I show how different actors attempt to build trust by positioning light as a reliable intermediary in quantum communication systems. Constructing photons as trustworthy actors requires assembling a heterogeneous network of components: single-photon detectors, optical fibers, satellites, quantum repeaters, cryogenic systems, and even classical communication channels, whose coordination must remain largely invisible for the promise of secure communication to hold. In this sense, physics does not replace the politics of trust; rather, it relocates and redistributes it across an assemblage that is, paradoxically, far more human than its promoters often acknowledge.
Paper short abstract
European quantum governance promises resilient futures but deploys instruments that undermine the collaborative ecosystems on which quantum innovation depends. We trace this temporal paradox and show how sovereignty prioritisation turns the pursuit of resilience into a mechanism of fragility.
Paper long abstract
Quantum technologies challenge governance through inherent uncertainty and hard-to-predict development trajectories. European decision-makers and organisations such as the OECD and WEF call for anticipatory governance, grounded in the assumption that proactive instruments can secure resilient technological futures by addressing risks attributed to the technology's assumed dual-use character while protecting strategic autonomy. We argue that the governance instruments deployed to fulfil this promise systematically destabilise the conditions they are built on.
Drawing on European quantum technology policy and strategy documents issued between 2016 and 2025, we trace a temporal paradox in governance design. The 2016 Quantum Manifesto positioned openness, collaboration, and transnational research coordination as foundations of European quantum capability. Within two years, governance priorities shifted toward export controls, strategic autonomy frameworks, and sovereignty-driven national strategies that thwarted the collaborative structures the Manifesto had introduced. Governance designed to protect future resilience fragmented and hindered present-day research ecosystems even as those ecosystems were being built.
This paradox is not a policy mistake but a structural feature of how anticipatory governance operates under radical technological uncertainty. When governance systems attempt to govern emerging technologies, they default to categorical instruments inherited from previous technological eras, such as dual-use frameworks. Quantum governance reveals this pattern sharply: governance interventions that attempt to solve a problem by undermining its solution base.
We show in which cases pushing resilience becomes a mechanism of fragility, and how the prioritisation of sovereignty over collaboration, or collaboration over sovereignty, reshapes governance capacity in ways that neither orientation alone can anticipate.
Paper short abstract
AI's high clinical potential and low adoption have created a technology adoption chasm. Quantum technology has the potential to remedy many of AI's problems, yet it suffers from a similar gap between potential and reality. This paper investigates this problem by interviewing quantum scientists and medical practitioners.
Paper long abstract
The use of Artificial Intelligence (AI) in clinical settings holds immense promise, from exponential increases in diagnostic accuracy to the unburdening of clinicians’ administrative tasks. AI revolutionizing every facet of medicine is such a common talking point that fears of a “great doctor replacement theory” are taking hold. However, the replication crisis affecting scientific research is miring the translational prospects of medical AI. Very few applications are approved for clinical settings, and coupled with inherent issues plaguing AI models – bias, explainability, curve-fitting, the lack of a universal set of ethical principles guiding their development, and the lack of agreement on the distribution of responsibility – this poses serious challenges for clinical adoption. Finally, the immense compute requirements, reaching a trillion-plus tokens to train certain models, pose serious roadblocks to AI development. All of these are problems Quantum Computing (QC) has the potential to solve, due to its ability to exponentially increase computing power by leveraging quantum mechanics and entanglement as a novel means of computing, with simultaneous rather than consecutive operations, which cannot be achieved with classical computing. Yet the QC promise faces several challenges of its own: the statistical nature of its results, noise-prone processes that have yet to be fully error-corrected, and the inaccessibility of its hardware due to high costs, complexity, and the expertise required to conduct Quantum Machine Learning research. These create monumental barriers to the interdisciplinary research necessary to harness QC’s potential in medicine, ultimately prompting similar conversations about QC’s potential and its ability to leap over the AI adoption chasm.
Paper short abstract
This paper explores how post‑quantum cryptography enacts risk as a sociotechnical futuring practice, revealing how openness, secrecy, and responsibility are negotiated as values in envisioning quantum and post‑quantum futures.
Paper long abstract
The development of quantum computing technologies has intensified concerns about the future durability of current cryptographic infrastructures. Post-quantum cryptography (PQC) has consequently emerged as a field dedicated to developing cryptographic schemes resistant to potential quantum attacks. While PQC is often framed as a technical response to a calculable future threat, this proposal approaches quantum risk as a sociotechnical problem in which futures, values, and responsibilities are actively negotiated. Inspired by an STS perspective and sociocultural approaches to risk, this proposal treats risk not as an objective property of hazards but as a social practice through which dangers are interpreted and made meaningful. It thus emphasizes that risk emerges from social norms, institutional arrangements, and shared worldviews that shape which dangers become salient and who is responsible for addressing them, rather than from probabilistic assessments alone. In this sense, risk discourses do not only describe hazards but participate in stabilizing particular social orders and technological futures. Empirically, the focus is on the field of PQC as a site where possibly contrasting epistemic values within quantum technologies are re-negotiated. Here, the speculative threat of quantum computing destabilizes 'established' epistemic values within cryptography. Classical cryptographic practice has long privileged openness and transparency as the basis for trustworthy systems, while quantum imaginaries foreground secrecy, strategic anticipation, and pre-emptive security. This proposal examines ethnographically how practitioners negotiate these values while enacting quantum and post-quantum futures.