Accepted Paper
Paper short abstract
Predictive AI tools promise more efficient healthcare but are tightly regulated in the EU. Studying “in-house” regulation around AI implementation in Danish hospitals, this paper shows how compliance with data protection might sideline debates about the epistemic ethics of precision technologies.
Paper long abstract
In healthcare, algorithms developed under the label of “AI” are expected to predict future medical events more accurately than clinicians. As precision technologies, predictive algorithms promise to improve clinical decision-making and enhance the efficiency of healthcare delivery. However, the EU AI Act classifies many such healthcare applications as high-risk, subjecting them to extensive regulatory oversight that clinical environments often experience as an obstacle to research and innovation assumed to benefit patients. In 2023, the European Commission’s Medical Device Coordination Group issued guidance allowing hospitals and other health institutions to manufacture and use medical devices—including AI-based tools—on a non-industrial scale within the same legal entity, exempting these “in-house” devices from the requirements set out in the Medical Device Regulation. Drawing on qualitative interviews with regional employees who approve AI-based tools and with clinicians working with a predictive algorithm in Danish healthcare, this paper explores the ethical tensions that emerge when predictive tools are deployed under the in-house provision. We show that regional employees orient toward compliance with data protection and privacy legislation in order to avoid public controversies around data misuse. This compliance-driven focus, we argue, risks sidelining broader debates about the epistemic ethics of precision technologies: the knowledge predictive algorithms produce, the clinical practices they foster, and the purposes this knowledge can legitimately serve. In the absence of institutional spaces for ethical deliberation, employees may resort to collective ignorance, avoiding involvement with predictive tools they suspect could enable ethically problematic practices such as the deprioritization of patients at the margins of life.
Technologies of precision: Exploring the meanings, practices, and politics of precisioning tools across healthcare, agriculture, and warfare.
Session 1