Accepted Paper
Paper short abstract
Facial AI in biomedicine and healthcare raises ethical and social questions that are framed differently across disciplines. This paper traces controversies over harm, authority and facial proxies, and how ethical repertoires define what counts as "good" care.
Paper long abstract
As AI-based facial technologies are explored for diagnosis, monitoring, triage and identification across biomedicine and healthcare, faces are enacted as a biometric–clinical interface through which clinical judgement, data infrastructures and surveillance logics co-produce one another. Drawing on a mapped corpus of academic publications (2016–2025) addressing the social and ethical implications of facial AI in these settings, this paper reads the literature as a sociotechnical arena of problematisation in which harms, subjects, responsibilities and anticipatory futures are performed.
We show that “ethics” is not a single register but a set of heterogeneous enactments that privilege particular forms of evidence and specify how concerns should be operationalised and audited. Across much of the corpus, ethics is translated into compliance-ready framings (e.g., safety/clinical risk, privacy/data governance, autonomy/consent), while justice-oriented concerns (distributional, epistemic and structural), stigma and appearance norms, dehumanisation, and patient/public perspectives persist as thinner attachments. We conceptualise these patterned differences as ethical repertoires: competing regimes of worth for what counts as “good” care.
Tracing where repertoires intersect and collide, we identify three recurring controversies: (1) what counts as harm (measurable error and risk versus social meaning and lived experience); (2) who is authorised to perform ethical labour in decisions to deploy, pause or withdraw; and (3) facial reductionism, whereby features or expressions are stabilised as proxies for inner states, risk or worth. We argue that these controversies turn ethics into governance devices: boundary objects that delimit what becomes measurable and actionable within evidentiary infrastructures as facial AI is made implementable.
Encoded Bodies: Biometric Medicine and the Surveillance of Human Life
Session 1