- Convenors:
- Bettina Paul (Universität Hamburg)
- Torsten Voigt (RWTH Aachen University)
- Larissa Fischer (RWTH Aachen University)
- Stream:
- Measurement, commensuration, markets and values
- Location:
- Bowland North Seminar Room 10
- Start time:
- 27 July, 2018 (time zone: Europe/London)
- Session slots:
- 2
Short Abstract:
Technologies that measure the human body are increasingly used to verify or falsify a person's truth claims. The panel seeks to discuss the technological measurement of truth, its social framing, and its societal, ethical, and legal implications.
Long Abstract:
To distinguish between truth and lie is a key aspect of social life and highly relevant in domains such as court trials, border controls, immigration procedures, and public hearings. In recent years, technologies that measure the human body have increasingly been used to verify or falsify a person's truth claims, e.g. through credibility assessment, lie detection, DNA family testing, linguistic analysis, or age assessment. These truth verification approaches often rely on biometric as well as other technological measurements that collect physiological, auditory, or behavioural data.
The panel seeks to provide a forum to discuss the technological measurement of truth, its social framing, and its societal, ethical, and legal implications. It seeks to map out different technological truth assessments in a comparative, historically and philosophically informed way, and addresses questions such as: Which desires, theories, and beliefs are inscribed in these technological systems? What kinds of sociotechnical imaginaries can be extrapolated? How relevant is the use of algorithms and 'big data' for these assessments? What role do classic inscriptions of neutrality, mechanical objectivity, and scientific authority play in different practice settings, especially when it comes to legitimating new procedures? How do new technologies such as genetic tests or neuroscientific assessments relate to more traditional truth assessments such as blood analysis or polygraph tests? The panel also invites papers that look at the non-knowledge produced in the course of truth measurements.
Accepted papers:
Session 1

Paper short abstract:
The transnational exchange of DNA data to fight crime is being reconfigured as a kind of "truth machine". This paper explores the sociotechnical imaginaries that emerge as technological measurements of "genetic truth" are articulated with multiple notions of personal data protection.
Paper long abstract:
The simultaneous localisation and globalisation of 'terrorist threats' and cross-border criminality has pushed the deepening of cross-border police and judicial cooperation back up political agendas in the EU. In this scenario, the expansion of technological systems for the large-scale exchange of DNA profiles plays a pivotal role. This paper takes as its empirical example the development and expansion of a technological system that mandates the automated exchange of DNA data between EU Member States. We explore the sociotechnical imaginaries that emerge as technological measurements of "genetic truth" are articulated with multiple notions of personal data protection.
Based on the analysis of a set of interviews with professionals involved in a networked system for the transnational exchange of DNA data to combat crime and terrorism, we address two main aspects: first, how these networks make it possible to account for both stability and instability in the technological measurement of "genetic truth"; second, the societal and ethical implications of transnational flows of DNA data. We conclude that the sociotechnical imaginaries at play amplify inscriptions of neutrality, mechanical objectivity, and scientific authority, while at the same time diffusing responsibility and depoliticizing power by making its actions opaque or invisible.
Paper short abstract:
A controversial technology for the age assessment of asylum seekers was implemented in Sweden in 2017. By deconstructing its design, and combining insights from STS with studies of the social construction of children and age demarcations, the paper shows the assumptions inscribed in the technology.
Paper long abstract:
How to determine whether an unaccompanied asylum seeker's claim about their age is true or false has been a lively topic of discussion in Sweden in recent years. When requests for medical age assessment increased and the Swedish Paediatric Society advised its members not to participate in the then-current methods, the Swedish Government instructed a government authority to develop a new technology for medical age assessment. Since the new technology was implemented in March 2017, its evidence base has been widely debated. A controversy between different experts has played out openly, and the Swedish Migration Agency is under pressure to make efficient, yet evidence-based, decisions before the "actual children" turn 18.
In this presentation I discuss the desires, theories, and beliefs inscribed in the design of this new technology for the medical age assessment of asylum seekers in Sweden. Combining insights from STS with studies of the social construction of children and age demarcations, I deconstruct the design of the technology through an analysis of official documents from the designers and video material from Sweden's largest political venue, where the designers presented the technology. First, I argue that there is a fundamental assumption that the chronological age of 18 is a fixed demarcation of whether an asylum seeker should be treated as a child. Second, neutrality and scientific authority play an important role in legitimating this new technology through the language of quantification.
Paper short abstract:
This talk analyzes the socio-technical assemblage of Frontex screenings, where so-called "irregular migrants" with or without papers are identified, as a technology of sensing and sense-making that fabricates identities "true enough" and "processable" for further institutional sorting.
Paper long abstract:
In this talk I elaborate on truth assessment technologies understood as stable assemblages of techniques, materials, and knowledge, which differ not only in the problems and practices of assessing claims as true or false, but also in the entities they bring together.
While DNA tests let samples speak for the person concerned, proving her claims by transferring the matter into the assemblage of a laboratory, police identification arrangements have to reach a conclusion in the very situation of determining a refugee's country of origin and the like, where the individual is in principle not to be trusted and is capable of pretending, making false statements, and raising objections. While laboratories may derive measurements and estimations from objects as modest witnesses, police identification assemblages need to sense the individual's performance with regard to credibility and to make sense of the content of her statements in terms of truth and lie.
I analyze the socio-technical arrangement of Frontex screenings, where refugees with or without papers are identified, as a technology of sensing and sense-making. Based on ethnographic fieldwork at the Moria hotspot, I examine the practices of sensing and sense-making by studying the interplay of skillful bodies, devices (e.g. screening booklets, questionnaires, Google Maps), and taxonomies. I show how this technology's design oscillates between framing and overflowing, identifying credibility and truth successively by selecting, coding, and channeling statements while monitoring the interviewee's performance. Finally, I discuss the pragmatics of truth assessment within the identification and registration procedure, which circles around "true" and "processable" identities.
Paper short abstract:
This paper explores how the polygraph machine is used in police interrogations in the United States. Drawing on a selection of interrogation transcripts, I show how examiners use a series of techniques to draw out confessions from suspects, and link this to the production of false confessions.
Paper long abstract:
Advocates of the polygraph machine, developed at the turn of the 20th century and now widely used across the United States, have struggled to resolve contestation over the machine's validity and reliability. Nonetheless, police investigations routinely use the device to interrogate suspects, ostensibly to determine the truth of their statements. In seeking to explain the machine's continued use despite repeated scientific challenges to its authority, historians have emphasised the examiner's performance of a policeman-turned-scientist role in extracting confessions and the machine's appearance in the popular imagination as the lie detector. In this paper I add to this picture by analysing interrogation transcripts to describe a series of techniques, which I term 'bleeding', 'clearing' and 'composition', used by examiners and police officers to draw out confessions from suspects. Exploring how these techniques work within the socio-technical arrangement of the polygraph exam, I argue that they regularly risk producing false confessions, revealing a powerful ontological uncertainty in lie detection tests.
Paper short abstract:
Our object of interest is the research field of truth verification. We focus on the reciprocal influence of specific modes of seeing and corresponding sociotechnical interaction when visual methods such as eye tracking are used as tools for truth detection.
Paper long abstract:
In Germany, the field of truth verification is divided between those who engage in the practice of credibility assessment and those who research the characteristics of "deception" and the possibilities of its detection. The latter address this question within the broader research field of 'memory detection'. What the research and the practice share is the collective vision that an appropriate tool is needed to verify truthfulness by measuring a person's behavior or psychophysical reactions. A host of contemporary procedures is in use, such as polygraph tests, eye tracking, EEG, or brain imaging methods.
In this talk, we explore the sociotechnical practice of visibilizing deception in the research field of truth verification. We focus on the reciprocal influence of specific modes of seeing and corresponding sociotechnical interaction at play when visual methods such as eye tracking are used as tools for truth detection. On the basis of our laboratory research, we provide insights into the rationales of visuality, where the reactions and behavior of the test subject are measured and assessed with respect to their "truthfulness". We outline the levels at which the different aspects of seeing, visibilizing, and using visual tools intersect in truth verification technologies that measure psychophysiological responses to visual stimuli. By highlighting the dynamics of this intersection, we discern the various desires and aspirations inscribed in these kinds of research applications and discuss the underlying concepts of deception and truthfulness.