- Convenor:
-
Cari Hyde-Vaamonde
(King’s College London)
- Format:
- Panel
- Sessions:
- Monday 6 June, -
Time zone: Europe/London
Short Abstract:
Involving AI in justice decisions represents a serious challenge, which will impact society as a whole. This panel includes voices from law, technology and the arts in an attempt to break down barriers between disciplines, focusing on the impact of AI on the way in which people experience justice.
Long Abstract:
Laws can seem impersonal, designed to judge on the facts rather than the individual. If true, is judging mechanical - something that can be automated? When AI is used to look at how justice is done by humans, it throws up challenging examples of bias and inconsistency, set against a backdrop of increasing delays in justice. Perhaps it would be wrong not to bring in the machines? To what extent can algorithms help, or is there an ethical barrier to involving them where human life is at stake? In examining these questions, the focus cannot be only on the risk of algorithmic bias or the technical accuracy of the algorithms themselves. Already, the impact of AI on law and legal decisions is far-reaching, leading to deep institutional effects and affecting the personal experience of justice. In an area prone to polarisation, we gather a panel with diverse skills to explore the nuanced issues raised by the integration of AI into law and justice. How helpful is human discretion to justice? Can a common language be developed to realign the aims of the state, the technical expert, the individual and society?
In 2020 a distressed defendant's sobs were simply 'muted' in an online hearing by a UK judge, while in Singapore a man was sentenced to death over a Zoom call (Gibbs 2020; Ratcliffe 2020). Is anything less than in-person justice, in-human?
Accepted papers:
Session 1 Monday 6 June, 2022, -
Paper short abstract:
As a visual artist working with emerging technology I'm interested in how we engage with algorithms in our everyday lives, the benefits and the pitfalls.
Paper long abstract:
There is a mystery surrounding the algorithms that we engage with on a day-to-day basis, from social media to the judicial system. How can the mechanics of these systems be revealed through gaming technology and video art works?
Paper short abstract:
If a driver swerves to miss a pedestrian, who is responsible if the computer control of the cornering radius does not allow the vehicle to use the utmost potential of the tyre traction? What legislation governs the control system? Can grace, anthropology and engineering mesh to manage these issues?
Paper long abstract:
James Clerk Maxwell died in 1879. In his 48 years he uncovered the phenomenon of electromagnetism and set in place the mathematics of closed-loop feedback control systems. Ninety years later, engineering students fed punch-cards coded with “if statements and conditional do loops” into enormous computing machines, out of which came lists with data segregated into categories. That was not machine learning. An intelligent machine senses data and segregates it in a manner not predetermined by the programmer, but correlated within the machine itself with the most significant variables being sensed. Thus a vehicle's cornering radius may come to be adjusted not according to a fixed algorithm but by an algorithm developed within the vehicle’s own computer. Such implications find resonance in the work of Dame Mary Douglas on risk, danger, blame and culture, Edwards Deming’s concept of profound knowledge, and Daniel Kahneman’s revelation of human overeagerness to simplify without understanding. Regulations surrounding machine learning must heed erstwhile Chief Justice of England Lord Hewart's warning of “The New Despotism”, in which the executive holds the legislature in its thrall and arranges the passing of the Acts it administers, thereby shielding itself from the judiciary. Technical complexity makes this all too easy and the legislature all too gullible, as guidelines and standards morph into regulations that become treated as immutable maxims. This paper explores Hewart’s despotism as a core anthropological concept and the form of grace required to hold on track Edmund Leach’s and Anthony Giddens' runaway worlds.
Paper short abstract:
Pressure on courts means reform is inevitable, but if new technology is introduced without both a common language and shared goals between tech and other disciplines, the public's sense of legitimacy in the system is at risk. This paper explores the impact, and how collaboration might be achieved.
Paper long abstract:
Many legal systems are on the brink of a crisis of confidence. Pressure on courts is unprecedented, and delays mean that justice is too often not being served.
AI/algorithmic tools exist that may assist in improving processes, and legal frameworks to allow these methods are being put in place, but public trust in human-computer interaction in this context is under-explored. Arguments regarding bias, or technical metrics of accuracy, only go so far in addressing the issues. The legitimacy of the system itself risks being undermined if reforms take place without public confidence being maintained.
Exploring how methods from the social sciences can be used to build mathematically based models in a collaborative way, this paper focuses on perceptions of legitimacy, acknowledging that there are serious questions regarding the legitimacy of human-only judicial decision-making that cannot be ignored.
Can we develop a common language that realigns the objectives of state, implementer, public and participant towards congruence and enables an informed dialogue?