Accepted Paper:
Short abstract:
This paper contributes to the ongoing discussion of how solvable AI ethics interrelates with situated social work ethics, drawing on a case in which a fair algorithmic model is employed to support the voluntary counselling of vulnerable children in a Scandinavian NGO.
Long abstract:
Can we design a fair AI model that is precise enough to identify and distinguish the causes of the social problems experienced by individuals? This is the question a Scandinavian NGO set out to answer when it began collaborating with data scientists from a tech firm to develop an algorithm to assist its voluntary counsellors in online communications with vulnerable children. However, what seemed fair in the hands of the developers turned out to (sometimes) produce unfair outcomes in the hands of the voluntary counsellors. With this paper, we contribute to the ongoing discussion of how practices develop as they are confronted with computational problem-solving (Lin & Jackson, 2023; Ruckenstein, 2023). Rather than judging what was wrong with the “fair algorithm”, we take it as an opportunity to investigate what happens when data ethics and social work ethics interrelate in practices of employing AI tools for the good of society. Drawing on ethnographic fieldwork, we trace the ethical frictions produced by the algorithm as it is translated (Latour, 2005) from an ethically fair model that solves problems of biased counselling and timeliness into an ethically unfair model that hides the importance of situated matters such as religion and the slowness of conversations. Somewhat to our surprise, these ethical frictions became constructive sites for advancing a new vocabulary for a relational ethics, through which the goodness of the fair model was continuously questioned and improved.
STS, AI Experiments, and the social good
Session 2 Thursday 18 July, 2024, -