
Accepted Paper:

Whitey in the Cloud: How can anthropologists + technologists prevent human bias in AI programming resulting in a dystopian future for racialised communities?  
Toyin Agbetu (University College London)

Paper short abstract:

Does mechanisation inhibit socially just behaviour? This paper suggests that the quest for a transhuman future encourages the abdication of our collective moral responsibilities to AI. Unless ethical coding practices are normalised, processes of racialisation become embedded within technological solutions.

Paper long abstract:

In 2022, the computing power of our mobile devices is several orders of magnitude greater than that of the computer used in the 1969 Apollo 11 space mission. Yet despite huge increases in memory storage and processor clock speeds, the weakest link thwarting a transhuman digi-utopia is the bias of the humans who write the program code, or "algorithms". As many technologists now envisage a future where machines make decisions involving the welfare of lives, as in hospitals and transport, or their ending, as in military drones and automated weaponry, what does this mean for people racialised as subhuman by virtue of not matching ‘default’ profiles? This paper asks anthropologists to explore our role in a world where social behaviour and culture are shaped by human-created algorithms embedded in almost every aspect of our digitally enabled lives. We fantasise about the growth of artificial intelligence in popular culture, but we are nowhere near the realisation of sentient bots. What we do have, nevertheless, are machines capable of moving data around at extraordinary speeds and distances, some for social purposes, others extractive. How do we, as anthropologists, prevent the proliferation of commercially exploitative, anti-social coding that, in being designed to identify and highlight human preferences, often ends up replicating repellent human prejudices? Are our fears about the risks posed by Terminators as far removed from meaningful concerns about social injustice on Earth as when Gil Scott-Heron documented “whitey’s” first walk on the moon?

Panel P06b
AI-assisted technology and the market: critical impacts on human societies
  Session 1, Thursday 9 June 2022, -