- Convenor:
- Kathleen Richardson (De Montfort University)
- Format:
- Plenary
- Start time:
- 9 June, 2022
- Time zone: Europe/London
- Session slots:
- 1
Short Abstract:
In her pioneering ethnographic work among AI engineers and developers in the late 1980s and early ‘90s, Diana E. Forsythe explored – in terms that might today be considered moderate – the hidden cultural assumptions and gender relations in the field. “Both female bodies and experience,” she wrote, “tend to be treated as inherently anomalous in the male world of high technology”. This anomalous status renders women’s bodies as “hyper-visible” and their experience as invisible to their male colleagues, and, by extension, to the emerging AI systems.
Long Abstract:
Although there has been a great deal of discussion concerning the biases built into algorithms and other AI programmes in the intervening decades, recent headlines suggest little has changed. ‘Is AI sexist?’ asked a Foreign Policy article in 2017... ‘AI can be sexist and racist – it's time to make it fair’, said a 2018 paper in Nature... and, “Only 22 percent of professionals in AI and data science fields are women,” points out a 2021 paper in Stanford Social Innovation Review entitled ‘When good algorithms go sexist’.
It would be comforting to argue that this shows an increase in awareness, and that action is being, or at least will be, taken to build ‘gender-smart’ AI. But it is just as likely to be the case that things are getting worse. AI has advanced much faster than all the efforts to tackle its racist and sexist biases. It is now present in a vast range of everyday interactions, from setting credit scores and running traffic systems to perhaps less well-advertised applications in areas such as sex robots and lethal autonomous weapons systems. Moreover, despite the oft-stated intentions of tech companies, high-profile controversies such as the ousting of Timnit Gebru from her role at Google are cause for serious concern.
In this workshop, participants will be asked to confront the possibility that the “inherently anomalous” treatment of women in AI is indicative of misogynistic structures in its development and use. Steps to address the problem, such as feminist data practices, advocacy for gender-equitable AI and encouraging more women into AI careers, may offer hope – but are they enough? And if not, what are the likely consequences?
Accepted papers:
Session 1

Paper short abstract:
This talk explores how AI is built upon ways of understanding the world, interpreting and analysing data, and formalising intelligence that can produce and reinforce dehumanising attitudes, and may in particular reproduce misogyny in the form of a reductive and instrumental view of women. The design and use of AI, including social media, can exacerbate this tendency to misogyny, which spills over into real-world practices that reduce women to component parts to be manipulated by technoscience.
Paper short abstract:
Alan Turing, intent on birthing his "child machine," objected to Ada Lovelace’s assertion that the Analytical Engine could not "originate anything." He proposed that Lovelace had likely never considered the possibilities of speed and storage when she noted the limits of computation. As the daughter of a poet, however, Lovelace more likely recognized the limits of rule-bound machines and valued a decentralized intelligence rooted in imagination, creativity, and language.
Paper short abstract:
In her classic 1986 text The Creation of Patriarchy, Gerda Lerner argued that Friedrich Engels had gotten the reason for the “world historical defeat of the female sex” wrong.
Paper long abstract:
The motor force was not an increase in productive capacity that made men want to pass a newly-invented store of material goods only to their biological heirs, motivating their sexual monopolization of particular women. Lerner argued that the key moment was the turning by men of women into reproductive factories who churned out slaves. Slavery itself was the key innovation: a new way of organizing human activity that produced not just food surpluses but enabled agricultural and construction projects (of irrigation networks, of early cities) on a hitherto impossible scale. If you had at your disposal an enormous class of people you could force to work harder, and longer, and on fewer calories than ever previously envisioned, why, what couldn’t you accomplish? The patriarchal family was organized not to produce a few heirs (these were a mere byproduct of the new order) but to produce many slaves. Now little read, Lerner’s text offers rich insights for a feminist analysis of contemporary technology. Silicon Valley, AI, and transhumanist projects fashion themselves as daringly forward-looking and put considerable energy into futurist aesthetics depicting luminous circuitry-laden humanoids posed against backdrops of unexplored deep space. Aesthetics aside, they offer hitherto unimaginable levels of labor exploitation and of the erasure of not just women as reproducers of that labor but of women as existing at all. Anthropologists, with their deep empirical knowledge of the full human story, ought more readily to recognize the hoary old patriarchal beast under the flexi-fungible cyborg fancy dress.