
Accepted Paper:

Image generation and the human–machine dialectic  
Andrew Lison (University at Buffalo, SUNY)


Short abstract:

Debates over the impact of AI often focus on the job losses it could precipitate. This paper considers the user practice of image-generation intensification and the engineering concept of model collapse to argue that full automation is unlikely, as labor will continue to meet otherwise-unmet human needs.

Long abstract:

The rapid rise of neural-network-based artificial intelligence has led to predictions that various forms of productive human activity will be overtaken by computation. This is especially the case for what autonomist theorists characterize as “immaterial labor,” understood first and foremost as a kind of symbolic manipulation. AI, in its capacity to generate text, images, and sound, is seemingly ideally suited to such tasks. As sociologist Randall Collins has argued, prior to the contemporary resurgence of AI, automation was largely thought of in terms of “blue-collar” work; now, however, it has become a pressing occupational concern for the professional and managerial classes.

This paper considers the potential for automating semio-linguistic work through the lens of labor as a shifting social definition. In this view, labor shares something with the psychoanalytic definition of desire: once fulfilled, the need for it is not sated, but shifted elsewhere. An example of this can be found in the recent trend in which users post the results of their efforts to repeatedly make image generators intensify a conceptual aspect of an image the generator has been asked to depict (e.g., making an image of someone at work depict them working harder). The similarity across such images as they approach extremes of intensity speaks to a limit in machine-generated visual vocabularies that reinforces the need for human contribution. Combined with the concept of “model collapse,” in which systems trained on AI-generated content produce increasingly uncreative results, a world without human work, “immaterial” or otherwise, appears unsustainable.

Traditional Open Panel P256
What is limiting artificial intelligence? STS perspectives on AI boundaries.
Session 1, Tuesday 16 July 2024