Accepted Paper:
Paper short abstract:
This paper argues for a decolonial perspective on the content moderation labor involved in flagging, reviewing, downranking, quarantining and removing the ‘scum’ in the digital pipeline – the vast amounts of violent, toxic and extreme content that circulate online.
Paper long abstract:
This paper argues for a decolonial perspective on the content moderation labor involved in flagging, reviewing, downranking, quarantining and removing the ‘scum’ in the digital pipeline – the vast amounts of violent, toxic and extreme content that circulate online. An influential strand of scholarship has examined how the deployment of artificial intelligence and automation can not only address the sheer volume of online content, match the hectic pace of digital exchange and reduce costs for companies, but also decrease human discretion and emotional labor in the removal of objectionable content. This techno-optimistic nod to the daunting labor of content moderation, however, elides the vast global disparities in content moderation, the complexity of language and audiovisuality in online exchange, and the processes of iteration involving community annotators. Building on interactions with factcheckers during the project “AI4Dignity” (2021-2022), which is building a collaborative process model for extreme speech detection, this paper shows not only that content moderation labor is devalorized within the hierarchies of global corporations but also that the touted technological remedies, appeals to “crowdsourcing” and third-party content moderation arrangements reproduce the colonial extractive logics of digital capitalism.
Digital media, work and inequalities [Media Anthropology Network]
Session 1: Wednesday 27 July, 2022