
Accepted Paper

STS and Social Design Perspectives on Digital Content Moderation - Countering Hostility by Design   
Anna Antonakis (Bern University of Applied Sciences)


Paper short abstract

Scholarship has shown how profit-driven algorithms amplify anger and division, as these emotions drive engagement and affect on social media platforms. The paper explores ways to counter this hostility by design through different kinds of content moderation, from regulation to digital literacy.

Paper long abstract

The design of systems that structure information and communication infrastructure can be considered a major challenge of the 21st century. Platform designs—interfaces, recommendation and ranking mechanisms, policies, content moderation mechanisms, and links to paid partnerships—are not neutral but manifest platform politics and economies. A growing body of literature from science and technology studies, feminist and free speech advocates, activists and practitioners helps to map the field of industrial content moderation systems, composed of Algorithmic Moderation Systems (AMS) and human moderators. We are only beginning to grasp these systems' impact on conflict dynamics, social cohesion and democratic practices, as well as on human moderators' mental health. Noble's groundbreaking analysis in "Algorithms of Oppression" demonstrates bias in technology, stating that "racism and sexism are part of the architecture of the language of technology" (Noble, 2018). A growing number of authors focus on the reproduction and deepening of racist, classist, and sexist power relations through platform design and Algorithmic Moderation Systems, expressed in concepts such as "automating inequality" (Eubanks, 2017), "platformed racism" (Matamoros Fernandez, 2017) and "algorithmic misogynoir" (Marshall, 2021, citing Moya Bailey). At the same time, studies on hostile working conditions for data workers and human moderators are growing.

Bringing into dialogue feminist science and technology studies, social design and content moderation studies, the paper explores ways to counter this hostility by design through different approaches to content moderation. I distinguish between internal (platform policies), external (national and supranational policy frameworks) and cultural/educational approaches.

Traditional Open Panel P261
Hostility by design?
  Session 1