Accepted Paper:
Paper short abstract:
This work examines social media content moderation during the May 2021 pro-Palestinian activism around Sheikh Jarrah. Findings reveal that opaque automated systems produced perceived censorship, complicating harm substantiation and redress, and raising concerns about power dynamics in digital spaces.
Paper long abstract:
Social media platforms, while influential tools for human rights activism, free speech, and mobilization, also bear the influence of corporate ownership and commercial interests. This dual character can produce clashing interests in the operations of these platforms. This study centers on the May 2021 Sheikh Jarrah events in East Jerusalem, a focal point in the Israeli-Palestinian conflict that garnered global attention. During this period, Palestinian activists and their allies observed and encountered a notable increase in automated content moderation actions, such as shadow banning and content removal. We surveyed 201 users who faced content moderation and conducted 12 interviews with political influencers to assess the impact of these practices on activism. Our analysis centers on automated content moderation and transparency, investigating how users and activists perceive the content moderation systems employed by social media platforms, and the opacity of those systems. Findings reveal that pro-Palestinian activists perceived censorship stemming from opaque and obfuscated technological mechanisms of content demotion, which complicated the substantiation of harm and left them without effective redress mechanisms. We view this difficulty as part of algorithmic harms in the realm of automated content moderation. This dynamic has far-reaching implications for the future of activism and raises questions about power centralization in digital spaces.
Live from the frontlines: the mediation of armed conflict through online platforms
Session 1, Wednesday 17 July 2024