- Convenors:
- Yathukulan Yogarajah (University College London)
- Harry Rodgers (University of Leicester)
- Toby Austin Locke (University College London)
- Elena Liber (University College London)
- Formats:
- Panel
- Mode:
- Face-to-face
- Location:
- Facultat de Filologia Aula 2.3
- Sessions:
- Tuesday 23 July, -, -
Time zone: Europe/Madrid
Short Abstract:
This panel will explore how social media algorithms are shaping the world around us, our social relationships, and our understanding of ourselves. What can an anthropological focus on social media algorithms say about what it means to be human in the digital age?
Long Abstract:
The most recent iterations of social media platforms, including TikTok, Kuaishou, and Douyin, wield unprecedented influence. With an estimated 800 million people using TikTok, many of whom spend several hours a day entranced by it and other social media apps, it is not surprising that these platforms have been described as shaping politics, our selves, economics, social relationships, and, more generally, how we think. Many social media users, media scholars, and journalists have identified ‘the algorithm’ as playing a critical role in this. Intriguingly, social media users are even articulating an interpersonal connection with ‘the algorithm’ that governs their online experience.
While numerous scholars have undertaken efforts to comprehend the intricate ways in which social media algorithms shape our world, a critical gap in this research, potentially stemming from methodological challenges, lies in the absence of an ethnographic approach. Against this backdrop, this panel invites papers that theoretically, ethnographically, and/or methodologically explore how social media algorithms are shaping how we form social relations, how politics and economics are enacted, and how we come to understand the world around us.
Through ethnographic exploration we will delve into the profoundly anthropological question of what it means to be human in an era where individuals increasingly describe themselves as being in a relationship with ‘their algorithm’.
Accepted papers:
Session 1 Tuesday 23 July, 2024, -
Paper short abstract:
This study investigates how embodied experience both shapes and is shaped by audiovisual encounters resulting from collaborating with social media feeds. We inquire into how filters remediate the representation of bodies and embodied experience by affording new ways of seeing and relating to one’s own body.
Paper long abstract:
Social media platforms, such as Instagram and TikTok, provide users with a variety of filters that allow them to modify their appearance, thus reinforcing and perpetuating idealized notions of beauty. This research investigates how embodied experience both shapes and is shaped by (audio-)visual encounters resulting from collaborating with algorithmic systems. Employing a collaborative ethnographic approach, we inquire into how filters remediate the representation of bodies and embodied experience by affording new ways of seeing and relating to one’s own body. In our ongoing project we collaborate with student assistants as both researchers and participants, thus experimenting with (auto-)ethnographic methods, such as participant observation and keeping and analyzing field notes as well as media diaries. The collaboration with student assistants enables us to better understand how ever-evolving networks of digital practices and algorithms co-curate image feeds and produce intersubjectivity. A collaborative approach adapts to the highly dynamic and personalized experience of social media, leverages the students’ familiarity with digital practices as social media’s key audience, and facilitates an environment of continuous learning with and from each other. By exploring the intricate entanglement of embodied experience, representation of bodies, and algorithmic systems, this study aims to provide insights into how these elements coalesce to shape notions of beauty, offering a valuable contribution to anthropological discourses about how technology and AI-based algorithms affect everyday lives.
Paper short abstract:
The paper explores the algorithmic imaginaries employed by two different groups of platform workers (content creators and delivery workers) and examines their strategies for negotiating with the algorithm in their everyday working lives.
Paper long abstract:
Platforms are ubiquitous – we work on and through platforms, communicate via platforms, and shop, order food, or find therapists via various platforms. The term “platform economy” thus serves as an umbrella term for the various types of labour facilitated via these platforms (Berg, 2018). Increasingly, these platforms are algorithmically ordered, which puts workers in a position where they have to negotiate their working conditions directly with or via the algorithms, leading to phenomena such as algorithmic precarity (Chan, 2022) or algorithmic insecurity (Wood & Lehdonvirta, 2021).
The proposed paper is based on 18 months of ethnographic fieldwork, including participant observation and semi-structured interviews with two different types of platform workers – content creators on social media, and drivers and delivery workers. While both groups depend on algorithms in their everyday working lives, they are rarely compared or studied together. In my research, however, I have found that the experiences of content creators and of drivers and delivery people with algorithmically mediated work often overlap. Specifically, I have found three distinctive ways of negotiating with the algorithm: 1) learning about the algorithm by interacting with it, which includes different tests and strategies to find out what the algorithm will do in different situations; 2) self-optimizing and disciplining oneself in order to be understood by the algorithm; and 3) engaging in cross-platform labour in order to “stay ahead” of the algorithm. Theoretically, the paper draws on the concept of algorithmic imaginaries (Bucher, 2016).
Paper short abstract:
Drawing on ethnographic fieldwork with ADHD communities and anthropological theories of algorithms as traps, this paper will explore the transformative self-making processes which emerge as people come to identify with ADHD through content recommendation algorithms.
Paper long abstract:
TikTok has been identified as playing a significant role in the increased prevalence of ADHD, with #ADHD being one of the most popular mental health-related hashtags on the platform (Zenone et al. 2021). Public and clinical health literature has considered this growth in self-identification with ADHD through the lens of ‘cyberchondria’ – the development of symptoms or health anxiety through consumption of digital content. In this paper I will make use of anthropological theories of content recommendation algorithms as traps (Seaver 2019), and ethnographies of trapping (Willerslev 2017), in conversation with original ethnographic material to examine self-identification with ADHD as a psychosocial infrastructuring and ethical project which unfolds through entanglement with digital technologies. Content-recommendation-driven social media here forms a component of a broader environmental or infrastructural trap, creating a particular ecology of attention within which the psychiatric category of ADHD has widespread resonance. Through initial captivation in digital environmental traps, people come to identify with ADHD as a transformative ethical project of self-making which allows them to make sense of and exert agency over their worlds with and through entanglement with content recommendation algorithms. ADHD here takes the position of a transformative ethical project and technique of self entangled with digital technologies and content recommendation algorithms.
Paper short abstract:
From physical recording equipment to online algorithms, the technologies involved in ASMR content can be found to co-constitute an affectual, intimate and distinctly queer relationship with the humans who both produce and consume it.
Paper long abstract:
In the production and consumption of ASMR content online, the chain of relationships between content creator, recording equipment, algorithm and viewer could be considered as linear, or one-directional. This understanding, however, relies on a conception of technology as merely an instrument, utilised by and separate from human bodies. In this paper, I demonstrate that ASMR technologies are instead actively involved in both affectual and emotional formations of relationality.
Auditory, visual and haptic stimuli within ASMR videos provoke a sensation named by the community as ‘tingles’. This paper proposes these ‘tingles’ to be an affectual impression on the body which orientates those involved towards pleasure and connectivity. In this paper the term ‘bodies’ is taken in its broadest possible sense to include the algorithmic and digital technologies used to promote closeness. By considering the algorithm as itself a ‘body’, with the ability to both affect and be affected, one can destabilise conceptions of who can contribute towards sociality and embodied pleasure.
In looking to the emotional closeness and trust which govern users' exchanges with the algorithm, I establish that each relationship between human and technology within the ASMR community is intimate and interdependent. In doing so, I bring attention to and challenge the discursive and symbolic boundaries which project polarities between humans and machines. In this article I construct a reparative analysis of human relationships with technology, in an attempt to make imaginable collective and queer formations of intimacy which exist beyond offline closeness, touch and heteronormative monogamy.
Paper short abstract:
Algorithms subtly shape the way we interact and form relationships in different digital environments. To understand how these black boxes influence our social interactions, we must first consider how they shape the understanding and presentation of our mediated selves.
Paper long abstract:
Mobile dating apps (MDAs) are smartphone applications that bring people together for romantic, sexual, and friendly meetups. With their primary focus on dating and intimate connections, MDAs constitute a fascinating window on discourses around gender, sexuality, and intimacy. The different categorization features within these apps serve as clues that users utilize to shape and understand their own identities, as well as to present themselves to other daters. At the same time, these features serve as data collection strategies for mobile dating platforms, which use, analyze, and sell this data for platform functioning on the one hand and corporate interests on the other. In the complex interplay between platforms, users, and algorithms, gender and sexuality data play a central role in the recommendation of profiles to users. However, the inner workings of these algorithms are often shrouded in secrecy, making it challenging to understand how they influence users’ experience. The paper examines how MDAs construct gender and sexual identity for a user registering a new profile, and discusses the implications for gender, sexuality, and intimacy, drawing from research using the walkthrough method, an ethnographic approach to the architecture and mechanisms of digital environments, and a diary study with Italians of diverse gender and sexual identities.
Paper short abstract:
This paper will explore, from a multi-species perspective, people's engagement with their social media algorithm.
Paper long abstract:
There is an important kind of relation emerging across the globe that has received scant attention – a relationship with ‘the algorithm’. This is not a relationship between the human viewer and the content viewed through social media, Netflix, or other online platforms. It is a relationship between the human user and what they perceive to be the forces that organise the content that appears on their screens – what my interlocutors identify as ‘their algorithm’ or simply ‘the algorithm’. This paper thinks through this relationship by engaging with literature on encounter and multispecies relations. It sets up, and makes use of, a series of encounters to initiate analytical movements, methodologies, and patterns of thought aimed at opening up new ways of thinking about social media in the age of personalised algorithms.
Paper short abstract:
The impact of social media on children has proven a challenge to manage both in schools and at home in the family. This paper shows how rural local communities in the southwest of Ireland strategized to mitigate the negative impacts of social media use amongst their children.
Paper long abstract:
The impact of social media on children has proved a challenge to manage both in schools and at home in the family. This paper shows how local communities strategized to mitigate the negative impacts of social media use amongst their children. The strategy centers on a small primary school, and its success was widely reported in the national media, yet the mainstream news did not tell the full story. The affordances provided by social media were locally used to develop a community anew in order to mitigate the ill-effects on the upcoming generation, thereby addressing – without demonizing – the effects of sociotechnical processes. This case is useful in showcasing how ethnographic work can illuminate the impact of the algorithm in an indirect fashion, casting light on the negative image generated by obfuscated online actions, themselves influenced by unknown algorithms directing human attention and influencing social relations.
Paper short abstract:
Drawing on long-term ethnographic fieldwork with anti-government activists in Sri Lanka, the "Folklore Affordance Framework" demonstrates that, to better comprehend social media algorithms, it is time to see their technological affordances as a repertoire of algorithmic folklore.
Paper long abstract:
For fifteen years, the correlation between a social media platform’s design architecture and the social outcomes it offers has been framed in the language of affordances (boyd, 2010; Hopkins, 2019), and platforms are broadly recognised as affording outcomes relational to user subjectivity (Miller & Madianou, 2013; Costa, 2018). However, affordance studies have yet to make sense of how algorithms shape the social world because algorithms are notoriously opaque, raising questions about how to approach them ethnographically.
In this article, I argue that, to better comprehend social media algorithms, it is time to see their technological affordances as a repertoire of algorithmic folklore (Bishop, 2019). Algorithmic folklore refers to the “beliefs and narratives [about social media algorithms] that are passed on informally and can exist in tension with official accounts” (Savolainen, 2022, 1092). Ethnographically, I draw on my long-term fieldwork with anti-government activists in Sri Lanka, and the tensions that emerge around how activists try to “game the algorithm” and maximise their visibility whilst maintaining their credibility and safety.
I introduce the Folklore Affordance Framework to identify the functionalities of discrete platform design features, and to recognise how affordances operate in (and out of) practice via a collection of folkloric responses to the opaqueness of platform operations. The paper will demonstrate how folkloric affordances are not only a useful heuristic device for analysing the gaps between what platforms afford and what users think platforms afford, but also how affordance folklore materialises as actionable outcomes against “black-box gaslighting” (Cotter, 2023).
Paper short abstract:
This paper investigates how social media platforms’ post-FOSTA/SESTA moderation and regulation of sexual content impact users’ online and offline behaviours, with a special focus on sex workers and pole dancers.
Paper long abstract:
Media platforms encourage sharing practices and create an alternative economy where creators can profit from selling services as much as access to their private lives. Following the 2018 FOSTA/SESTA bill, platforms have implemented new and more pervasive algorithmic surveillance mechanisms and automated moderation systems that allow them to flag, invisibilize, and remove content. However, because algorithms are not trained to understand context, platforms often issue warnings about content that does not actually violate their Terms of Service while failing to moderate dangerous material or abusive user interactions. Under the guise of complying with legislation and keeping communities safe, platform governance has become even more opaque, leaving many users to feel powerless against the constant threat of silencing and deplatforming.
In my work, I explore how, within the post-FOSTA/SESTA digital landscape, platforms’ attempt to curb sexual content has radically changed the ways in which people interact with social media. Specifically, considering how platforms’ crusades against nudity and sexual activity target both sex workers and pole dancers in the same way, I look at how these groups merge algorithmic folklore and human creativity to test, implement, and globalise new media strategies to adapt their lives to algorithms (and survive them). Drawing from Foucault (1988), I define these as the Technologies of the FOSTA/SESTA self, a complex mix of self-censorship, code-cracking, and semantic shifts that promise to improve online visibility, escape algorithmic surveillance, and obtain digital immortality.