Author: Samuel Woolley (University of Washington)
Paper short abstract:
This paper mobilizes STS theories of autonomy, affordance, and the actor in order to build an understanding of the role of social bots in the 2016 US presidential elections.
Paper long abstract:
Bots are collections of code written to undertake automated online tasks that would otherwise be done by a human user. Social bots are bots that communicate directly with human users online, often on sites like Facebook, Twitter and Reddit. Recently, a variety of political actors worldwide have begun using bots in attempts to manipulate public opinion. How might political bots be used to affect communication and behavior associated with elections, political crises and legislation? Three US-based actor groups make up the heart of this multi-site, networked research: political parties/campaigns, public commentators, and civil society groups. I am in the process of working with these groups to develop an understanding of their use of, and interactions with, political bots during the 2016 elections, grounded in conversations around events from the early candidate debates through the final election. Data gathering is multi-faceted: it includes field research methods of interview and participant observation with bot makers and trackers, as well as analysis of large social media data sets. The developing method known as 'ethnography of information', here the analysis of bots and algorithms as semi-autonomous entities, is presented as an arena for theoretical and methodological contribution and growth for STS. In this proposal, I demonstrate that political bots are among the exciting, and problematic, products of advances in computation, automation, and political strategy. This paper analyzes this phenomenon and positions political bots as a crucial element in conversations among several fields tied to STS.
Social Studies of Politics: Making Collectives By All Possible Means