- Convenors:
  - Annalisa Pelizza (University of Bologna and University of Aarhus)
  - Francesca Musiani (CNRS - Centre national de la recherche scientifique)
  - Stefania Milan (University of Amsterdam)
- Stream: Tracks
- Track: 114
- Sessions: Thursday 1 September, Friday 2 September
- Time zone: Europe/Madrid
Short Abstract:
"Governance by technology", "governance by design", "politics of technology", "de-facto governance": this panel investigates such conceptualizations as attempts to frame the need to uncover regimes of inclusion/exclusion nested in the technical details of information infrastructures.
Long Abstract:
"Governance by technology", "governance by design", "politics of technology", "de-facto governance": these conceptualizations constitute an attempt to frame the need to uncover regimes of inclusion/exclusion nested in technical details.
From Lessig's formulation of "code as law" to DeNardis' "protocol politics", from Bowker and Star's "infrastructural inversion" to Galloway's protocol-based "virtual bureaucracies", information infrastructures have been recognized as a privileged locus for making the material embeddedness of political arrangements visible. Yet the governing materiality of information infrastructures is often bypassed in recent accounts of "information flows", the anti-hermeneutics of "big data", and the possibilities of "cloud computing".
This track has two main goals. Firstly, it aims to further the field of governance by information technologies by providing a compendium of digital artefacts that underpin the creation of new actors. How do algorithms contribute to changing governance dynamics? How do web services help stabilize emerging actors? How do internet protocols represent de-facto governance?
Secondly, the track seeks to investigate how the notion of "governance" can tackle current developments in the fields of cloud computing, cybersecurity and internet governance/policymaking. How can the governance of innovation and technological change be apprehended? Is "governance" a trait proper to a given artifact in its stabilized form, or does it evolve along with it? To what extent can we enlarge the notion of governance to uses, practices, informal communities, without it becoming too much of an all-encompassing notion? How is the public/private dichotomy reconfigured as a result of these developments?
SESSIONS: 4/5/5/5/4
Accepted papers:
Session 1 Thursday 1 September, 2016

Paper short abstract:
The paper aims to account for actual forms of “algorhythmic governance” in cities looking at cases and practices of configuring, deploying and retrieving data from sensing devices for sound and air quality monitoring in Dublin.
Paper long abstract:
The paper aims to account for actual forms of "algorhythmic governance" in cities, understood as the shaping of urban temporality through digital infrastructures in order to order urban life. Looking at cases and practices of configuring, deploying and retrieving data from sensing devices for sound and air quality monitoring in Dublin, the study will explore how the rhythm of the city is regulated and tuned in order to enact specific forms of governance. In particular, attention will be directed to the frequency rate of data capture as a crucial aspect in making sensing devices accountable for urban management: on the one hand, producing and keeping constant the heartbeat of the city makes it possible to generate predictable models for managing urban settings and to act upon them; on the other hand, setting the frequency and the right measure requires continuous adjustments and balancing depending on the historical and situated dimensions of city life, related for example to mutable mobility and planning aspects. In order to be effective, governance needs to combine the interconnected and multifarious rhythms and measures at play. Nonetheless, setting the rhythm draws important distinctions between what is noise and what is signal, what is relevant for governance and what is not, what can be predicted and included and what cannot. In emphasizing the role of rhythms in urban governance, the study intends to critically address the debate on anticipatory governance and speculative design by considering the multiple, coexisting and conflicting space-time dimensions of the city.
Paper short abstract:
Pursuing ‘total information awareness,’ state security agencies have given meta-data a prominent and problematic role. Drawing on Snowden documents, we find that the de facto blurring between meta-data and content calls into question the innocence of meta-data as a form of sociotechnical governance.
Paper long abstract:
This paper considers government surveillance as a site of science and technology by other means. Pursuing the allure of 'total information awareness,' state security agencies within the so-called Five Eyes countries have turned to 'big data' techniques. As one consequence, meta-data is given a newly prominent and problematic role. Meta-data holds advantages over the more conventional reliance on message 'content', which requires human analysis; in particular, its amenability to automated algorithmic analysis lowers the legal barriers to its exploitation.
Drawing primarily on the 500+ published documents released by Edward Snowden, this paper examines the various interpretations that Five Eyes agencies give to meta-data, focusing on the socio-material aspects of communication interception, algorithmic analysis and decision-making. We show that meta-data includes far more than merely the 'outside of the envelope' information often claimed in media reporting. When considered en masse, meta-data can be at least as revealing of sensitive personal attributes as the content of communications. Under conditions of claimed security threat and institutional over-reach, such detailed analysis fuels dangerous illusions of reliable knowledge of an individual's beliefs, affiliations and intentions, as reflected in US security agency chiefs proudly claiming in 2014, "we kill people based on metadata."
This blurring of meta-data and content data also undermines the traditional legal distinction between them upon which privacy norms are based. A significant governance implication is that in the context of massive meta-data analysis, this distinction is anachronistic, and that meta-data derived from personal communication activities should be granted the same data protection rights accorded to 'personal information.'
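The inferential power of aggregated meta-data that the paper describes can be sketched in a few lines (synthetic data and hypothetical names, not drawn from the Snowden documents): even without any message content, call records alone reveal sensitive patterns of association.

```python
from collections import Counter
from datetime import datetime

# Hedged illustration with invented records: each tuple is only
# "outside of the envelope" information (caller, callee, timestamp).
call_records = [
    ("alice", "helpline", datetime(2016, 3, 1, 2, 14)),
    ("alice", "helpline", datetime(2016, 3, 2, 1, 58)),
    ("alice", "bob",      datetime(2016, 3, 2, 9, 30)),
    ("alice", "helpline", datetime(2016, 3, 5, 3, 2)),
]

# Whom does "alice" contact most often, and at what hours?
contacts = Counter(callee for caller, callee, _ in call_records
                   if caller == "alice")
night_calls = [ts for caller, callee, ts in call_records
               if caller == "alice" and ts.hour < 6]

top_contact, frequency = contacts.most_common(1)[0]
print(top_contact, frequency, len(night_calls))  # → helpline 3 3
# Repeated late-night calls to a helpline are exactly the kind of
# sensitive inference the paper argues meta-data makes possible en masse.
```

The point of the sketch is that no 'content' is ever read: the inference falls out of trivial counting, which is why the paper argues the legal meta-data/content distinction is anachronistic.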
Paper short abstract:
The protocol Transport Layer Security (TLS) enables secure communication on the Internet. It uses an arrangement of certificate authorities to validate client/server communication. This talk focuses on how TLS is used in practice and carves out the protocol's materiality.
Paper long abstract:
The SSL/TLS protocol plays an important role in the Internet's security arrangement, which is an important part of Internet governance (DeNardis 2014). Internet users rely on this protocol every day for e-commerce, e-mail, data transfer etc. Nowadays, SSL/TLS is employed in practically any case in which a secure and reliable client/server connection is needed, and it becomes ever more important as the demand for secure data transfer rises.
SSL/TLS uses certificates based on asymmetric encryption and the concept of public key infrastructures (PKI) to validate connections and communication partners. To make this process actually work in practice, a rather complex arrangement has emerged: on the one hand, (semi-)private and state actors who act as certificate authorities and vouch for the identity of servers/websites; on the other, users (browser vendors, app producers, e-mail providers etc.) who decide whether or not to accept certificates. At its core, this arrangement organizes trust based on the materiality of the protocol.
The talk uses the concept of materiality to carve out governance processes here in two ways. Firstly, it examines how the materiality of the protocol sets the space in which actors appropriate SSL/TLS, while the protocol itself is defined by technological possibilities and constraints inscribed via standardization documents (e.g. RFCs) and technological concepts (e.g. PKI). Secondly, it shows how the arrangement of certificate authorities and organized users could in practice be understood as an information infrastructure that materializes, and therefore makes visible, the governance of secure Internet connections on the web.
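The delegation of trust the abstract describes is visible in a few lines of Python (a sketch, not part of the paper): the standard-library `ssl` module silently accepts whatever certificate authorities the local platform has decided to trust, which is precisely the CA arrangement at stake.

```python
import socket
import ssl

# The default context embodies the trust arrangement: it loads the
# certificate authorities bundled with the operating system or Python
# distribution, and requires servers to present a chain signed by one
# of them. The user (here, the programmer) never chooses those CAs.
context = ssl.create_default_context()
print("Hostname checking:", context.check_hostname)
print("Verification mode:", context.verify_mode)

def fetch_peer_certificate(hostname: str, port: int = 443) -> dict:
    """Connect to a server (hostname is illustrative) and return its
    certificate metadata; the handshake fails unless the chain
    validates against the trusted authorities loaded above."""
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()
```

That the defaults enforce `CERT_REQUIRED` and hostname checking, while the CA list itself is supplied by browser and OS vendors, is a small concrete instance of the standardization documents and vendor decisions the talk treats as the protocol's materiality.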
Paper short abstract:
Using insights from technoscience and software studies, I want to show how the underlying logic of data mining algorithms and databases fosters the production of possible future targets for data-driven drone warfare.
Paper long abstract:
Targeting individuals has become increasingly institutionalized in the US global war on terror. Based on big data analytics and NoSQL databases, targets are selected and kill lists such as the 'disposition matrix' are produced. I seek here to develop a material perspective focusing on the neglected non-human world of data mining algorithms and databases, which increasingly standardize and sort our world. Using insights from technoscience and software studies, I will show how the underlying logic of data mining algorithms and databases fosters the production of ever more data and of possible future targets for a data-driven killing apparatus. In this process, human and non-human decision-making processes are intimately intertwined, whereby the messy targeting process becomes even more opaque and less traceable. This is not least the result of a technorationality with an open-ended search heuristic, in which narrative and causality are substituted with (cor)relation and recombination, a logic which advances a possibilistic, preemptive culture of technosecurity.
Paper short abstract:
In my presentation I examine the newness, or distinctive traits, of online-based collectives as an interplay between technological infrastructures and social processes of coordination and institutionalization.
Paper long abstract:
In my presentation I examine the newness, or distinctive traits, of online-based collectives (masses, crowds, communities of interest, and social movements), which I identify as the strong and hitherto non-existent interplay between the technological infrastructures these collectives are embedded in and the social processes of coordination and institutionalization they must engage in in order to maintain their viability over time. As inadequate as it may be to conceptualize and analyze such web-based collective formations exclusively with social categories, it would be just as problematic to aggrandize technology or technical infrastructures into the main and overriding factors of collective behavior and action on the Internet. This is because the very technological foundations on which collective actions take place reveal themselves to be genuine social processes, be it as new general offerings and infrastructures developed by the leading Internet companies or as independently operated platforms created and further developed in the context of communities or social movements. None of the web platforms on which people communicate, organize, work and mobilize is merely a technological offering that users can utilize or redefine as they please. Instead, social structuring patterns are already embedded in the platform technologies themselves. All technical specifications, not only those of commercial corporations but also those created by communities or movements, have rules, standards and action guidelines incorporated into them that influence the group's activities in a manner similar to social institutions and that (co)structure the actions of their users in often very rigid ways.
Paper short abstract:
By addressing infrastructure as a form of discourse, we analyse the ideological construction of privacy and anonymity in the interface and infrastructure of the I2P network and explore how the technical activities of hacker communities challenge dominant narratives regarding online privacy issues.
Paper long abstract:
This paper is interested in the ideological implications of technological resistance to online surveillance. It draws on software and surveillance studies literature that sheds light on the globalized and ubiquitous state of surveillance (Andrejevic, 2013) permitted by the Internet and digital technologies, and on the ways in which the development of their architecture and design played a crucial role in the normalization of surveillance practices operated by corporate and state actors (Ball et al., 2012). We understand the infrastructure of the Internet and digital technologies as a central component of a governance apparatus in which the protection of privacy is in conflict with the idea that the transparency of individuals is a vector of better public security and increased market efficiency. In this perspective, we are interested in how recursive publics (Kelty 2005, 2009) make sense of the current politics of infrastructure and, more importantly, how they contest them through technology. More specifically, and through the case study of the I2P network, our main objective is to explore how alternative and encrypted networks provide a direct form of resistance to online surveillance, and how they potentially promote alternative narratives regarding the broader notions of privacy and anonymity. By addressing I2P's infrastructure and interface as a form of discourse (Galloway 2004), we aim to study its underlying ideological construction of privacy and anonymity in order to see how it challenges current liberal narratives where privacy and security are tethered as a binary opposition between individual liberty and public interest.
Paper short abstract:
In this paper, I draw upon data I gathered during ethnographic research to explain the involvement of corporate actors in popular FOSS projects. I argue that high knowledge requirements create barriers that substitute for ownership and allow governance to be practiced by a narrow group of actors.
Paper long abstract:
The Free and Open Source Software (FOSS) movement can be characterized as revolving around the central value associating source code with freedoms related to speech rather than property. The movement is seemingly at odds with the current modes of capitalist production. However, we can observe intensive involvement of corporate actors in popular FOSS projects. This paper aims to explain the coexistence of such contradictory tendencies.
I draw upon data I gathered during ethnographic research in a FOSS project and attempt to foreground the backstage elements of work practice, that is, to make an infrastructural inversion. I describe programming as a practice that, through compiled programs, assembles and delegates action in a durable form to multiple places: users' computers. In this process, software tools serve to translate unpredictable flows of work into standardized units, delegate them to public places and make them connectable, i.e. easily includable into other compositions.
While such infrastructure greatly reduces the transaction costs of software development, it also places high knowledge requirements on potential contributors. Thus, although the licensing typical for FOSS projects systematically suspends the rights traditionally associated with ownership, those rights are actually practiced by a narrow group of actors who hold specific types of knowledge. As a result, there seems to be a close relationship between ownership and knowledge. Such a form of "practical ownership" allows the most involved actors to steer the direction of software development, that is, to practice governance of a FOSS project to the extent that it allows them to generate profit.
Paper short abstract:
Wikipedians rely on software agents to govern the ‘anyone can edit’ encyclopedia project, in the absence of more formal and traditional organizational structures. Lessons from Wikipedia’s bots speak to debates about how algorithms are being delegated governance work in sites of cultural production.
Paper long abstract:
I present findings from a multi-year ethnographic study of automated software agents in Wikipedia. "Bots" and bot developers are a core part of the volunteer community that curates one of the world's largest and most popular information resources. The Wikipedian community relies on hundreds of independently run bots to monitor and regulate almost all aspects of the site. Bots are delegated a wide variety of organizational and administrative work, including: patrolling for spam, 'vandalism', and 'edit wars'; standardizing grammar, layout, citations, and units; updating articles using public datasets; and identifying more complicated work and distributing those tasks to humans.
In my infrastructural inversion (Bowker & Star 1999), I argue Wikipedia can only appear to be governed by an economistic "wisdom of crowds" if the work delegated to bots remains invisible. These bots have long been a core way in which Wikipedians govern the 'anyone can edit' project in the absence of more formal organizational structures. Wikipedians also work out fundamental disagreements about what the encyclopedia and the community ought to look like by, in part, debating about how bots ought to be delegated governance work. For example, one of the more consistently raised (and rejected) proposals on the English Wikipedia is a bot that would make all articles conform to a single national variety of English.
Lessons from Wikipedia's bots speak to many debates about how algorithmic agents are being incorporated into sites of cultural production, drawing our focus to the governance work that is delegated to automated information infrastructures.
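The standardization work the abstract describes as delegated to bots can be sketched as follows (hypothetical rules, not an actual Wikipedia bot): each bot encodes a contested editorial norm as a mechanical rewrite of wikitext.

```python
import re

# Toy standardization rules of the kind a cleanup bot might apply.
# Both patterns are illustrative assumptions, not real bot policy:
STYLE_RULES = [
    (re.compile(r"\b(\d+)\s*kms\b"), r"\1 km"),     # unit standardization
    (re.compile(r"\[\[(\w+)\|\1\]\]"), r"[[\1]]"),  # redundant wikilink piping
]

def bot_edit(wikitext: str) -> str:
    """Apply each standardization rule in turn, as a cleanup bot would."""
    for pattern, replacement in STYLE_RULES:
        wikitext = pattern.sub(replacement, wikitext)
    return wikitext

print(bot_edit("The road runs 12 kms past [[Dublin|Dublin]]."))
# → "The road runs 12 km past [[Dublin]]."
```

Even in this toy form, the governance point is visible: whoever writes the rule list decides, silently and at scale, which variants of the encyclopedia's prose are legitimate, which is exactly the debate the paper describes around proposals such as a single national variety of English.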
Paper short abstract:
We discuss how the design of Internet technology related to the way members of VOCI (a voluntary community in Syria) shared ownership during a situation of conflict, and to the way they blocked each other out of online spaces. To explore an alternative, we engage in a design controversy with VOCI.
Paper long abstract:
In our attempt to uncover some of the implications of Internet technology for governance, we focus on the local. We study how the Internet and social platforms relate to the way a local community of volunteers (VOCI) in Syria established an environment to plan and conduct their activities, as well as to govern how the community evolved. When a conflict erupted within the community in late 2012, the online tools that VOCI members used became the core of the struggle over access and control among the members involved. We show that the design of basic Internet protocols, as well as of the tools operating on them, is strongly related to the way VOCI members shared ownership during the conflict, and to the way they blocked each other out of online spaces that belonged to the community. Based on these observations and the STS tradition of studying controversies, we actively engage in "mapping the issues" (Marres 2015) and in a design controversy with VOCI members to design Modus, an online tool for governing shared ownership.
Paper short abstract:
This paper examines the politics behind Internet governance of definitions and standards in the European Union context. It focuses on the conflicts between European legislation and private companies when trying to regulate behaviours online, while looking at spam and web-cookies as a case study.
Paper long abstract:
This paper examines the politics of categorizing and regulating behaviours on the Internet, focusing on the European Union. Specifically, it examines how some unwanted forms of communication are categorized as spam while others are categorized as cookies, and what the consequences of such a process are. This research was conducted through policy analysis of European legislation related to spam and cookies, mainly from the European Commission, the Organization for Economic Co-operation and Development (OECD) and the Article 29 Data Protection Working Party, as well as through analysis of technical documents of the Internet standards organisations the Internet Engineering Task Force (IETF) and the World Wide Web Consortium (W3C). This paper contributes to STS literature by arguing that ambiguity and the lack of a decisive definition in the legal discourse allow power relations to be constructed and performed on the Internet. Importantly, governance in the European Union is enabled by the delegation of power to private companies under the self-regulation approach. Such strategies facilitate the institutionalization of European Union e-commerce, and point to the influence of technology's materiality on the way people understand and engage with the Internet.
Paper short abstract:
The use of information technologies in the food-energy-water nexus is growing with uncertain implications for governance. In this paper we draw upon governance capabilities scholarship to better conceptualise the role of information infrastructures and governance in the food-energy-water nexus.
Paper long abstract:
The food-energy-water (FEW) nexus presents us with a 'wicked' problem similar to climate change adaptation. The nexus cannot be precisely formulated or solved due to widely divergent understandings of it, as well as to the inherent complexities associated with its monitoring and evaluation. At the same time, we are witnessing a proliferation of information technologies (ITs) in FEW nexus governance. Examples include the Internet and mobile phones as a means of communicating agricultural market information to farmers, or as tools for monitoring water and sanitation services. Many of these "small infrastructures" have emerged from strong public-private partnerships and involve large investments. It remains uncertain, however, how these ITs will become (or are already) embedded in increasingly messy and complex FEW governance arenas. Drawing upon scholarship on governance capabilities, we explore new theoretical avenues with the aim of better understanding this entanglement of information infrastructures with nexus governance. We further draw upon a literature survey that has helped us identify IT cases and contextualise our analysis of the five main governance capabilities: reflexivity, resilience, responsiveness, revitalisation and rescaling. A conceptual framework linking ITs with FEW governance is proposed that takes much better account of the rapidly evolving nature of information infrastructures.
Paper short abstract:
This paper traces the regulatory processes around the emerging technological zone of autonomous driving, looking at how law and technology are co-producing each other.
Paper long abstract:
Autonomous driving is an emerging technology currently in a phase of experimental testing. Contrary to theories of law "lagging behind" technology, regulatory bodies are already active in shaping the legal and infrastructural environment of autonomous driving before the technology is mature. Since autonomous driving depends on sensing and acting in an external environment, it is also dependent on a stabilizing regulatory framework to fully materialize as a technology.
This paper puts the notion of "code-is-law" in perspective by tracing the regulatory processes around the emerging technological zone of autonomous driving. It argues that legal texts, regulatory bodies, stakeholder networks, and both existing and expected developments in hardware and software are actors in a negotiation of power over the stabilization of autonomous driving as a socio-technical phenomenon, including not only the autonomous vehicle itself but also surrounding technologies and infrastructures, as well as social norms and regulatory frameworks.
Paper short abstract:
The paper focuses on the role of standardization processes within Chinese environmental disclosure in shaping the "fragmented datascape" of environmental governance, and software objects (databases, websites, algorithms, apps) designed to leverage it to foster environmental transformations.
Paper long abstract:
A central role in the shaping of environmental policy in China is played by the uneven interests of central and local institutions. At the level of environmental information disclosure (against the background of the general drive towards institutional transparency epitomized by the 2009 Open Government Information Measures), this has made database design, sorting and querying algorithms and interface design, along with material infrastructure (e.g. pollution sensors, cabling) and legal provisions, controversial objects with multiple local interpretations and implementations. The result is an "environmental datascape" (Lippert, 2015) characterized by high fragmentation, wherein various actors (human and nonhuman) leverage different parts.
This presentation focuses on the role played by environmental and informational standardization processes, and on how institutional arrangements contribute to the fragmentation. In particular, we will show how the misalignment between national and local standards problematizes comparisons across time (tracking performances) and space (comparing cities/provinces).
Within this fragmented datascape, governance efforts enacted by governmental and especially non-governmental actors to foster environmental change require constant alignment practices and complex data governance work, entailing their own supplementary standardization. This process includes testing and certification measures, which are then inscribed into software objects (websites, databases, scraping algorithms, mobile applications), thus assuming a degree of institutionalization (especially at the supra-national level) which leads to conflict with their official counterparts. While this situation complicates some environmental governance practices and policies, it is also functional to the enactment of a specific institutional arrangement constraining the autonomy of local institutions vis-à-vis central ones.
Paper short abstract:
This paper compares current initiatives to build digital infrastructure for the humanities in Europe and the US. Conceptually, I propose that infrastructure functions as a regulatory technology, i.e. an interface through which the actor groups in a science system rearticulate their mutual relations.
Paper long abstract:
In this paper I provide a comparative perspective on current initiatives to build digital infrastructure for the humanities in Europe and the US. Thereby I mean to move beyond analyzing the shaping of technology within individual projects and instead trace in a more encompassing way how dominant research policies mediate the reorganization of disciplinary tool development. Drawing conceptual inspiration from the work of Sheila Jasanoff, I propose that infrastructure actually functions as a regulatory technology, i.e. as an interface through which the different actor groups in a public science system rearticulate their mutual relations. US digital scholars, I argue, have successfully promoted a sociotechnical view of infrastructure as an emergent, evolutionary phenomenon, according to which conceptual and managerial authority should be situated at well-established digital humanities centers. While avoiding problems related to the implementation of technology in more traditional scholarly practices, this arrangement will tend to privilege the intellectual and technological preferences of existing elites within digital humanities over those of other research communities. European initiatives by contrast are based on a more centralizing, technology-driven vision of digital infrastructure that serves the European Commission's policy goal of integrating national research systems. This causes a certain disconnect between tool developers and prospective scholarly users who are often unfamiliar with digital approaches, but the emphasis on central coordination also ensures that no single community gains exclusive control over technology development.
Paper short abstract:
Drawing upon recent work on standards, information infrastructures and technoscientific imaginaries, this paper traces the rise of the global GSM standard and examines the complex politics of governing mobile standards.
Paper long abstract:
"Mobile devices" describes a dynamic category of technological artifacts. From the iPhone to internet-enabled medical devices capable of sending data packets over LTE networks, this category of devices is readily proliferating. If one accepts the prediction of technologists, the internet-of-things will encompass 20 billion "things" by 2020. These 20 billion smartwatches, bridges, refrigerators, homes, sensors and phones will communicate via mobile infrastructures whose structure, rules, permissions and restrictions are currently being designed by technocratic telecommunications standards-making bodies that are far from the purview of traditional processes of democratic engagement. What is the socio-cultural history of these telecommunications standards, and how did they achieve global consensus? Where are the politics of mobile standards development located, and what are the forms and shape of its governance mechanisms? What are the roles of the State, the Market, transnational firms and regular citizens in the development of these standards? Drawing upon recent work in STS on technological standards, the governance of information infrastructures and the idea of the technoscientific imaginary, this paper examines the history of the GSM standard, its birth within the nascent European Union and its acceptance as a global standard, and examines the complex politics of governing mobile standards.
Paper short abstract:
This paper explores the political and technical dimensions of users' temporal experience online, which sets the metronome for contemporary life. I highlight the findings of a mixed-methods study to uncover how time is considered in the history of the development of Named Data Networking (NDN).
Paper long abstract:
In the popular imagination, virtual environs made possible by contemporary networked information and communication technologies (ICTs) save time by freeing us from the physical realm with its impossible commutes, risky associations and bulky objects. However, STS' interrogation of informational and computational infrastructure reveals that information is thoroughly material. As contemporary life is increasingly synchronized with ICTs, it is important to dissect the relation between the materiality of information and how ICT infrastructures are designed to process this information efficiently and seamlessly. These design practices produce particular temporal experiences for users, resulting, for example, in the common impulse for instantaneous fulfilment of desire, as well as the urge to be continuously available. I draw from information studies, philosophy, media studies and computer science to uncover how the values in network design and conceptions of temporal experience are built into ICTs.
I look to the ICT infrastructure to understand how the design and implementation of these technologies make assumptions about time and users' attention as resources to be managed and, in some cases, exploited. I highlight findings of a qualitative study, based on design document analysis and interviews with designers, to uncover how latency is managed in the design of Named Data Networking (NDN), a new protocol for file and data transfer over the internet. These study results show how time is used as a resource in networked computation, highlighting the political, epistemological and technical dimensions of temporal experience online, which sets the metronome for contemporary life.
Paper short abstract:
Designers and developers in software testing are not conventional users. However, they spend much time using infrastructures that are essential to their work. The paper analyzes the connections between the testing of a newly designed application of a social network architecture and two testing infrastructures.
Paper long abstract:
Following designers and developers in their practice means observing a long series of activities focused on remote infrastructures. These infrastructures are employed to test new devices, but they are also challenged in their functioning, often fixed or reloaded, and opened to developers' interventions. In this sense, designers and developers act as users. This contribution arose from research carried out in an Italian ICT company, within a group of designers and developers. They aimed to extend an existing social network architecture toward a remote surveillance device, understood as a new application of the old social network. In order to test their device, technicians monitored the tests' evolution with two different remote devices. The first was used to assess the flow of data exchanged during the attempt to make a video call, while the second gave more detailed information about the operating commands and the parts of the protocol used during the attempt. Despite conventional and taken-for-granted accounts, the two remote devices worked as infrastructures without which the whole testing activity would have been unthinkable. This situation not only challenges the traditional divide between designers and users, but also opens up considerations on spatial dimensions not strictly defined by a site of use, owing to the remote and digital character of this kind of infrastructure. Furthermore, the "origination" of digital devices emerges as strongly infrastructured work, in which infrastructuring challenges "creation" or planned activities as the main activity.
Paper short abstract:
This paper contrasts managerial expectations and actual use of intranet software in the context of a company's merger. While company management aims at installing a unified corporate culture, the intranet runs contrary to these attempts and enables employees to maintain pre-merger and departmental divisions.
Paper long abstract:
Since the 1990s, intranets have enjoyed popularity within organisations as tools fostering work collaboration and organisational change, for instance by increasing information transfer between employees. Drawing on interviews and observations, this paper first illustrates managerial expectations towards the intranet in the context of a company's merger. Secondly, it sheds light on how the intranet is enacted as part of distinct work practices in different departments of the company. While managerial efforts are directed towards promoting a coherent corporate culture on the intranet, within these various work practices only specific applications are relevant to employees accomplishing their everyday work. In this way, employees work around and overlook managerial endeavours.
With reference to insights gained in STS and theories of practice, the paper illuminates how the intranet entangles with existing logics and divisions in the company (such as hierarchies and organisational sub-cultures, specific knowledge and work routines) and thereby calls the steering managerial framework into question. As will become apparent, as part of various work practices the intranet moves beyond what I call the managerial "politics of wholeness" and the related efforts to create a specific corporate culture and staff. In doing so, the intranet does not fulfil managerial expectations but nevertheless ensures the continuance of work. Furthermore, it enables employees to escape the logic of a 'unified' company and partly to uphold extant, 'home-grown' divisions. As such, the paper contributes to understanding organisational change and resistance in relation to the sociotechnical infrastructure of the intranet.
Paper short abstract:
Our empirical research on projects of distributed software development focusses on software development methodology and on how the software tools and organizational techniques it provides structure and govern communication, coordination and collaboration in distributed teams.
Paper long abstract:
We present results from case studies of software development projects in which part of the work is nearshored to Eastern Europe. In all our cases, the coordination of the transnationally distributed team is governed by the software development methodology Scrum. Scrum provides organizational techniques as well as software technology for governing software development. Our research investigates the role of governance by technology and by organizational techniques in distributed teams. There are several methodologies for structuring, organizing, managing, and governing software development; Scrum is the one most widely used today in small and medium-sized development teams. The Scrum methodology includes guidelines defining a set of roles with their respective rights and duties (such as "Product Owner" or "Scrum Master"). It specifies how the work should be organized (in "Sprints") and coordinated (for example by "Daily Scrums"). And it describes certain Scrum Artifacts, that is, technological tools (such as the "Product Backlog"), which represent the technological counterpart to the organizational structure of the Scrum process model. Though Scrum came into being as an organizational project management technique, many of its components have since been complemented by software tools, or even turned into software. For investigating how governance by technology works and how it interacts with organizational techniques and informal practices, distributed software development is a particularly suitable field of study, because the spatial distance requires technologically mediated communication and coordination and, consequently, invites attempts to automate information flows and to regulate coordination technologically.
Paper short abstract:
A "Doing Governance" approach allows us to take into account the practices in which norms and their materializations are constructed. But this research is in need of proper analytical instruments. Our framework, which understands governance as an achievement of figurations, can provide them.
Paper long abstract:
Concepts of governance have been applied primarily to institutional structures, to normative factors like law, technology and social norms, and to their materializations in written law, contracts or code. However, if we see these not just as given artifacts but also as a common construction of social reality, we can shed light on uses and practices from a governance perspective as well. This is the "Doing Governance" approach.
Unfortunately, the theoretical concepts and methods needed for comprehensive analyses covering both structures and processes are still missing. We propose to connect the two by understanding governance as an achievement of figurations in the sense of Norbert Elias. Such figurations show determinable features: individual and collective actors form specific constellations, and the power, privileges and responsibilities of the actors correspond with these. Furthermore, the actors realize specific communicative practices within determinable relevance frames. Looking at governance on the basis of this framework opens up a twofold methodological access: first, we can carry out hermeneutic content analyses of the materializations, and thereby of the normative structures; second, we can observe the figurations and analyze their features and communicative practices.
We illustrate the usefulness of this framework with the case of the governance of conflicts over search engine entries after the ECJ's decision on Google Spain, which forces companies such as Google to set up their own procedures, rules and departments to handle deletion requests by users. We examine this change in the governance of the use of search engines and show, at the same time, how helpful the proposed framework is for understanding such transformations.
Paper short abstract:
This paper analyzes the idea and concept of "Finnish elite culture" as it is expected to emerge in the recently launched interdisciplinary research project "The consortium Computational History and the Transformation of Public Discourse in Finland, 1640–1910".
Paper long abstract:
This paper analyzes the idea and concept of "Finnish elite culture" as it is expected to emerge in the recently launched interdisciplinary research project COMHIS, that is, "The consortium Computational History and the Transformation of Public Discourse in Finland, 1640-1910". By utilizing library catalogue metadata and full-text mining of all the digitized Finnish newspapers and journals published before 1910 (roughly two million pages), the COMHIS project is able to achieve a groundbreaking, qualitatively new understanding of, for example, how language barriers between Finnish and Swedish interacted in the period, how elites became mingled in popular debate, and how the "domestication" of transnational influences into Finnish public discourse took place.
Membership of an elite conventionally indicates an exceptionally favourable position in some significant relationships of power, and/or strongly advantageous access to economic, cultural or social capital. In this paper we discuss how choices concerning techniques of digitized text mining, and methods of abstraction and visualization for getting command of the mining's results, are best put into dialogue with a) the evolving and refined conceptualization of key changes in Finnish elite culture through the centuries, b) the nature and scope of the various power positions/powerful actors that get enacted in the research material, and c) possibilities to trace "roads invisible and roads not taken". The last item means reading the research material as a space of potential realities or possible worlds: tracking, for example, important historical turning points where things arguably could have turned out otherwise than they actually did.
Paper short abstract:
This paper conceptualizes traveling technologies through an ethnographic case study of Rwanda's health infrastructure. I argue that by tracing the transfer of new technologies, shifts in existing global political, economic and societal orders can be described and analyzed in novel ways.
Paper long abstract:
The last decades have witnessed a global proliferation of medical data used to plan and organize health care. As part of this process, decision-making in health and therapeutic interventions is becoming a data problem, in which health statistics, standardized protocols and measuring impact become major concerns. Moreover, the emerging data infrastructures connect a larger therapeutic apparatus - global health - to medical practices and the organization of health in so-called low-resource settings. The paper traces this process by providing an ethnographic case study on the use of information and communication technologies (i.e. cell phones, software, digital lists) to enhance maternal and child health in a rural health sector in Rwanda. I show that this health development initiative is not just a direct translation of the Millennium Development Goals but also mirrors the huge expectations surrounding new digital technologies for development in Africa. To understand this, the paper focuses analytically on how (new) relations between people, things and ideas are constituted and institutionalized when a technology is being transferred. Traveling technologies are conceptualized as a promising analytic concept for tracing biomedical technologies in different global and local contexts. This enables us to ask wide-reaching questions about the ways knowledge and society are relocated in global orders and networks of exchange. The conceptualization of traveling technologies also implies that something needs to be creatively adapted when being transferred. This process of transfer and contextual reconnection of new technologies helps to reveal shifts in existing global political, economic and societal orders.
Paper short abstract:
This paper takes stock of the work presented in track 001 and of the authors' own fieldwork to provide a reflection on governance *by* information infrastructure: its perimeters, definitions, and potential as an analytical tool.
Paper long abstract:
The proliferation of conceptualizations such as "governance by technology", "governance by design", "politics of technology", "de facto governance" - grounded in STS-based studies of infrastructures as loci of distributed and invisible power - reflects a double, and increasingly pressing, need. On one hand, in the era of so-called "algorithmic governance", we ought to keep uncovering the regimes of inclusion and exclusion nested in technology. On the other hand, the materiality of information infrastructures needs to be reaffirmed, made explicit, and untangled at a time when discourses proliferate on information flows, loci and geographies (e.g., clouds), and on the prominence and mythologies of data.
This paper will take stock of the work presented within the track "Materializing governance by information technology", and of its convenors' respective ongoing fieldwork: on the "Vectorial Glance" framework highlighting the performative and identity-building character of information infrastructures (Pelizza), on the imaginaries subtending the development of the internet's logical infrastructure as well as its data infrastructure (Milan), and on privacy protection in decentralized, end-to-end encrypted messaging software (Musiani).
The paper ultimately aims to advance reflection on the relation between the concept of "governance" and digital artefacts: its perimeters, definitions, potential as an analytical tool.