- Convenors:
- Simon Egbert (University of Bielefeld) and Matthias Leese (ETH Zurich)
- Stream:
- Encounters between people, things and environments
- Location:
- Welcome Centre Lecture Theatre 2
- Start time:
- 28 July, 2018 (time zone: Europe/London)
- Session slots:
- 1
Short Abstract:
Predictive policing, the algorithmic construction of crime risk areas, has to be grasped as a socio-technical process. This requires carefully analysing the development and legitimisation processes of such technologies, as well as the practical effects of their utilisation.
Long Abstract:
Throughout the world, police departments have started to implement predictive policing software in order to generate geospatial crime predictions. Algorithmically calculated "risk areas" thereby indicate spaces where crimes are estimated to happen with increased likelihood. This process of risk computation is notably a socio-technical one, as the predictions are products of algorithmic analyses of crime data that result in geospatial visualisations which have to be interpreted and acted upon by police officers at street level.
Conceptualising the use of police prediction software as socio-technical environments of interaction, we aim to specify the concrete distribution of agency between human operators and technological tools when it comes to the creation of crime risk. This means carefully analysing the development processes of the prediction technologies in use, and highlighting the discourses and expectations that are tied to these innovations. How are implementation processes or pilot runs legitimised? What role do the developers and distributors of technologies play in these contexts?
It also means analysing the practical effects of the socio-technical interaction in predictive policing, notably the changing ways of policing in designated risk areas and the potential implications for police law and criminal justice. Who is controlled by the police in such areas, how is suspicion created, and what role does the prediction technology play in this process? In general terms: How is police work modified by the utilisation of prediction software? What epistemic and practical effects can be analysed?
Accepted papers:
Session 1
Paper short abstract:
This paper empirically analyzes police strategies to keep predictive policing software in check: (1) human oversight; (2) strengthening human reasoning vis-à-vis the machine; (3) invoking data quality and data protection; and (4) contextualization within larger trajectories of police work.
Paper long abstract:
Predictive policing - broadly speaking the claimed ability to forecast where and when the next crime or series of crimes will take place through algorithmically supported analysis of live crime data - has been one of the most pertinent and readily implemented new security technologies in recent years, and has sparked wide debates about algorithmic agency, decision-making, and repercussions for social justice.
Based on field research within multiple German and Swiss police agencies, this paper engages the ways in which the police react to these debates and try to keep predictive policing software "in check". The analysis identified four distinct strategies that speak to concerns about possible negative implications of data-driven crime predictions: (1) human oversight; (2) strengthening human reasoning vis-à-vis the machine; (3) invoking data quality and data protection; and (4) contextualization within larger trajectories of police work.
Together, these strategies of institutional implementation and practice arguably facilitate the acceptance of new high-tech security tools and allow police agencies as well as politicians to render predictive policing (morally) legitimate in public discourse. On a broader level, the "human in the loop" here stands as emblematic of larger policy-making issues around new security technologies and their institutional implementation - ranging from automation and cognitive extension up to the potential full-scale automation of security tasks.
Paper short abstract:
Predictive policing programs present evaluators with the paradoxical task of proving that police intervention prevented predicted crimes. This paper highlights contrasting epistemologies in research, the police and industry for deciding whether a strategy is effective, and their role in its diffusion.
Paper long abstract:
How do you know that something has not happened? Predictive hotspot policing programs present evaluators with the paradoxical task of proving that predicted crimes would have happened yet were prevented by specific police intervention. Drawing on empirical research including in-depth interviews with police analysts and software developers, along with observations from policing conferences, this paper highlights the contrasting epistemologies at play in predictive policing strategies. Based on preliminary research findings, the paper identifies three distinct modes of knowing at work: scientific evaluation based on controlled trials, software companies' hit rates, and police anecdotes. The paper assesses what role these epistemologies play in the diffusion of predictive policing technologies. Given the paradoxical nature of such evaluations, it further analyses how rationales for implementation continually shift from effectiveness to convenience.
Paper short abstract:
In recent years, the collaboration between the Swedish police and the business analytics company Qlik has resulted in the production of the system STATUS at the national level. Although the use of Qlikview is not limited to crime analytics, it is predictive analytics that has mainly become a matter of public debate.
Paper long abstract:
In recent years, the collaboration between the Swedish police and the business analytics company Qlik has resulted in the production of the measurement and follow-up system STATUS at the national level. The environment provided by the Qlikview platform supports decision-making within the organization at various levels: preventive, operational, administrative, financial, etc. Although its use is not limited to crime analytics, it is predictive analytics that mainly attracts the concern of the press and is a matter of public debate.
The critical point in this story called 'predictive policing' is not crime 'prediction' or 'forecasting' itself, but the ideals and processes that prepare and build prediction and make it essential to validate the positive circuit "data-hypothesis-theorem-experiment-verification-induction-hypothesis" within the context of police science. Therefore, the ideals of scientism and technocracy that are embedded in these technologies, and the ways in which the Security Forces (e.g. the police) form an image of effectiveness and neutrality based on these ideals, are crucial. Moreover, in addition to the Security Forces' role of safeguarding Law and Order, it is also of great importance for them to remain safe from criticism during the exercise of legal force. Secrets are the privilege of governing; science and technology are full of them.
For the purposes of the research, Swedish police officers and the designers and programmers of the Qlikview platform were interviewed, and a qualitative study was conducted on a Qlikview clone with encrypted data.