- Convenors:
- Margo Bernelin (Laboratoire Droit et changement social, University of Nantes)
- Sonia Desmoulin (CNRS)
- Chair:
- Sonia Desmoulin (CNRS)
- Format:
- Traditional Open Panel
- Location:
- NU-3A47
- Sessions:
- Wednesday 17 July, -
Time zone: Europe/Amsterdam
Short Abstract:
Privacy-Enhancing Technologies offer promising technical solutions to privacy concerns in data processing. This panel seeks to explore their agency and the reconfigurations they prompt, and to assess in turn the appropriateness and adequacy of current or planned regulatory transformations.
Long Abstract:
Privacy-Enhancing Technologies (PETs) are changing the landscape of data protection and privacy. By providing security safeguards and, correlatively, a high level of confidentiality, PETs seem to fit perfectly within the current European legal framework, which requires the adoption of technical solutions that protect personal data. From data obfuscation techniques to encryption and distributed modes of data access, PETs are fostering not only scientific but also regulatory enthusiasm. Offering an apparently turnkey solution for preserving private life when personal data are processed, PETs have moved to the forefront of privacy research. They even tend to shift privacy discussions away from the realisation of personal rights (such as the right of access, the right to object to processing or the right to rectify data) towards upstream technical measures. Yet only a few studies have explored the underlying assumptions of that shift and its potential impact.
Building on studies of the governance of science and technologies, our panel seeks to fill this gap by questioning this shift and, more precisely, PETs' impact on agency. Various questions can be addressed: Are personal rights over data processing still relevant? Can PETs foster them, or do they reconfigure their effectiveness? Are PETs taking over current regulatory frameworks on data processing, such as the European Health Data Space? What reconfigurations are at stake? Should regulatory answers to privacy concerns be transformed as well, by taking PETs into account as object-agents?
This panel speaks to this year's conference theme by reflecting on the transformative potential of SSH and STS research for the regulatory agenda on data protection.
We seek to bring together scholars from various STS perspectives, inviting theoretical and conceptual studies as well as empirical analyses and comparative approaches. Our panel welcomes proposals from early-career researchers (PhD students included).
Accepted papers:
Session 1 Wednesday 17 July, 2024, -
Paper short abstract:
With our contribution we address the reconfiguration of the political economy of privacy rights, the common good and financial gain through Privacy-Enhancing Technologies (PETs). To this end, we reflect upon the example of COVID-19 heat maps that utilise the technical possibilities of PETs.
Paper long abstract:
We present research on a particular application of Privacy-Enhancing Technologies (PETs) utilised for public health purposes. Novel cryptographic methods like homomorphic encryption allow the joint analysis of discrete data sets without the parties having to share their concrete content. In the case at hand, PETs allow the privacy-preserving creation of a COVID-19 heat map through the combination of medical records (infection data) with mobile phone data (geospatial data). During the evaluation the individual data points stay encrypted, and neither of the two parties is able to recover the input of the other. Despite this protection, it remains possible to perform mathematical operations on the encrypted data, yielding insights at an aggregated level. In other words, while information about individuals remains protected, information at the population level can be obtained. Such possibilities hold great potential for public policy making, as the case of COVID-19 heat maps demonstrates.
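To make the mechanism concrete, here is a minimal Python sketch of additively homomorphic encryption (a textbook Paillier scheme). It is purely illustrative: the per-cell counts, the key sizes and the two-party setup are our own assumptions rather than the actual protocol studied in this paper, and a real deployment would use a vetted cryptographic library.

```python
# Toy additively homomorphic encryption (textbook Paillier) illustrating the
# heat-map scenario described above. All counts and parameter sizes are
# invented for illustration; real deployments use >= 2048-bit moduli.
# Requires Python 3.9+ (math.lcm, three-argument pow with -1 exponent).
import math, random

def keygen(p=999983, q=1000003):           # small known primes; insecure, demo only
    n = p * q
    lam = math.lcm(p - 1, q - 1)           # Carmichael function of n
    mu = pow(lam, -1, n)                   # simplification valid for g = n + 1
    return (n,), (n, lam, mu)

def encrypt(pub, m):
    (n,) = pub
    r = random.randrange(1, n)             # randomiser, coprime to n w.h.p.
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(priv, c):
    n, lam, mu = priv
    return ((pow(c, lam, n * n) - 1) // n) * mu % n

def add(pub, c1, c2):
    (n,) = pub                             # multiplying ciphertexts adds plaintexts
    return (c1 * c2) % (n * n)

pub, priv = keygen()
# Party A (health records) and party B (mobile phone data) each encrypt their
# per-map-cell counts; neither can read the other's inputs.
enc_a = [encrypt(pub, m) for m in [3, 0, 7]]
enc_b = [encrypt(pub, m) for m in [40, 12, 55]]
# Per-cell totals are computed directly on the ciphertexts.
enc_totals = [add(pub, ca, cb) for ca, cb in zip(enc_a, enc_b)]
print([decrypt(priv, c) for c in enc_totals])   # -> [43, 12, 62]
```

Because multiplying Paillier ciphertexts adds the underlying plaintexts, per-cell totals can be computed without decrypting any individual contribution; only the holder of the private key learns the aggregate, which is precisely the population-level-insight property described above.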
With regard to the theme of the panel, we ask how such technical solutions lead to a reconfiguration of the European data protection paradigm. Do PETs make possible the exploitation of personal data that would otherwise be restricted? How do these new possibilities spill over into other domains, such as commerce? Does privacy protection through PETs no longer prevent the exploitation of consumer data in the absence of a solution for fair benefit sharing? Ultimately, we address the question of whether PETs reconfigure the political economy of privacy rights, the common good and financial gain.
Paper short abstract:
The use of PETs is critical to preserving data subjects' privacy when processing and sharing data. At the same time, it can make the exercise of data subjects' rights difficult. There is thus a paradox: more privacy through technical means can lead to less control for data subjects over their data.
Paper long abstract:
The exercise of data subjects' rights, as provided for in Articles 15 to 20 of the GDPR, has been a cornerstone of data protection law, as it established a right to informational self-determination in line with the fundamental right to privacy. Another great achievement of the GDPR was to promote privacy by design and by default, as well as data security, from a technical perspective. The GDPR thus sets out a coherent series of obligations: it not only empowers data subjects with regard to the control of their data but also encourages the technical protection of data. Against this background, the use of Privacy-Enhancing Technologies is critical to preserving data subjects' privacy when collecting, storing or sharing data. At the same time, the use of such technologies can complicate the exercise of data subjects' rights by making the identification of requesting data subjects harder. This results in a paradoxical situation where enhancing privacy through the use of PETs can undermine privacy by preventing data subjects from exercising the rights they hold over their data. Such technical measures might even be implemented on purpose by data controllers to circumvent the application of the GDPR. This contribution explores this paradox and examines how EU data protection authorities as well as Member States' courts have dealt with it so far.
Paper short abstract:
At the frontiers of data protection, synthetic data offers new technical perspectives for reducing privacy risks when sensitive personal data is being processed. However, this PET's agency can be questioned: will SD make various privacy requirements obsolete, or prompt their reconfiguration?
Paper long abstract:
Synthetic Data (SD) is a term that is making headlines. From credit data used for analysis to health data used for research purposes, SD offers new technical perspectives for reducing privacy risks when sensitive personal data is being processed. SD refers to the artificial reproduction of personal data without its identifying components: it artificially reproduces the properties and content of real data sets in order to create relevant and useful anonymous data. This technical feat was even the focus of one of the IPEN's seminars in 2021 (IPEN, the Internet Privacy Engineering Network, is a body created by the European Data Protection Supervisor to promote privacy-enhancing technologies).
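As a deliberately minimal sketch of the underlying idea, the following Python snippet fits simple per-attribute distributions to a fictional patient table and samples artificial records from them. Everything in it (the records, the attributes, the marginal-sampling approach) is invented for illustration; production SD generators rely on far richer joint models such as GANs or copulas.

```python
# Minimal illustration of synthetic data: fit simple per-attribute models to
# a fictional patient table, then sample artificial records that mimic its
# aggregate properties without copying any real row.
import random, statistics

real_patients = [                       # fictional stand-in for sensitive data
    {"age": 34, "diagnosis": "flu"},
    {"age": 51, "diagnosis": "diabetes"},
    {"age": 47, "diagnosis": "flu"},
    {"age": 63, "diagnosis": "diabetes"},
]

ages = [p["age"] for p in real_patients]
mu, sigma = statistics.mean(ages), statistics.stdev(ages)
diagnoses = [p["diagnosis"] for p in real_patients]

def synthetic_record():
    # Each attribute is drawn from a distribution fitted to the real data,
    # so no synthetic record corresponds to any actual individual.
    return {"age": round(random.gauss(mu, sigma)),
            "diagnosis": random.choice(diagnoses)}

synthetic_cohort = [synthetic_record() for _ in range(100)]
print(synthetic_cohort[:3])
```

Sampling each attribute independently discards the correlations that make data useful, which is why real generators model joint distributions; conversely, the more faithfully synthetic data tracks the real distribution, the harder it becomes to rule out residual re-identification risk, a tension the legal questions addressed here turn on.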
This PET is set to become a significant tool for allowing the circulation of anonymous sensitive data in order to boost research, transparency and efficiency. Indeed, data protection regulations are supposed to be respected, and even fostered, by SD.
The present research addresses this dreamlike technical solution for data protection by questioning its pretensions with regard to various legal requirements. In this regard, we seek to answer one question: will SD make various privacy requirements obsolete, or prompt their reconfiguration? Taking the health sector as an example, we will demonstrate that the shift toward SD will have significant impacts on various regulations without answering all privacy concerns.
Paper short abstract:
In an oral history of NYC cryptoparties, the ecology of activist spaces and information institutions that supported these events is revealed. Cryptoparty activists evangelized public-key cryptography (a tool for communal trust in programmer communities), which limited the reach of the events.
Paper long abstract:
Hackers, activists, and journalists have been throwing cryptoparties—crowdsourced skillshares for anti-surveillance and digital privacy tools—for over a decade. But the dream of cryptoparties creating “encryption for the masses” has passed, as behavioral surveillance has become more entrenched in capitalism. Through an oral history of the cryptoparty movement in New York City, cryptoparties are revealed to have been sustained through an ecology of activist spaces, open source culture, and information institutions like libraries. Cryptoparty organizers struggled to use technical expertise and vocational skill to turn narrow tools of encryption and obfuscation, created by free and open source software developers, in service of broader community need. For the free and open source software programmers and cypherpunks in the movement, encryption technology like Pretty Good Privacy (PGP) is the key into a world of community and participation through code. But when they evangelized the use of public-key cryptography tools, all they could offer most members of the public was the key to an empty room. The role of PGP as a source of communal trust in hacker and infosec communities is examined, to show how its use in the social reproduction of those communities created alienating requirements for activists and privacy-conscious library patrons alike and ultimately limited the reach of cryptoparties.