FPF at CPDP 2022: Panels and Side Events
As the annual Computers, Privacy and Data Protection (CPDP) conference took place in Brussels between May 23 and 25, several Future of Privacy Forum (FPF) staff took part in panels and events organized by FPF or other organizations before and during the conference. In this blog post, we provide an overview of these events, with a particular focus on the panel FPF hosted at CPDP on May 24, on Mobility Data Sharing under the upcoming EU Data Act: what are the data protection implications, and how should the risks be mitigated?
All of the sessions below were recorded by the CPDP organizers, and we will include links to the recordings as soon as they are made available.
May 20: ADM Report Launch Event – A Discussion with Experts
On May 17, FPF launched a comprehensive Report analyzing case-law under the General Data Protection Regulation (GDPR) applied to real-life cases involving Automated Decision-Making (ADM). The Report, authored by FPF’s Policy Counsel, Sebastião Barros Vale, and FPF’s Vice President for Global Privacy, Gabriela Zanfir-Fortuna, is informed by extensive research covering more than 70 Court judgments, decisions from Data Protection Authorities (DPAs), specific Guidance and other policy documents issued by regulators.
On May 20, the authors discussed some of the most impactful decisions analyzed in the Report with prominent European data protection experts during an FPF roundtable. Speakers included Gianclaudio Malgieri, Co-director of the Brussels Privacy Hub and Associate Professor of Law at EDHEC Business School (Lille), Mireille Hildebrandt, Research Professor on ‘Interfacing Law and Technology’ at Vrije Universiteit Brussels, and Brendan van Alsenoy, Deputy Head of Unit “Policy and Consultation” at the European Data Protection Supervisor (EDPS). The expert roundtable discussion was enriched by representatives from the UK’s Department for Digital, Culture, Media and Sport (DCMS), the European Consumer Organization (BEUC), and the Brussels Privacy Hub. Watch a recording of the conversation here, and download the slides here.
May 22: CPDP Opening Night – Vulnerable Data Subjects
The day before the conference program started, Gabriela Zanfir-Fortuna was part of a stellar panel organized by the Brussels Privacy Hub for the Opening Night, on the topic of “Vulnerable Individuals in the Age of Artificial Intelligence (AI) Regulation”. The panel, moderated by Gianclaudio Malgieri, also featured Mireille Hildebrandt, Louisa Klingvall (European Commission), Ivana Bartoletti (University of Oxford), and Brando Benifei (co-rapporteur of the AI Act at the European Parliament). It explored how the draft AI Act proposes to protect vulnerable individuals by prohibiting the exploitation of certain forms of vulnerability (based on age, disability, and economic and social conditions), and asked whether the definition of vulnerability under the text could be broadened.
The occasion also served as an opportunity to announce that FPF and the Brussels Privacy Hub will set up an International Observatory on Vulnerable People in Data Protection (the ‘VULNERA’ project), and offered a preview of its website. More details will follow in the coming months.
May 23: Global panel on post-COVID data protection; AI Act in the employment context
The first day of the CPDP conference was a busy one for FPF’s Gabriela Zanfir-Fortuna. She started early in a speaking role on a panel about ‘Data Protection Regulation Post-COVID: the Current Landscape of Discussions in Europe, the US, India and Brazil’, organized by Data Privacy Brasil (DPB) and moderated by DPB’s Bruno Bioni. The session, during which Gabriela offered the US perspective on the matter, also featured valuable input from FPF’s Senior Fellow for India, Malavika Raghavan, the European Data Protection Board (EDPB)’s Head of Secretariat, Isabelle Vereecken, and the Executive Director of the Africa Digital Rights Hub LBG, Teki Akuetteh Falconer. Panelists reflected on new questions for data sharing and protection that had arisen in their regions in areas such as public health (including the design of contact tracing and health passport apps), education, and welfare/social security. View a recording of the session here.
During the last slot of the day, Gabriela moderated a panel on ‘The AI Act and the Context of Employment’, which saw a lively debate on the extent to which the draft Regulation protects workers against AI-powered workplace monitoring and decisions. The panelists in this instance were Aida Ponce del Castillo (European Trade Union Institute), Diego Naranjo (Head of Policy at European Digital Rights), Paul Nemitz (Principal Advisor at the European Commission’s DG JUST), and Simon Hania (Data Protection Officer at Uber). You can read about the main points raised by the speakers in this short thread.
May 24: Cross-Continental privacy compliance and FPF’s panel on Mobility Data
The second day was packed with interesting discussions on topics such as GDPR enforcement conundrums, privacy class actions, and how data protection law can tackle manipulative web design (or ‘dark patterns’). FPF staff were involved in some of the most exciting panels and events of the day.
FPF’s Policy Analyst, Mercy King’ori, was a speaker at a panel in the CPDP Global track, on ‘Corporate Compliance with a Cross Continental Framework: the State of Global Privacy in 2022’. While Mercy elaborated on African regulatory developments, the other speakers focused on different jurisdictions, including the EU, Brazil, China, and Israel. The debate, moderated by FPF’s Senior Fellow, Omer Tene, also included contributions from Renato Leite Monteiro (DPB), Barbara Li (PwC), and Anna Zeiter (Chief Privacy Officer at eBay).
Later that evening, the conference hosted the FPF-organized session on ‘Mobility Data for the Common Good? On the EU Mobility Data Space and the Data Act’. The panel was moderated by FPF’s Managing Director for Europe, Rob van Eijk, and sought to answer several questions, including whether the draft Data Act and the upcoming EU Mobility Data Space could address cities’ innovation and sustainability goals while still safeguarding citizens’ privacy. The expert speakers around the table were Maria Rosaria Coduti (Policy Officer at the European Commission’s DG CNECT), David Wagner (German Research Institute for Public Administration, or “FÖV”), Laura Cerrato (DPO at the Centre d’Informatique pour la Région de Bruxelles), and Arjan Kapteijn (Senior Inspector for the department of Systemic Oversight within the Dutch DPA). View a recording of the session here.
Maria Rosaria Coduti explained that the combination of different pieces of the EU’s Data Strategy – notably, the Data Governance Act (DGA), the Data Act, and the Common European Data Spaces – seeks to remove barriers to the access and sharing of data. This can be achieved by incentivizing private and public sector players, as well as data subjects, to share data on a voluntary basis (e.g., through data intermediation services and data altruism organizations), and by compelling entities to share data where there is an imbalance of power between data holders and users, or where public interest grounds exist. An example of the latter is the use of mobility data held by telco providers to help map the spread of COVID-19. However, the Data Act sets strict rules for data access requests made by public bodies to private players and limits the use of such data to public emergency situations. With regard to Business-to-Business data sharing under the Data Act, Coduti underlined that the text’s provisions on cloud switching and interoperability may force designers of connected products (such as cars, planes, and trains) to design them in a way that makes the data they generate easily accessible to users and to the data recipients those users choose.
Laura Cerrato explained that, in her role as DPO for the IT services provider of the Region of Brussels’ public authorities, she invests significant effort in explaining the legal intricacies of data sharing to those authorities. According to Cerrato, the Data Act will open new possibilities for government bodies to access privately-held data, but this requires transparency and accountability toward citizens. Moreover, as her office is piloting a Mobility-as-a-Service project in the city, there was a need to determine the appropriate legal basis for personal data processing in that context, since public authorities cannot rely on the legitimate interest ground under Article 6 GDPR. In that respect, Cerrato underlined that the public interest legal basis can only be used if it is provided for under national or EU law, which was lacking for Smart City development in Belgium.
Arjan Kapteijn closely followed Cerrato’s remarks, pointing to the recent Dutch DPA guidance on Smart Cities. In the lead-up to those recommendations, the DPA investigated the records of processing activities (ROPAs) and data protection impact assessments (DPIAs) of 12 Dutch cities carrying out Smart City-related projects, and asked why they had not consulted the DPA prior to the data processing, as per Article 36 GDPR. Among the DPA’s findings were misconceptions among municipalities regarding the concept of “personal data” when applied to mobility datasets, and a belief that the GDPR did not apply to pilot projects, which may have led to a lack of transparency toward citizens. Kapteijn stressed that data collected through sensors is often covered by the GDPR, for example data collected by connected vehicles and smart traffic lights, and data gathered through wi-fi tracking in public spaces. Lastly, the speaker warned about the difficulty of making location data truly anonymous according to GDPR standards, and noted that certain hashing techniques, privacy by design, and data minimization may play a valuable role in retaining data utility while protecting the data.
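To make this point more concrete, the sketch below is a purely hypothetical illustration (not code discussed by the speaker) of why hashing an identifier is only pseudonymization, and how coarsening location precision is one simple data-minimization step. All names and values are made up.

```python
# Hypothetical illustration: pseudonymization vs. data minimization for a location record.
import hashlib
import secrets

SALT = secrets.token_bytes(16)  # a stable salt keeps records from the same device linkable

def pseudonymize_device_id(device_id: str) -> str:
    """Salted hash: the identifier is replaced, but records from the same device
    remain linkable, so the output is generally still personal data under the GDPR."""
    return hashlib.sha256(SALT + device_id.encode()).hexdigest()

def coarsen_location(lat: float, lon: float, decimals: int = 2) -> tuple[float, float]:
    """Data minimization: rounding to roughly 1 km granularity reduces what a
    single point reveals, at the cost of spatial accuracy."""
    return round(lat, decimals), round(lon, decimals)

record = {"device_id": "AA:BB:CC:DD:EE:FF", "lat": 50.846557, "lon": 4.351697}
processed = {
    "device_ref": pseudonymize_device_id(record["device_id"]),
    "location": coarsen_location(record["lat"], record["lon"]),
}
print(processed)
```

Even after such steps, a sequence of coarsened points can still form a unique movement pattern, which is precisely the re-identification risk Kapteijn and Wagner pointed to.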
Finally, David Wagner focused on the concept of anonymization under Recital 26 GDPR and how it applies to location and mobility data. He explained that anonymizing this data is hard because individual movement patterns can identify persons. Working towards anonymization requires suppressing or perturbing data points (e.g., by adding noise), at a cost to data utility. The anonymization test in Recital 26, which considers the “means reasonably likely to be used” to identify a natural person, arguably invites controllers to evaluate potential attackers’ cost-benefit calculations, making it hard to determine what counts as a reasonable re-identification attempt. Thus, Wagner argued that while the GDPR defines a threshold for anonymity, controllers and regulators still need an effective and reliable scale to assess it. The upcoming update by the EDPB to the 2014 Article 29 Working Party guidelines on anonymization may provide such evaluation criteria.
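As a rough, hypothetical illustration of the noise-versus-utility trade-off Wagner described (not drawn from his presentation), the snippet below adds calibrated Laplace noise, as used in differential privacy, to made-up aggregated mobility counts: smaller epsilon means more noise, stronger protection, and less accurate statistics.

```python
# Hypothetical sketch: Laplace noise on aggregated mobility counts.
# The epsilon values and counts are illustrative only.
import numpy as np

rng = np.random.default_rng(seed=42)

def noisy_count(true_count: int, epsilon: float, sensitivity: float = 1.0) -> float:
    """Add Laplace noise scaled to sensitivity/epsilon.
    Smaller epsilon -> more noise -> stronger privacy, lower utility."""
    return true_count + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

hourly_trips = {"08:00": 1250, "09:00": 1630, "10:00": 980}  # made-up aggregates

for epsilon in (0.1, 1.0, 10.0):
    released = {hour: round(noisy_count(c, epsilon), 1) for hour, c in hourly_trips.items()}
    print(f"epsilon={epsilon}: {released}")
```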
May 25: FPF’s De-Identification Masterclass and Data Protection in China
On the morning of the conference’s closing day, FPF hosted an engaging and well-attended Masterclass on the ‘State-of-Play of De-Identification Techniques’ as an official side event. The session’s moderator, FPF’s Rob van Eijk, kicked off the discussion by presenting the 2016 FPF Infographic on data de-identification and how it fares against the GDPR’s updated concept of anonymization. Expert speakers Sophie Stalla-Bourdillon (Immuta), Naoise Holohan (IBM), and Lucy Mosquera (Replica Analytics) then presented cutting-edge techniques – respectively, Homomorphic Encryption, Differential Privacy, and Synthetic Data. View a recording of the session here.
Just before the closing panel and remarks, FPF’s Policy Counsel, Hunter Dorwart, presented his selected academic paper ‘Chinese Data Protection in Transition: A Look at Enforceability of Rights and the Role of Courts’ at the last Academic Session of the conference. The paper examines recent privacy litigation in China and situates it within China’s larger governance structure in order to shed light on an underappreciated facet of Chinese data protection law: the role of courts.