Event Report: FPF Side Event and Workshop on Privacy Enhancing Technologies (PETs) at the 2022 Global Privacy Assembly (GPA)
The 2022 Global Privacy Assembly (GPA) – which has brought together most of the world’s data protection authorities (DPAs) every year since 1979 to share knowledge and establish common priorities among regulators – took place from October 25 to 28 in Istanbul, Türkiye. The Future of Privacy Forum (FPF) was invited by the GPA’s organizer, the Turkish DPA, to host a two-part side event during the GPA’s Open Session (on October 25 and 26), in addition to a capacity-building workshop for regulators during the Closed Session (on October 28).
These sessions covered the topic of Privacy Enhancing Technologies from three different angles:
- The regulators’ take: PETs are promising, but no silver bullet. The first part of the FPF Side Event offered the regulators’ perspective, and was titled ‘Regulatory Views on the Role and Effectiveness of PETs’. The session was moderated by Limor Schmerling Magazanik, Director of the FPF-affiliated Israel Tech Policy Institute (ITPI), and featured contributions from Rebecca Kelly Slaughter (Commissioner at the US Federal Trade Commission, or ‘FTC’), Tobias Judin (Head of the International Department of the Norwegian DPA, the ‘Datatilsynet’), Gilad Semama (Privacy Commissioner of Israel), and Vitelio Ruiz Bernal (Director of Supervision at the Mexican DPA, the ‘INAI’).
- The view of practitioners: a call for regulatory clarity and predictability. The second part of the Side Event was entitled ‘Lessons Learned from Implementing PETs’, and saw various privacy leaders from the industry share their experiences of leveraging PETs in their compliance efforts. The panel was moderated by FPF’s CEO Jules Polonetsky and had as panelists Anna Zeiter (Chief Privacy Officer at eBay), Emerald De Leeuw-Goggin (Global Head of Privacy at Logitech), Barbara Cosgrove (CPO at Workday), and Geff Brown (Associate General Counsel at Microsoft).
- FPF’s capacity building workshop. The FPF workshop during the GPA’s Closed Session was conducted by FPF’s Vice President for Global Privacy, Dr. Gabriela Zanfir-Fortuna, and Managing Director for Europe, Dr. Rob van Eijk. The session covered the legal qualification of PETs under the EU’s data protection framework – and how they can be leveraged to achieve compliance with it – as well as a primer on three PETs: Differential Privacy, Synthetic Data, and Homomorphic Encryption. This workshop was a condensed version of the Masterclass that FPF hosted at the 2022 Computers, Privacy and Data Protection (CPDP) Conference last May (recorded).
Below we summarize the discussions in the two FPF Side Events with regulators and privacy leaders and highlight key takeaways.
The regulators’ take: PETs are promising, but no silver bullet
Moderator Limor Schmerling Magazanik opened the first discussion by observing that regulators have a dual role regarding PETs: issuing guidance to clarify when and how PETs should be deployed in different scenarios to ensure compliance with privacy laws; and providing tailored advice to lawmakers that wish to promote the use of PETs for the pursuit of public interest tasks and the responsible use of data.
On this note, Gilad Semama observed that PETs offer ways to combine innovation in the tech sector with the protection of privacy as a constitutional right in Israel. Semama highlighted that companies have expressed a need for certainty on how they can use PETs to achieve compliance with the privacy framework. He added that it is challenging to find a one-size-fits-all solution in this respect, but that the Privacy Commissioner seeks to issue flexible guidance and answer queries from the public on PETs for the benefit of businesses and DPOs, by referring to accountability and helping them choose the most appropriate PET for specific use cases. According to Semama, PETs should be complemented with other data security solutions to provide meaningful protections. He also noted that companies developing PETs in Israel need access to funding, and that a recent joint project between the regulator and the Innovation Authority of Israel may help.
Next, Rebecca Kelly Slaughter stressed the potential of PETs to promote competition and consumer protection, as they can embody innovation and serve as a dimension on which companies compete. However, some applications of PETs can be misleading and competition-inhibiting, which means, according to Slaughter, that the value of PETs should be assessed against their concrete effects. The Commissioner stated that the FTC should focus mainly on guidance – through FTC rulemaking – to assist businesses developing and implementing PETs, rather than on strict enforcement. However, the FTC will not approve broad safe harbor provisions for the use of specific PETs, as their effectiveness is generally context-specific.
Slaughter suggested PETs could enable privacy-preserving age verification systems, although the FTC has yet to see such a solution. This would enable businesses to move away from notice-and-consent-based standards for processing children’s data, which is one of the FTC’s current aims. According to Slaughter, consent does not adequately protect children’s online privacy; providers should instead focus on data minimization, purpose limitation, and storage limitation.
The FTC is currently receiving comments on its proposed Commercial Surveillance and Data Security Rulemaking, which also touches on PETs. The contributions to the public consultation promise to offer a compendium of perspectives for stakeholders to tap into when developing and implementing PETs. In addition, Slaughter acknowledged that the FTC needs to collaborate with and draw inspiration from regulators in other jurisdictions, including when issuing enforcement orders. As companies roll out PETs across borders, consistent regulatory approaches will increase the likelihood of broad uptake of PETs by small and large players alike.
Tobias Judin followed up on Slaughter’s comments, by saying that, when it comes to greenlighting PETs, DPAs should explain that companies do not need to choose between data collection and privacy, or between innovation and data protection. Judin used health research as an example, outlining that often researchers need to collect data about rare diseases across jurisdictions to make the dataset more representative, even knowing that the level of data protection is not equivalent in all targeted countries. In that context, PETs such as homomorphic encryption or differential privacy may provide reassurance to research subjects. Judin also stressed that confidential computing can mitigate security vulnerabilities that often exist when research data is stored on premises and not in the cloud.
Judin also elaborated on federated learning, which allows controllers to train and refine models on larger, distributed datasets – for instance, to check their data processing systems for bias – without centralizing the underlying data. He explained that federated learning lets an AI model be trained within users’ devices, and gave the example of Google’s Gboard keyboard, which enables the company to predict what individuals want to type without the data leaving their device.
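The federated learning approach Judin described can be sketched as a minimal federated averaging loop. This is a toy, self-contained illustration (a one-parameter linear model and hypothetical function names), not Google’s actual implementation: each “device” fits the model on its own private data and shares only the updated parameter with the server, which averages the updates into a global model.

```python
def local_update(weights, data, lr=0.1, epochs=5):
    """One client: gradient steps for a 1-D linear model y = w*x on local data."""
    w = weights
    for _ in range(epochs):
        # Mean gradient of squared error over this client's private points
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w  # only the updated parameter leaves the device

def federated_round(global_w, client_datasets):
    """Server: collect each client's local update and average them (FedAvg)."""
    updates = [local_update(global_w, data) for data in client_datasets]
    return sum(updates) / len(updates)

# Two clients whose raw (x, y) data is never pooled centrally;
# both datasets roughly follow y = 2x
clients = [
    [(1.0, 2.1), (2.0, 3.9)],
    [(1.5, 3.0), (3.0, 6.2)],
]
w = 0.0
for _ in range(20):
    w = federated_round(w, clients)
print(round(w, 2))  # converges near the shared slope of about 2
```

The key property is visible in the data flow: `federated_round` only ever sees model parameters, never the `(x, y)` pairs held by each client.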
Another example is how the Norwegian DPA advised banks within its regulatory sandbox for responsible AI to cooperate when training their money-laundering detection algorithms. As banks do not normally have enough ‘suspicious’ customers to train their detection algorithms, they tend to be overzealous, which leads to false positives and data protection issues. However, the DPA noted that banks could cooperate in the development of a more effective algorithm without sharing raw data about their customers by using differential privacy, as long as they prevented model inversion attacks. The DPA also conceded that banks needed to tweak the model and the underlying training and input data as they went along to ensure the algorithm’s effectiveness, which should reassure diligent AI developers against the risk of fines.
Lastly, Vitelio Ruiz Bernal stressed the importance of helping businesses achieve security standards that support compliance with data protection law. In this respect, he mentioned the INAI’s data protection laboratory, which is dedicated to analyzing mobile and web applications that operate as black boxes. The INAI has found that processors assisting controllers in those contexts are often under-resourced and reluctant to use PETs due to their perceived high costs. Bernal revealed that the INAI is currently looking for public-private collaborations to develop accessible PETs and to issue guidelines on specific PETs (e.g., encryption), drawing inspiration from the work of the Berlin Group on the matter. Given Mexico’s specific legal requirements for cloud service security, Bernal suggested that PETs could boost the uptake of cloud services by increasing trust among stakeholders.
The view of practitioners: a call for regulatory clarity and predictability
To frame the second panel of the Side Event, Jules Polonetsky reflected on the privacy community’s eagerness to learn how industry privacy leaders are integrating PETs into their compliance strategies, including their successes and setbacks. He also asked the panelists what actions they would like to see from regulators and policymakers to promote the uptake of PETs.
Anna Zeiter revealed that eBay has had meetings with its lead DPA in Germany about how PETs could help them comply with the Court of Justice of the European Union (CJEU)’s Schrems II ruling on international data transfers, in particular on the implementation of supplemental measures in accordance with the European Data Protection Board (EDPB)’s guidance. In that context, the DPA focused on measures such as tokenization and encryption (in transit and at rest).
Zeiter highlighted the UK Information Commissioner’s Office (ICO)’s PETs guidance, and said it constituted an opportunity for other regulators to evaluate where they stand on the matter. She also called for global alignment among DPAs, because companies will implement PETs across very different jurisdictions. Zeiter argued that, for companies to know whether they should invest in PETs, regulators need to give them reassurance, for example in the form of some sort of PET ‘whitelist’ for particular contexts of application. Additionally, Zeiter underlined that companies that develop and use PETs, and their DPOs, have a role in educating regulators, a point echoed by a DPA official in the room.
Emerald De Leeuw-Goggin explained that Logitech offers PETs ‘as a service’ to its internal teams of software developers. According to the speaker, this involved making PETs more accessible and scalable within the wider decentralized organization, developing privacy engineering capabilities, and securing buy-in from Chief Technology Officers. De Leeuw-Goggin noted that PETs are still not mainstream enough for an SME owner to feel confident investing in and implementing them, partly due to the existing skills gap in the field. As PETs become mainstream, they will also become more understandable and usable across sectors and company sizes.
Barbara Cosgrove stated that B2B companies like Workday tend to receive questions from their customers on how best to implement PETs into their software solutions. This includes masking or pseudonymizing data, or limiting employee access to data. Sometimes more sophisticated measures – like differential privacy – could be adequate, but companies are reluctant to invest resources in them absent regulatory clarity, particularly on de-identification. Cosgrove agreed that businesses and regulators need to work together to develop use cases and standards that would increase legal certainty around the effective use of PETs. Co-regulatory solutions like Codes of Conduct could help demonstrate that PETs are used in a compliant manner.
Finally, Geff Brown highlighted how differential privacy has become usable in multiple applications, allowing providers to process aggregated telemetry data at scale for analytics. Microsoft is using the technique to improve its Natural Language Processing models, including text and speech prediction. In that context, differential privacy allows companies to demonstrate the accuracy of a model without compromising individuals’ privacy. Brown argued that tech-savvy companies need to better explain PETs to consumers and corporate customers, but that standardization efforts and favorable DPA positions can also help. He called for an EDPB update to the 2014 guidance on anonymization, and for regulators to carry out PET testing and share the results with the public, thereby increasing knowledge of and trust in the technologies.
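As a rough illustration of the technique Brown described, the basic building block of differential privacy is the Laplace mechanism: noise calibrated to a query’s sensitivity is added to an aggregate statistic before it leaves the system, so that no single user’s contribution can be reliably inferred. The sketch below is a minimal, self-contained example with illustrative names, not taken from any Microsoft product:

```python
import math
import random

def dp_count(values, predicate, epsilon):
    """Differentially private count via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one user changes
    the result by at most 1), so Laplace noise with scale 1/epsilon yields
    epsilon-differential privacy for this query.
    """
    true_count = sum(1 for v in values if predicate(v))
    # Inverse-transform sample from Laplace(0, 1/epsilon)
    u = random.random() - 0.5
    noise = -(1.0 / epsilon) * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))
    return true_count + noise

random.seed(0)  # fixed seed so the sketch is reproducible
telemetry = [12, 30, 45, 7, 52, 61, 28]  # one metric value per user
noisy = dp_count(telemetry, lambda v: v > 25, epsilon=1.0)
print(noisy)  # close to the true count of 5, perturbed by calibrated noise
```

Smaller values of `epsilon` mean larger noise and stronger privacy; the analyst trades accuracy of the aggregate for protection of individual contributions.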
Further reading:
- A Visual Guide to Practical Data De-Identification (FPF, 2016)
- Privacy 2020: 10 Privacy Risks and 10 Privacy Enhancing Technologies to Watch in the Next Decade (FPF, 2020)
- FPF Files Comments on White House Office of Science and Technology Policy Actions to Advance Privacy-Enhancing Technologies (FPF, 2022)