CPDP2021 Event Recap: Bridging the Standards Gap
On January 27, 2021, the Institute of Electrical and Electronics Engineers (IEEE) hosted a panel at the 14th Annual International Computers, Privacy, and Data Protection Conference (CPDP2021). The theme for this year’s online conference was “Enforcing Rights in a Changing World.” Rob van Eijk, FPF Managing Director for Europe, moderated the panel “Technical Standards Bringing Together Data Protection with Telecommunications Regulation, Digital Regulations, and Procurement.”
The recording of this panel is available here.
The panelists discussed the role of technical standards in ensuring the systematic application of data protection principles across policy areas. Two examples are Article 25 and Recital 78 of the GDPR, which stipulate data protection by design and by default. Another example is Article 21(5) GDPR, which stipulates that, in the context of the use of information society services and notwithstanding Directive 2002/58/EC, the data subject may exercise his or her right to object by automated means using technical specifications.
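As a concrete illustration of "automated means using technical specifications," browsers can already send machine-readable preference signals such as the W3C "DNT" header or the Global Privacy Control "Sec-GPC" header. The minimal sketch below shows how a server might honor such a signal; the header names are real specifications, but the handler functions are hypothetical and not prescribed by the GDPR.

```python
def objects_to_processing(headers: dict) -> bool:
    """Return True if the request carries an automated objection signal."""
    # Normalize header names for a case-insensitive lookup.
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    # "DNT: 1" (Do Not Track) and "Sec-GPC: 1" (Global Privacy Control)
    # both express an objection in a machine-readable way.
    return normalized.get("dnt") == "1" or normalized.get("sec-gpc") == "1"


def handle_request(headers: dict) -> str:
    """Hypothetical handler that respects an automated objection."""
    if objects_to_processing(headers):
        # Respect the objection: skip profiling/tracking for this request.
        return "served without tracking"
    return "served with default processing"
```

The point of the sketch is that once such a signal is standardized, honoring it becomes a few lines of code rather than a manual, per-user process.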
Technical standards seek to ensure that engineers can effectively apply privacy and data protection principles in the design and the (default) configuration of technology placed on the market. A key question for the panel discussion was therefore: how can the gap between releasing technical standards and embedding them into products and services on the market be bridged?
Facilitating Dialogue and Collaboration Between Policymakers and Technologists
Paul Nemitz, Principal Advisor in the Directorate-General for Justice and Consumers at the European Commission, opened the discussion with a pointed observation. He argued that building a closer, collaborative relationship between engineers and policymakers is a critical step toward bridging the gap between data protection policies, technical standards, and practical application. He called on the technical intelligentsia to bring their knowledge to the policymaking process, stating that democracy needs the engagement of engineers. Paul added that policymakers should hold out their hands and open their doors to bring in those who know the technology. He identified convincing those with technical knowledge to also engage in the formal lawmaking process as one of the challenges of our time.
He clarified that he was not claiming engineers know how to make policies better than policymakers. Instead, he defined specific roles for each group based on their areas of expertise and working languages. The technologists' part is to inform and educate policymakers so that they can shape laws that are technology-neutral and do not need to be updated every two months. According to Paul, while engineers are adept at writing code, code is written for machines that cannot think for themselves. The law, by contrast, is written for people, not computers, and policymakers are best equipped for that language. Paul also sees the relationship as a two-way street: technologists can bring their technological knowledge to the rulemaking process and then return to their own spaces with a better understanding of democracy.
Clara Neppel, Senior Director of IEEE European Business Operations, shared that all IEEE standardization bodies have a government engagement program where governments can inform the IEEE standardization organization about their policy priorities. That allows software engineers to take into account upcoming legislative measures.
Amelia Andersdotter, Dataskydd.net and Member of the European Parliament 2011-2014, stressed the importance of ensuring that standards – once set – are transparent and effectively communicated to everyone in the ecosystem, from procurement bodies to end-users. For instance, when the public sector seeks to build a public WiFi network or a company designs a website, those procuring the products should be aware of the frameworks in place that protect human rights and data, as well as the right questions to ask of those further up the value chain (see also the FPF & Dataskydd.net Webinar – Privacy in High-Density Crowd Contexts). The IEEE P802E working group has been drafting Recommended Practices for Privacy Considerations for IEEE 802 Technologies (P802E), which contains recommendations and checklists for developers of IEEE 802 technologies.
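One well-known privacy technique in the IEEE 802 space is MAC address randomization, which prevents a device's burned-in hardware address from acting as a persistent tracking identifier, for example on public WiFi networks. The sketch below is purely illustrative (it is not taken from P802E itself) and shows how a random, locally administered unicast MAC address is formed under IEEE 802 addressing conventions.

```python
import random


def random_mac() -> str:
    """Generate a random, locally administered, unicast MAC address.

    Per IEEE 802 addressing conventions, bit 1 of the first octet marks
    a locally administered address (not a burned-in hardware identifier),
    and bit 0 must be clear for a unicast address.
    """
    octets = [random.randint(0, 255) for _ in range(6)]
    # Set the locally-administered bit (0x02), clear the multicast bit (0x01).
    octets[0] = (octets[0] | 0x02) & 0xFE
    return ":".join(f"{o:02x}" for o in octets)
```

A device rotating such addresses cannot be trivially re-identified across networks by its MAC alone, which is the kind of design-level consideration P802E-style recommended practices ask developers to weigh.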
Panelists left to right, top row: Rob van Eijk, Paul Nemitz, Clara Neppel; bottom row: Amelia Andersdotter, Mikuláš Peksa. We remark that Francesca Bria (President of the Italian National Innovation Fund) contributed to the preparation of the panel discussion.
Privacy Versus Utility
According to Clara, key challenges faced when implementing privacy include the lack of common definitions, the tradeoff between privacy and functionality of the products or services, and reaching the right balance between social values embedded in data governance and technical measures. Clara pointed out that privacy can have different meanings for different parties. For example, a user may understand privacy as meaning no data is shared, only anonymous data is shared, or being able to define different access levels for different types of data. On the other hand, companies may interpret privacy as solely being compliant with relevant laws or as a true value proposition for their customers.
Amelia also identified the lack of specification of data protection principles or requirements in legislation as a source of confusion and inefficiency for engineers and companies. She cited the case of the European Radio Equipment and Telecommunications Terminal Equipment Directive (R&TTE), adopted in 1999, which includes data protection as an essential requirement for radio technologies. However, what this requirement actually means has never been specified, leaving companies unable to assess whether their technologies meet it. She expressed that Europe is good at establishing both high-level values and low-level practical architectures, but could improve if regulators filled the gap between the two and created a framework for assessing privacy.
Amelia also suggested that the diversity and flexibility of technical standards may encourage more experimentation and optimization as organizations choose which standards work best for their specific technologies and in their specific contexts.
Rob added that technologists need bounded concepts with definitions embedded in law, so they are clear and no longer open-ended. Then, the real work of figuring out how the technology should be designed to protect privacy can take place.
Rulemaking: Looking to the Future Rather Than the Rear-View Mirror
Mikuláš Peksa, Member of the European Parliament, expressed skepticism about the European Parliament micromanaging technical standards. Instead, he leaned towards Parliament conveying the basic values that protect human rights without shaping the standards themselves.
Mikuláš contended that politicians tend to talk about past problems and that, due to structural and procedural factors, politics always reacts with a certain delay. He stated that human society is in another Industrial Revolution in which all human activities are changing, and that society needs to adapt to absorb and digest those changes. He referred to the example of online spaces for conversation being monopolized by a single company, questioned whether this is the model society really wants, and suggested exploring a decentralized architecture for social networks, like that of email services, to protect against centralized control.
Paul explained that data protection faces the classic syndrome of an 'invisible risk': people are unable to see themselves as victims. Where people fail to see the 'invisible risk', as with atomic power, smoking, or data processing, politics arrives far too late. Looking forward, Mikuláš acknowledged that there is no single silver bullet to solve all of the problems. Therefore, Europe needs a political organization that incorporates the technical intelligentsia and finds a desirable general direction to drive towards.
In a tour-de-table, the panelists presented their answers and next steps for bridging the gap between releasing technical standards and embedding them into products and services available on the market.
Clara remarked that we may need policymakers to engage in technical considerations such as standard setting, as well as to provide legal certainty by clarifying questions around data control and correction. We may also need new standards, certifications, and good mechanisms to address special needs in specific contexts. For instance, for synthetic data, which is estimated to account for 40 percent of all training data within two years, we still lack metrics for measuring quality in terms of privacy and accuracy.
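One candidate privacy metric sometimes discussed for synthetic data is the distance to the closest real record: synthetic rows that sit very close to real rows may be near-copies and thus leak information. The sketch below is a hypothetical illustration of that idea, not a metric endorsed by the panel.

```python
def distance_to_closest_record(synthetic, real):
    """For each synthetic row, the Euclidean distance to its nearest real row.

    Very small distances flag synthetic records that may effectively
    reproduce real individuals' data. Rows are numeric tuples here.
    """
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

    return [min(dist(s, r) for r in real) for s in synthetic]
```

A complete quality assessment would pair a privacy measure like this with accuracy measures (how well the synthetic data preserves the statistical properties of the original), which is exactly the kind of standardized metric Clara noted is still missing.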
Amelia argued that, in addition to the work done by consumer protection authorities, data protection authorities, and telecommunications regulators, one could also simply take the initiative within a company or a community. There are already industry efforts to make technologies more privacy-sensitive and robust. Rome wasn't built in a day, and she doesn't think our infrastructures will be either, but if we keep taking small steps in the right direction, we will be in a better place in two or three years than we are now.
Mikuláš stated that there is no one silver bullet that will resolve all of the problems. He argued that we need a political organization that will incorporate technical intelligentsia, an organization that will be able to steer the general direction of building this new information society. Without it, we will not be able to introduce the standards in a proper way.
In closing, Paul called upon people who deeply understand the intricacies of technology to engage with the democratic legislative process of law making. In this way, they may be able to not only bring technological knowledge to the rulemaking process but also incorporate a better understanding of the realities of democracy into code.
FPF contributions to the privacy debate at CPDP2021
This year, FPF contributed to seven panels. The FPF team participated in the following panels (in alphabetical order):
– Dr. Carrie Klein (panelist), Ready for a crisis: accelerated digitalization in education (organized by VUB Data Protection on the Ground);
– Dr. Gabriela Zanfir-Fortuna (moderator), US privacy law: the beginning of a new era (organized by FPF);
– Dr. Gabriela Zanfir-Fortuna (panelist), Augmented compliance: the case of Algorithmic Impact Assessment (organized by EPIC);
– Dr. Gabriela Zanfir-Fortuna (panelist), International Data Transfers: What shall we do to avoid Schrems III? (organized by NOYB);
– Jasmine Park (moderator) & Amelia Vance (panelist), Panel on Global Youth Privacy: Amplifying youth needs and voices (organized by Privacy Salon);
– Jules Polonetsky (moderator), EU Digital Strategy, a holistic vision for a digital Europe (organized by CPDP);
– Dr. Rob van Eijk (moderator), Technical standards bringing together data protection with telecommunications regulation, digital regulations and procurement (organized by IEEE).
To learn more about FPF in Europe, please visit fpf.org/about/eu.