FPF Submits Comments to United Nations Ahead of 2021 Special Report on Child Privacy

Last week, the Future of Privacy Forum (FPF) submitted comments to the United Nations Office of the High Commissioner for Human Rights Special Rapporteur on the right to privacy to inform the Special Rapporteur’s upcoming report on the privacy rights of children. 

The Special Rapporteur’s report, expected in March 2021, will focus on how privacy affects the evolving capacity of the child and the growth of autonomy, and what factors enhance or constrain this development.

FPF’s comments encourage the Special Rapporteur to consider two key points in the development of their report:

  1. How child privacy legislation can and should react to actual harms, not unsubstantiated fears, in order to avoid unintended consequences that may impact the rights of children to benefit from and participate in the online ecosystem; and 
  2. How child privacy policies must consider and balance the competing and evolving interests of children and authority figures such as parents or teachers, and recognize the need to foster resilience and autonomy in children by helping them develop digital skills.

Additionally, FPF’s comments suggest that the Special Rapporteur’s report include a discussion on the need for schools, districts, and their third-party vendors to be transparent about data and technology use, storage, analysis, and purpose with children, parents, and other relevant stakeholders. 

Transparency around data and technology use has become particularly urgent in recent months as millions of children around the world shifted to some form of online or distance education when the COVID-19 pandemic closed many school buildings in early 2020. Since that time, FPF has developed and compiled student privacy and COVID-19-related resources for school leaders, policymakers, teachers, and students and their families on its student privacy-focused website, StudentPrivacyCompass.org.

By sharing our expertise and insight on the US student privacy landscape and its long history of unintended consequences, the importance of balancing the interests of children with those of authority figures, and the critical need for fostering an environment of transparency and trust, FPF hopes to help inform a thorough and thoughtful report on the privacy rights of children by the Special Rapporteur next spring. We look forward to discussing these recommendations and others with child privacy stakeholders in the U.S. and around the world in the coming months. 


Chelsey Colbert Discusses Trends in Mobility & Location Data

Chelsey Colbert, Policy Counsel, leads FPF’s mobility and location data portfolio. Prior to joining FPF, Chelsey was a lawyer at an international business law firm in Canada and was seconded as in-house privacy and data governance counsel to Sidewalk Labs, an Alphabet company that designs and builds urban innovations. Chelsey holds a J.D. with a major in technology law and policy from the University of Ottawa.

Can you tell us about your career and what led you to FPF?

While I was in law school, one of my professors, the late Ian Kerr, sparked my interest in AI and robotics, and that interest has shaped my career path ever since. After law school, I worked as in-house counsel for Sidewalk Labs while on secondment from a law firm. I’m really passionate about cutting-edge technologies that blend the digital and physical worlds, and there’s no better or more exciting space than mobility and smart cities for me to exercise that interest. I’m really optimistic about the potential for robots and automated vehicles to positively shape our future, particularly in terms of saving lives, increasing efficiencies, and providing humans with conveniences. At the same time, I recognize that there can be ethical and privacy harms to individuals and society through the irresponsible use of technologies. I believe there are ways to mitigate risks and reduce or eliminate harms so that we can realize as many benefits as possible from technologies like automated vehicles and digital micromobility services, such as bike- or scooter-sharing.

FPF has been a really great fit for me because I get to be deeply involved in some of the most exciting privacy and data-related challenges of our time. I enjoy getting to work with a variety of stakeholders, including government, industry, and academia. My work at FPF presents me with a unique opportunity to help shape our mobility future to be more equitable, accessible, safe, and affordable. I believe that the multitude of mobility options – from connected and autonomous vehicles (CAVs) to e-scooters to delivery robots – presents many benefits and implications for society. My portfolio at FPF covers the entire range of privacy and data topics in the mobility space, and it’s rare for a day to go by without something newsworthy coming out.

What attracted you to working on aspects of law related to AI, privacy, and mobility?

AI and privacy as a field is both niche and extremely broad. I’ve become interested in AI and privacy with respect to mobility specifically because the technical and regulatory challenges of autonomous vehicles are fascinating and raise a lot of questions related to privacy and data protection. I’ve also found that mobility data is quite varied in nature and, because of that, raises really complex and contextual privacy challenges. The mobility ecosystem requires collaboration and input from the various levels of government, communities, companies, and international partners. Working with such a wide variety of stakeholders has been a highlight of my time at FPF.

What are the hot-button issues that you’re working on related to mobility?

Currently, I’m working with FPF stakeholders on a report that will outline privacy-by-design goalposts for connected and autonomous vehicles. The report will take a deep dive into connected and autonomous vehicle technologies that are essential for the safe development of these cars, including optical sensors, computer vision, and geolocation and HD mapping. I believe that just as safety and security should be built into the design of cars, so should privacy.

Mobility data sharing is another fascinating and dynamic area. One example is the sharing of micromobility data between companies and cities through the Mobility Data Specification (MDS). MDS is a set of open-source APIs that allow data from a vehicle to be shared – in near real time – with city governments, typically the department of transportation (DOT). A big part of my work in this space is understanding the privacy implications of different types of mobility data, which often include personal data and location data. There are interesting questions about the temporal nature of mobility data, whether mobility data can be used to identify individuals, and the sharing of data between the private sector and the public sector, where each sector often has different obligations and responsibilities. Another fascinating development is the Right to Repair movement, which illustrates the potential conflicts over access to and control of vehicle data.
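To make that data flow concrete, here is a minimal sketch of how a city client might poll an MDS-style provider endpoint. The URL, token, endpoint path, and response fields below are illustrative assumptions, not a verbatim rendering of the MDS specification:

```python
# Minimal sketch of a city (DOT) client polling an MDS-style provider API.
# The endpoint, auth scheme, and field names are assumptions for illustration;
# consult the actual MDS spec for the real interface.
import requests

PROVIDER_URL = "https://mds.example-provider.com/trips"  # hypothetical provider
API_TOKEN = "city-dot-token"  # issued to the city under a data-sharing agreement

def fetch_trips(hour_bucket):
    """Fetch completed trips for a one-hour window, e.g. '2020-09-30T14'."""
    resp = requests.get(
        PROVIDER_URL,
        headers={"Authorization": "Bearer " + API_TOKEN},
        params={"end_time": hour_bucket},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("data", {}).get("trips", [])

# Even trip records without names carry privacy risk: origin/destination
# pairs plus timestamps can often re-identify an individual rider.
for trip in fetch_trips("2020-09-30T14"):
    print(trip.get("trip_id"), trip.get("trip_distance"))
```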

What mobility trends will we all be talking about in the next couple of years?

We’re seeing increased interest in the space from regulators across the globe, particularly in the larger markets, with a focus on safety regulations and standards. In the United States, for example, the National Highway Traffic Safety Administration recently announced an online, centralized platform for cities and companies to voluntarily post updates about their automated vehicle testing. The SELF DRIVE Act, which stalled in 2017, was recently reintroduced; it would create a federal framework for the regulation of autonomous vehicles (AVs) and includes a section on privacy policies. We may see this or a new AV bill become law in 2021, which would have implications for privacy and data protection.

There is also a lot of policy and legislative action in state and local governments in terms of mobility data sharing, access to mobility data, and general privacy laws that impact CAVs. In Europe, the European Data Protection Board is expected to intensify its work in this area and recently released draft guidelines on connected cars. We’re also seeing advanced driver-assistance technologies and driver monitoring systems become required in car safety regulations and standards, which has implications for privacy and data protection. Regulations and standards are important from a technical perspective, because safety standards often drive the development of new technology. When manufacturers develop the technology to reach those standards, it’s important that companies (and governments) implement privacy by design in the technologies and in their organizational practices.

The volume of data coming from connected cars and other forms of mobility will increase, as will its value. There will be more opportunities to monetize this data, but it’s a really complex ecosystem with many players, all of which need to be strategic and thoughtful about privacy. We’re seeing the industry continue to move away from siloed operations, toward more collaboration, consolidation, and partnership in the mobility space as stakeholders – including car manufacturers, mapping companies, driverless technology companies and others – recognize the importance of collaboration to unlock the full potential of mobility data. This all makes for a really complex ecosystem, which means there is a lot for businesses, consumers, and policymakers to navigate. Some of the data is relatively benign from a privacy perspective, while other data is very sensitive, and some data could become personal and sensitive depending on the context, such as location data. Companies and governments collecting mobility data must ensure that their staff have a deep understanding of the privacy implications of new technologies and data types. Privacy-by-design and a cross-functional approach are some of the best ways to address these complex, multifaceted privacy issues.

I’m also expecting to see more public-private partnerships in this space. The private sector and all levels of government must have a relationship with each other to ensure that there is a constructive dialogue on privacy, safety, and technical standards. Municipal governments are an important part of the ecosystem, in addition to state and federal governments. Mobility data provides many social and monetary benefits and can also be used or misused in ways that have unintended harmful consequences for individuals and society. Companies and governments using and benefitting from mobility data should both be held to higher standards of privacy and data protection. The public should also have a voice in the policymaking and innovation process. Better transparency about how technology is being developed and the mitigation efforts to reduce harms could help improve consumer trust and the adoption of newer technologies such as delivery robots and automated driving features.

I expect the next several years in mobility law and policy to be very interesting and challenging. I consider myself to be very lucky to work in such a dynamic area of technology law and policy and am hopeful for a future with more (human friendly) robots in it.

FPF & Dataskydd.net Webinar – Privacy in High Density Crowd Contexts

Authors: Hunter Dorwart and Rob van Eijk

On 30 September 2020, Future of Privacy Forum (FPF) and Dataskydd.net jointly organized the webinar ‘Privacy in High Density Crowd Contexts’. A key aspect of the webinar was the role of industry-driven privacy standards in the development and deployment of privacy-friendly technologies for crowd management, mobile connectivity, and smart city services.

Keywords: IEEE P802E Recommendations, Privacy by Design, Certification, Standardization

Speakers (in alphabetical order)

The recording is available here.

Privacy-Preserving Technologies and Standards Development

Many industry-driven standards development organizations (SDOs) have been working to improve the general privacy qualities of both network and web infrastructures. Generally speaking, these bodies focus on how industry standards can remedy design flaws in technologies in order to facilitate transparency and foreseeability, and to enable consumers to opt out and make informed decisions about how they interact with technical infrastructures.

Some of these standards have been quite successful and continue to show a lot of promise for addressing emerging privacy issues in technology. For instance, the Internet Engineering Task Force (IETF), which is responsible for the bulk of what consumers see in a web browser or in an email, considers privacy issues when developing Internet standards through, among other things, RFC 6973, Privacy Considerations for Internet Protocols. In addition, many technical bodies are attempting to introduce encryption and minimization standards to ensure that network protocols do not generate identifiers beyond what is necessary for the network to function.

As the world becomes more digitized, SDOs will continue to develop privacy-preserving standards. However, challenges are beginning to emerge regarding the widespread adoption of these standards and how standards-setting bodies such as the Internet Engineering Task Force (IETF), the Institute of Electrical and Electronics Engineers (IEEE), the 3rd Generation Partnership Project (3GPP), and the World Wide Web Consortium (W3C) interface with EU-level policymaking. As it is, there is a real risk that the underlying policy goals of governments may conflict with future standards setting, and that a lack of coordination across SDOs and between governments may hinder consistency and interoperability.

Striking a balance between protocols that need some type of identifier to function and the level of privacy exposure that might result has created difficulties for embedding privacy within standards. In order to enable seamless communication between routers and devices, the protocols that govern access points must track identifiers unique to each device. Because devices are often linked to an individual user, the network protocols that facilitate communication invariably expose data about these users. The IEEE navigates this trade-off by providing technical solutions that help developers make this seamless flow of communication and data sharing more aligned with public policy goals.

IEEE Recommended Practices for Privacy Considerations for IEEE 802 Technologies

To this end, the IEEE P802E working group has been drafting Recommended Practices for Privacy Considerations for IEEE 802 Technologies (P802E). P802E contains recommendations and checklists for developers of IEEE 802 technologies (Figure 1). The approach builds on the Section 7 questionnaire of RFC 6973, Privacy Considerations for Internet Protocols, adapted to the IEEE 802 environment, and considers harms and risks to privacy when developing network protocols.

Figure 1 – Overview of P802E applications (slide from the presentation by Jerome Henry)

The purpose of the P802E recommendation is ‘to promote a consistent approach by IEEE 802 protocol developers to mitigate privacy threats identified in the specified privacy threat model and provide a privacy guideline.’  In order to strike the right balance between functionality and privacy, the IEEE focuses on the context of device use. For instance, personal devices make it easier to identify the user through network traffic routing while shared devices generally do not. The rubric for developing standards therefore changes depending on how users will interface with the device.

IEEE 802 LAN standards specify the operation of media access control (MAC) methods and protocols that support frame-based network communication. MAC procedures and various protocol frame formats and fields can be used to identify personal devices, their attributes, and their use to support specific networking applications and activities. An adversary can use this information to obtain (location) information about an individual. Other possible threats in IEEE 802 LAN standards include, for example, flow identifiers, optional fields in the standard, network discovery flows and patterns, ranging exchanges, authentication flows, directed queries, frame timing, and frame structure. Figure 2 illustrates the threat of location services systematically tracking (mobile) access points.

Figure 2 – Example of the threat of location services systematically tracking (mobile) access points (slide from the presentation by Marit Hansen).
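To make the tracking threat concrete, here is a small illustrative sketch of our own (it is not drawn from the webinar slides): two sensors that observe the same stable MAC address can link their sightings into a movement trace.

```python
# Illustrative sketch: a stable MAC address acts as a tracking identifier.
# Sensor names, MAC addresses, and times are invented for this example.
from collections import defaultdict

sightings = [
    {"sensor": "station_entrance", "mac": "a4:5e:60:12:34:56", "time": "08:02"},
    {"sensor": "platform_3",       "mac": "a4:5e:60:12:34:56", "time": "08:31"},
    {"sensor": "station_entrance", "mac": "0c:8b:fd:aa:bb:cc", "time": "08:05"},
]

# Group sightings by MAC: each stable MAC now maps to a movement trace.
traces = defaultdict(list)
for s in sightings:
    traces[s["mac"]].append((s["time"], s["sensor"]))

for mac, trace in traces.items():
    print(mac, sorted(trace))

# MAC randomization, of the kind privacy recommendations such as P802E
# encourage, breaks this linkage by presenting different MACs to observers.
```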

Speed to Market 

As standards evolve, policymakers and industry must work together to shorten the time to market. This includes facilitating the adoption of privacy by design and establishing appropriate benchmarks for matching regulatory compliance with technological design. As it is, one major challenge facing public-private partnerships is how fast the technological landscape changes. Putting in place the infrastructure to adapt standards to novel privacy challenges becomes difficult as more IoT devices proliferate throughout the economy and make tracking scenarios much larger than before.

Standardization and Certification

Many stakeholders in the broadband industry see the importance of implementing privacy policies throughout their services and recognize the demand from policymakers. Like standards bodies, network operators must also strike a balance between functionality and privacy-preserving practices. But this doesn’t have to be a zero-sum tradeoff. Organizations can push for policies that establish a compliance baseline that industry must meet. In turn, this enables standards bodies to have a better sense of how to build trust chains for authenticated communications in particular environments.

To this end, technical standardization is an important and valid tool for building compliance and achieving common knowledge and common understanding around emerging technologies. Furthermore, in the EU, codes of conduct and certification (Articles 40-43 GDPR) complement the toolkit. However, even though policymakers can establish benchmarks through regulations, it is still necessary to engage with industry and technical bodies in order to clarify ambiguities in the law. While standards can serve as benchmarks for DPAs and other relevant authorities, there needs to be a mechanism to ensure transparency in the auditing process and to validate that market players are really respecting the standard.

Lessons Learned from Smart Cities and Transportation

Current discussions around automotive and transportation use-cases illustrate some of these lessons. When designing automotive technology to connect cars to smart devices, engineers have had to adjust to changing regulatory and market environments. Back in the mid-2000s, car designers relied on device identifiers to connect mobile phones to cars. But now that the environment has changed, engineers have come up with new ways to ensure connectivity without having to rely on identifying a particular user to a device.

As standards around identifiers continue to change, so too will the technology in automobiles. Widespread market adoption of new technologies is not just a technical challenge but also an industry challenge. Industry must be ready to embrace new technical features, which means the question becomes one of measuring the demand needed to change industry behavior.

Policymakers can contribute to this by implementing regulatory initiatives and engaging with industry and technical bodies. For instance, in the EU, any processing of data around Wi-Fi and device identifiers needs to comply with the GDPR. In part, this baseline establishes a legal basis for smart cities to process mobility data and can serve as a target for standards development bodies in their activities.

For instance, Dutch Railways faced issues around data processing for crowd management purposes. While facilitating public transportation qualifies as a legitimate interest to process data under the GDPR, railway operators must check whether the way they process crowd data is really the best option and valid under the circumstances. Dutch Railways could have enabled both Wi-Fi and Bluetooth tracking, but chose a method that was less privacy-intrusive than the alternatives: it uses Wi-Fi tracking only in locations where it is absolutely necessary and traces commuters’ travel patterns with a method that masks passengers’ identities.

Dutch Railways does this by hashing the Wi-Fi MAC address of the traveler’s phone twice through sensors in the railway station and sending the result to a central server. Over the course of the commute, subsequent sensors send hashed information to the same server, which allows the company to match the data and generate a mobility pattern while concealing the identity of the commuter. The data is available for only one day before it is deleted, which minimizes the risk of aggregating large data sets over time.
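As a rough illustration of this approach, the sketch below pseudonymizes a MAC address with two rounds of salted hashing. The choice of hash function (SHA-256) and the daily rotating salt are our assumptions for the example; the source does not specify Dutch Railways’ actual implementation.

```python
# A minimal sketch of double-hashing a Wi-Fi MAC address so sensors never
# forward the raw MAC. SHA-256 and the rotating daily salt are illustrative
# assumptions, not Dutch Railways' confirmed implementation.
import hashlib

DAILY_SALT = b"rotates-every-24h"  # discarding the salt daily limits linkability

def pseudonymize(mac):
    """Hash the MAC twice with a salt; return the pseudonym sent to the server."""
    first = hashlib.sha256(DAILY_SALT + mac.encode()).digest()
    return hashlib.sha256(DAILY_SALT + first).hexdigest()

# Every sensor applies the same salted hash, so the central server can match
# sightings of one device into a travel pattern without learning the MAC.
entrance = pseudonymize("a4:5e:60:12:34:56")  # sensor at the station entrance
platform = pseudonymize("a4:5e:60:12:34:56")  # sensor at the platform
assert entrance == platform  # same pseudonym -> one commuter's pattern
```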

In addition, in the context of smart cities, it’s also important for policymakers and city coordinators to communicate effectively with the public. For instance, while there isn’t a clear legal standard for processing mobility data in the United States, public backlash about Wi-Fi tracking throughout cities serves as an incentive not only for policymakers to adopt more privacy-preserving tracking solutions but also for industry to enable those solutions through product design. Organizations therefore need to engage with the public in a meaningful way to solve these problems.

Figure 3 – Recommendations aimed at the use of MAC addresses in the context of Wi-Fi tracking (slide from the presentation by Marit Hansen).

 

Takeaways from the Panel Discussion

P802E defines a framework – a recommended practice for privacy considerations when designing or evaluating an IEEE 802 standard. P802E does not provide a standard-specific set of rules or recommendations. The goal of the framework is to encourage designing for privacy, limiting (unauthorized) personal information exposure as much as possible. P802E can be used beyond 802 technologies to inform how privacy may be affected by networking communications in the Data Link Layer (also known as the L2 layer), the second level in the seven-layer OSI reference model for network protocol design.

Different contexts (e.g., a hospital, public Wi-Fi tracking, an airport, a smart city transportation platform) imply different privacy requirements, which means that stakeholders need to communicate the privacy policies and expectations for each vertical transparently. Mobile users expect a harmonized experience across contexts, which creates challenges for public sector actors, private industry, certification bodies, and technical standards bodies to strike the right balance between functionality and privacy.

When developing standards, SDOs should explore how such standards fit within regulatory regimes and their specific rules and definitions of personal data. Policymakers, in turn, must search for synergies to promote compliance with various legislative and regulatory rules and engage with standards bodies to ensure that stakeholders are on the same page. Such engagement should reflect a broad, interconnected way of thinking and routinely employ use cases to find the appropriate technical solution. To this end, it is also important that organizations involved in these workstreams know and appreciate the same terminology and frameworks.

Following such a path could help shorten the time to market and ensure that technical standards and certification schemes are flexible and adaptive to the vast changes in technology that confront us.

 

The recording is available here.

To learn more about FPF in Europe, please visit fpf.org/eu. For information about Dataskydd.net click here.

Event Recap: Using Corporate Data for Research – Lessons from an Award-Winning Project

Last week, FPF hosted a virtual event honoring the winners of the first-ever FPF Award for Research Data Stewardship: University of California, Irvine Professor of Cognitive Sciences Mark Steyvers and Lumos Labs, represented by General Manager Bob Schafer. In addition to the awardees, the event featured Daniel L. Goroff, Vice President and Program Director at the Alfred P. Sloan Foundation, which funded the award, as well as FPF CEO Jules Polonetsky and FPF Policy Counsel Dr. Sara Jordan.

During the event, the participants discussed the importance of promoting privacy-protective data sharing collaborations between companies and academic researchers, the specifics of the award-winning project and the results of the collaboration, the steps taken to protect privacy throughout the process, advice for academics interested in working with company data, and advice for companies interested in working with academic researchers.

To learn more about the award-winning collaboration, please see the announcement and project fact sheet.

If you missed the broadcast, click the image below to watch it on YouTube:

Data Research Webinar

 

About the FPF Award for Research Data Stewardship

The first-of-its-kind award recognizes a research partnership in which a company shared data with an academic institution in a privacy-protective manner, thereby driving the use of privately held data for academic research. When privately held data is responsibly shared with academic researchers, it can support significant progress in medicine, public health, education, social science, and other fields, but that data is often unavailable due to a range of concerns, including the need to protect individual privacy.

To learn more about the award, application process, and reviewers, check out this past year’s Call for Nominations. Keep an eye out for next year’s Award for Research Data Stewardship Call for Nominations, coming this fall. You will be able to find information about nominations and other FPF projects to promote responsible sharing of corporate data for research at FPF.org/data.

FPF Comments on Draft Washington Privacy Act of 2021

Yesterday, on September 30, 2020, FPF submitted comments regarding the draft Washington Privacy Act of 2021. The draft was released on September 9, 2020, by Senator Carlyle, the Chair of the Washington State Senate Committee on Environment, Energy, and Technology (EET).

The new version closely resembles last year’s Second Substitute version of the Washington Privacy Act of 2020 (SSB 6281), with a few changes that reflect House amendments from the previous legislative session. In addition, the new draft WPA contains two new sections that would regulate the collection and use of COVID-19-related data by both public and private entities. It is anticipated that the Act will be officially introduced in Washington State at the beginning of 2021.

In the midst of a global pandemic and in the absence of a baseline federal privacy law, we commend the WA legislature’s continued commitment to privacy. We have written previously, and reiterate in these comments, that the protections in the Washington Privacy Act would be a significant achievement for US privacy legislation. However, we have observed deep polarization in legislative debates in recent years, both in Washington and elsewhere. Given the challenges our country faces, the need to reach consensus is more important than ever, and we view meaningful legal privacy protections as both necessary and achievable.

To that end, FPF’s comments identify a number of relevant points of debate between stakeholders, along with practical suggestions for how they might be addressed in the context of the WPA. These include practical recommendations regarding exemptions, including for scientific research; enforcement by the Attorney General; and the limitations of the opt-in and opt-out regimes. We are also supportive of the strong collection and use safeguards that Parts 2 and 3 of the Act would place on data collected for the purposes of addressing the COVID-19 pandemic. 

If you’d like to discuss the WPA in more detail with the FPF team, please reach out to [email protected], [email protected], or [email protected]

Rob van Eijk Discusses Trends in European Privacy Discussions

We’re talking to FPF senior policy experts about their work on important privacy issues. Today, Rob van Eijk, FPF’s Managing Director for Europe, is sharing his perspective on FPF’s EU work, differences between U.S. and EU privacy frameworks, and more.

Prior to serving in his position as Managing Director for Europe at FPF, Rob worked at the Dutch Data Protection Authority (DPA) for nearly 10 years. He represented the Dutch DPA in international meetings and as a technical expert in court. He also represented the European Data Protection Authorities, assembled as the Article 29 Working Party, in the multi-stakeholder negotiations of the World Wide Web Consortium on Do Not Track. Rob is a privacy technologist with a PhD from Leiden Law School focusing on online advertising, specifically real-time bidding.

 

Tell us about yourself – what led you to be involved in privacy and FPF? 

I first got into privacy at the end of 2009, at a time when the retention period for data from Automatic Number Plate Recognition (ANPR) cameras in the Netherlands was being actively discussed. Back then, I was doing contract work in computer forensics and project management. This was about a year after I had sold my company, BLAEU Business Intelligence BV – where I took care of the computer operations of small and medium-sized enterprises – after running it successfully for nine years.

While working as a contractor, I found out that the Dutch Data Protection Authority was looking for a technologist. Within three weeks of applying, I was part of the “Internet Team” that was charged with formal investigations into private-sector compliance with the Data Protection Directive 95/46/EC. At the Dutch DPA, my role was to lead on-site inspections and collect evidence, and to explain technology to people in the organization. I had a lot of fun in that role and stayed at the Dutch DPA for nearly 10 years.

The Dutch DPA made it possible for me to do a doctoral thesis alongside my work. Eventually, after finishing my PhD research, Jules invited me to present the results in an FPF Masterclass. It turned out that Jules and FPF were looking for someone on the ground in Europe, which led to my current role as FPF’s Managing Director for Europe.

How would you describe your role at FPF?

I am managing director of FPF operations in Europe, where we’re working to build a data protection community. Most of the work today is to ensure that everyone – particularly the Data Protection Authorities, academics, and civil society groups – understands the added value of the neutral platform that the Future of Privacy Forum provides. FPF is a non-profit membership organization with many member companies that have unique privacy questions, whether related to ad tech or to new laws and technologies being developed.

Another aspect of my role is ensuring that we are a respected voice in the media. We do a lot of media outreach and engagement – for instance, addressing questions around the implications of RFID implants for what it really means to be human. I also explain the extent to which certain laws or technologies have implications for the rights and freedoms of the individual, while being mindful of the different legislative frameworks under which companies operate. I see my role as one that is intended to guide both the Future of Privacy Forum and our member organizations through the most important privacy issues and questions, cooperating with academics and regulators, and facilitating a neutral space for interesting, topical discussions.

The legal framework for privacy in Europe is different from the U.S. framework. Could you explain some of those differences and their impacts?

In Europe, we have a human rights-based data protection framework, whereas US laws are based on the notion of informational privacy as defined by Westin. In the EU, the right to privacy is considered a real human right, and that thinking trickles down to the way that we talk about the concepts of freedom, privacy, security, and autonomy.

Informational privacy is focused on information control. In the EU, control is based, for example, on the protection of personal information and what is yours – protection of the integrity of your body, your phone, the integrity of the technology that’s in your connected car. There’s a lot of data being generated today, and some of that data can be connected in such a way that it creates a comprehensive picture of your life, which needs protection. Thus, moving from the idea that privacy is a human right, we can create clear boundaries related to the context of information that must be protected, not just in principles like necessity or proportionality, but also in a societal context. The impact of certain types of contexts varies, as do the requirements for a legal basis: certain categories of data – like health data – require a high bar for informed consent, and other types of data – like biometric data – are prohibited from being processed unless there is an exemption, such as a clear law that enables certain processes.

What is the impact of different legal frameworks for a developing technology like artificial intelligence and machine learning?

AI and ML are interesting technological developments that show the implications of having different privacy frameworks in an interconnected world. Big questions around bias and discrimination in data and AI systems, as well as the harms of these technologies, are top of mind in academic, policy, and economic discussions in Europe and reflect European priorities and thinking in each of those areas, but don’t necessarily reflect the thinking elsewhere. The outcomes of those discussions – both in Europe and elsewhere – will influence how AI/ML technology is developed and regulated in different parts of the world. It’s always important to keep in mind that technology is not developed in isolation here in Europe, so we are dependent on small groups of specialized companies that provide these technologies globally and must interact with a variety of jurisdictions and frameworks.

One of the consequences of having different legal frameworks around the world is the impact on innovation in certain markets. Different legal frameworks can create big complications in terms of compliance for companies and lead to different uses of technology. For instance, in the advertising space, we’ve seen that the information requirements have changed in relation to evolving privacy regulations in different regions, with different thresholds for consent. The consequence of those differences is that the EU experience of browsing websites is very different from the U.S. experience. In that way, it’s fascinating to see how the same advertising technology can lead to vastly different experiences. That is a concrete example of how technology shapes society based on cultural values around privacy.

The impact of different legal frameworks was placed squarely in the center of the privacy debate when the Court of Justice of the European Union handed down its judgment in the Schrems II case. An important question in this debate is: how do we strike the right balance between the security of a state and the protection of its citizens, the legitimate interest of companies and their customers in benefiting from big data, and the fundamental right to privacy and freedom of people?

You started as a tech expert and became a policy expert. What advantages does that give you? How is that helpful to the overall conversation about tech policy?

I studied electrical engineering because I really wanted to understand the world around us. Later, I got a master’s degree from the Leiden Institute of Advanced Computer Science. That technical background provided me with the ability to bridge hardware (understanding how information is collected at the hardware level) and software (specifically, how that information is translated to data by software), and then be able to follow the data flow to servers and platforms.

Being able to zoom in and out on the data helps in being able to be clear about policy questions and to hash out the real risks from a data privacy perspective. Then, once a consensus around the key risks and issues is developed, we can think about ways to mitigate those risks, not just in terms of minimization or prevention, but also understanding that certain risks can be positive and can create opportunities. From the policy perspective, it’s valuable to understand what a bottom-up, data-driven world actually means, what the data looks like, how software works, and how the components in software and hardware work.

What do you see happening next? Key topics you’ll be working on over the next year or so?

In Europe, we’re closely following the topics that are on the work program of the European Data Protection Board and the European Data Protection Supervisor in terms of guidance that they provide. We’re also closely following the 2020 work program of the European Commission.

One of those topics, cookies, is close to my heart. ePrivacy has become an extremely important topic, as a number of different data-driven contexts are impacted by new privacy rules governing the use of connected technologies.

We also go beyond topics and issues that are connected to personal data – we address uses of machine-generated data that are not necessarily personal data in the legal sense. Therefore, it’s important that we track policy developments related to the free flow of non-personal data, which also happens to be an issue that is top of mind for the European Commission at the moment.

 

To learn more about FPF in Europe, please visit fpf.org/eu.