FPF, Highmark Health, and CMU Host Wired for Health: 2020 — Examining Biometric Technologies in the Age of COVID-19
On Thursday, October 8th, Highmark Health, Carnegie Mellon University’s CyLab Security and Privacy Institute, and Future of Privacy Forum hosted a virtual symposium—taking an in-depth look at the role of biometrics and privacy in the COVID-19 era.
During this virtual symposium, expert discussants and presenters examined the impact of biometrics and privacy in the ongoing fight against the novel coronavirus. The world has changed. Today, keeping COVID-19 at bay is a top priority across our communities, throughout our country, and around the world.
The full recording of the virtual symposium is available here:
https://www.youtube.com/watch?v=rrJsjBERsUE
Presentations focused on emerging technology, covering advanced facial recognition systems, temperature scanning, and respiratory disease detection. Presenters included Dr. Anil Singh of the Allegheny Health Network and Satya Venneti, CTO and Co-founder of Telling.ai; Dr. Marios Savvides of the CyLab Biometrics Center at Carnegie Mellon University; and Dr. Yang Cai of the Visual Intelligence Studio at Carnegie Mellon University.
The expert panel analyzed the privacy impacts of certain technologies, including the deployment of voice recognition and temperature sensing to identify disease symptoms associated with COVID-19. Panelists noted the inherent tensions within certain biometrics technologies, including the ethical and social dilemmas raised by public and private use.
Panel discussants included: Dr. Lorrie Faith Cranor, Director and Bosch Distinguished Professor of the CyLab Security and Privacy Institute at Carnegie Mellon University; Dr. Lisa Martinelli, Chief Privacy and Data Ethics Officer, Highmark Health; Dr. Rachele Hendricks-Sturrup, Health Policy Counsel, Future of Privacy Forum; Kirk Nahra, Partner, WilmerHale; and Jules Polonetsky, CEO, Future of Privacy Forum.
We look forward to continuing this important conversation at an in-person conference, Wired for Health 2021, next year.
FPF Submits Comments to United Nations Ahead of 2021 Special Report on Child Privacy
Last week, the Future of Privacy Forum (FPF) submitted comments to the United Nations Office of the High Commissioner for Human Rights Special Rapporteur on the right to privacy to inform the Special Rapporteur’s upcoming report on the privacy rights of children.
The Special Rapporteur’s report, expected in March 2021, will focus on how privacy affects the evolving capacity of the child and the growth of autonomy, and what factors enhance or constrain this development.
FPF’s comments encourage the Special Rapporteur to consider two key points in developing the report:
How child privacy legislation can and should react to actual harms, and not unsubstantiated fears, in order to avoid unintended consequences that may impact the rights of children to benefit from and participate in the online ecosystem; and
How child privacy policies must consider and balance competing and evolving interests between children and other authority figures such as parents or teachers, and recognize the need to foster resilience and autonomy in children by helping them develop digital skills.
Additionally, FPF’s comments suggest that the Special Rapporteur’s report include a discussion on the need for schools, districts, and their third-party vendors to be transparent about data and technology use, storage, analysis, and purpose with children, parents, and other relevant stakeholders.
Transparency around data and technology use has become particularly urgent in recent months as millions of children around the world shifted to some form of online or distance education when the COVID-19 pandemic closed many school buildings in early 2020. Since that time, FPF has developed and compiled student privacy and COVID-19-related resources for school leaders, policymakers, teachers, and students and their families on its student privacy-focused website, StudentPrivacyCompass.org.
By sharing our expertise and insight on the US student privacy landscape and its long history of unintended consequences, the importance of balancing the interests of children with authority figures, and the critical need for fostering an environment of transparency and trust, FPF hopes to help inform a thorough and thoughtful report on the privacy rights of children by the Special Rapporteur next spring. We look forward to discussing these recommendations and others with child privacy stakeholders in the U.S. and around the world in the coming months.
Chelsey Colbert Discusses Trends in Mobility & Location Data
Chelsey Colbert, Policy Counsel, leads FPF’s mobility and location data portfolio. Prior to joining FPF, Chelsey was a lawyer at an international business law firm in Canada and was seconded as in-house privacy and data governance counsel to Sidewalk Labs, an Alphabet company that designs and builds urban innovations. Chelsey holds a J.D. with a major in technology law and policy from the University of Ottawa.
Can you tell us about your career and what led you to FPF?
While I was in law school, one of my professors, the late Ian Kerr, sparked my interest in AI and robotics and influenced my career path ever since. After law school, I worked as in-house counsel for Sidewalk Labs while on secondment from a law firm. I’m really passionate about cutting-edge technologies that blend the digital and physical worlds, and there’s no better or more exciting space than mobility and smart cities for me to exercise that interest. I’m really optimistic about the potential for robots and automated vehicles to positively shape our future, particularly in terms of saving lives, increasing efficiencies, and providing humans with conveniences. At the same time, I recognize that there can be ethical and privacy harms to individuals and society through the irresponsible use of technologies. I believe there are ways to mitigate risks and reduce or eliminate harms so that we get as many benefits as possible from technologies like automated vehicles and digital micromobility services, such as bike- or scooter-sharing.
FPF has been a really great fit for me because I get to be deeply involved in some of the most exciting privacy and data-related challenges of our time. I enjoy getting to work with a variety of stakeholders, including government, industry, and academia. My work at FPF presents me with a unique opportunity to help shape our mobility future to be more equitable, accessible, safe, and affordable. I believe that the multitude of mobility options – from connected and autonomous vehicles (CAVs) to e-scooters to delivery robots – present many benefits and implications for society. My portfolio at FPF covers the entire range of privacy and data topics in the mobility space, and it’s rare to have a day go by where there isn’t something newsworthy coming out.
What attracted you to working on aspects of law related to AI, privacy, and mobility?
AI and privacy as a field is both niche and extremely broad. I became interested in AI and privacy with respect to mobility specifically because the technical and regulatory challenges of autonomous vehicles are fascinating and raise many questions related to privacy and data protection. I’ve also found that mobility data is quite varied in nature and, because of that, raises really complex and contextual privacy challenges. The mobility ecosystem requires collaboration and input from the various levels of government, communities, companies, and international partners. Working with such a wide variety of stakeholders has been a highlight of my time at FPF.
What are the hot-button issues that you’re working on related to mobility?
Currently, I’m working with FPF stakeholders on a report that will outline privacy-by-design goalposts for connected and autonomous vehicles. The report will take a deep dive into connected and autonomous vehicle technologies that are essential for the safe development of these cars, including optical sensors, computer vision, and geolocation and HD mapping. I believe that just as safety and security should be built into the design of cars, so should privacy.
Mobility data sharing is another fascinating and dynamic area. One example is the sharing of micromobility data between companies and cities through the Mobility Data Specification (MDS). MDS is a set of open-source APIs that allows data from a vehicle to be shared, almost in real time, with city governments, typically the department of transportation (DOT). A big part of my work in this space is understanding the privacy implications of different types of mobility data, which often include personal data and location data. There are interesting questions about the temporal nature of mobility data, whether mobility data can be used to identify individuals, and the sharing of data between the private sector and the public sector, where each sector often has different obligations and responsibilities. Another fascinating development is the Right to Repair movement, which illustrates the potential conflicts over access to and control of vehicle data.
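To make the MDS example concrete, below is a minimal sketch of what polling an MDS-style trips endpoint might look like. The host, token, and timestamps are hypothetical placeholders, and the real endpoints and schemas are defined by the MDS documentation itself, so treat this as an illustration of the pattern rather than a reference implementation.

```python
import requests

# Hypothetical MDS-style provider endpoint and credentials (illustrative
# placeholders; consult the Mobility Data Specification for real schemas).
BASE_URL = "https://mds.provider.example.com"
ACCESS_TOKEN = "REPLACE_WITH_ACCESS_TOKEN"

def fetch_trips(start_time_ms: int, end_time_ms: int) -> list:
    """Fetch completed trips in a time window from a provider's trips API."""
    response = requests.get(
        f"{BASE_URL}/trips",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"start_time": start_time_ms, "end_time": end_time_ms},
    )
    response.raise_for_status()
    return response.json().get("data", {}).get("trips", [])

# Trip records pair vehicle and trip identifiers with a GPS route. That
# combination is what makes this data privacy-sensitive: over time, routes
# can reveal where individual riders live and work.
for trip in fetch_trips(1601510400000, 1601596800000):
    print(trip.get("trip_id"), trip.get("vehicle_id"))
```

Even without names attached, a precise origin-destination route is often enough to re-identify a rider, which is why the questions above about identifiability and public-private sharing follow directly from the shape of this data.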
What mobility trends will we all be talking about in the next couple of years?
We’re seeing increased interest in the space from regulators across the globe, particularly in the larger markets. Regulators are concerned about safety regulations and standards. In the United States, for example, the National Highway Traffic Safety Administration recently announced an online, centralized platform for cities and companies to voluntarily post updates about their automated vehicle testing. The SELF DRIVE Act, which stalled in 2017, was recently reintroduced; it would create a federal framework for the regulation of autonomous vehicles (AVs) and includes a section on privacy policies. We may see this or a new AV bill become law in 2021, which would have implications for privacy and data protection.
There is also a lot of policy and legislative action in state and local governments in terms of mobility data sharing, access to mobility data, and general privacy laws that impact CAVs. In Europe, the European Data Protection Board is expected to intensify its work in this area and recently released draft guidelines on connected cars. We’re also seeing advanced driver-assistance technologies and driver monitoring systems become required in car safety regulations and standards, which has implications for privacy and data protection. Regulations and standards are important from a technical perspective, because safety standards often drive the development of new technology. When manufacturers develop the technology to reach those standards, it’s important that companies (and governments) implement privacy by design in the technologies and in their organizational practices.
The volume of data coming from connected cars and other forms of mobility will increase, as will its value. There will be more opportunities to monetize this data, but it’s a really complex ecosystem with many players, all of which need to be strategic and thoughtful about privacy. We’re seeing the industry continue to move away from siloed operations, toward more collaboration, consolidation, and partnership in the mobility space as stakeholders – including car manufacturers, mapping companies, driverless technology companies and others – recognize the importance of collaboration to unlock the full potential of mobility data. This all makes for a really complex ecosystem, which means there is a lot for businesses, consumers, and policymakers to navigate. Some of the data is relatively benign from a privacy perspective, while other data is very sensitive, and some data could become personal and sensitive depending on the context, such as location data. Companies and governments collecting mobility data must ensure that their staff have a deep understanding of the privacy implications of new technologies and data types. Privacy-by-design and a cross-functional approach are some of the best ways to address these complex, multifaceted privacy issues.
I’m also expecting to see more public-private partnerships in this space. The private sector and all levels of government must have a relationship with each other to ensure that there is a constructive dialogue on privacy, safety, and technical standards. Municipal governments are an important part of the ecosystem, in addition to state and federal governments. Mobility data provides many social and monetary benefits and can also be used or misused in ways that have unintended harmful consequences for individuals and society. Companies and governments using and benefitting from mobility data should both be held to higher standards of privacy and data protection. The public should also have a voice in the policymaking and innovation process. Better transparency about how technology is being developed and the mitigation efforts to reduce harms could help improve consumer trust and the adoption of newer technologies such as delivery robots and automated driving features.
I expect the next several years in mobility law and policy to be very interesting and challenging. I consider myself to be very lucky to work in such a dynamic area of technology law and policy and am hopeful for a future with more (human friendly) robots in it.
FPF & Dataskydd.net Webinar – Privacy in High Density Crowd Contexts
Authors: Hunter Dorwart and Rob van Eijk
On 30 September 2020, Future of Privacy Forum (FPF) and Dataskydd.net jointly organized the webinar ‘Privacy in High Density Crowd Contexts’. A key aspect of the webinar was the role of industry-driven privacy standards in the development and deployment of privacy-friendly technologies for crowd management, mobile connectivity, and smart city services.
Keywords: IEEE P802E Recommendations, Privacy by Design, Certification, Standardization
Speakers (in alphabetical order)
Amelia Andersdotter (Dataskydd.net, Member of the European Parliament 2011-2014)
Jelena Burnik (Head of International Cooperation and Enforcement at the Information Commissioner of the Republic of Slovenia)
Jerome Henry (Cisco, Editor, P802E)
Kelsey Finch (Future of Privacy Forum)
Marit Hansen (State Commissioner for Data Protection Schleswig-Holstein)
Rob van Eijk (Future of Privacy Forum)
Tiago Rodrigues (Wireless Broadband Alliance)
Udo Oelen (Chief Privacy Officer at Dutch Railways, former Head of Private Sector at the Dutch Data Protection Authority)
Privacy-Preserving Technologies and Standards Development
Many industry-driven standards development organizations (SDOs) have been working to improve the general privacy qualities of both network infrastructures and web infrastructures. Generally speaking, these bodies focus on how industry standards can remedy design flaws in technologies in order to facilitate transparency, foreseeability, and the ability of consumers to both opt out and make decisions about the way they can interact with technical infrastructures.
Some of these standards have been quite successful and continue to show a lot of promise to address emerging privacy issues in technology. For instance, the Internet Engineering Task Force (IETF), which is responsible for the bulk of what consumers see in a web browser or in an email, considers privacy issues when developing Internet standards through, among other things, the RFC6973 Privacy Considerations for Internet Protocols. In addition, many technical bodies are attempting to introduce encryption and minimization standards to ensure that network protocols do not generate identifiers beyond what is necessary for the network to function.
As the world becomes more digitized, SDOs will continue to develop privacy-preserving standards. However, challenges are beginning to emerge regarding the widespread adoption of these standards and how standards-setting bodies such as the IETF, the Institute of Electrical and Electronics Engineers (IEEE), the 3rd Generation Partnership Project (3GPP), and the World Wide Web Consortium (W3C) interface with EU-level policymaking. As it is, there is a real risk that the underlying policy goals of governments may conflict with future standards setting, and that a lack of coordination across SDOs and between governments may hinder consistency and interoperability.
Striking a balance between protocols that need some type of identifier to function and the level of privacy exposure that might result has created difficulties for embedding privacy within standards. To enable seamless communication between routers and devices, the protocols that govern access points must rely on identifiers unique to a device. Because devices are often linked to an individual user, the network protocols that facilitate communication invariably expose data about these users. The IEEE navigates this trade-off by providing technical solutions that help developers align this seamless flow of communication and data-sharing with public policy goals.
IEEE Recommended Practices for Privacy Considerations for IEEE 802 Technologies
To this end, the IEEE P802E working group has been drafting Recommended Practices for Privacy Considerations for IEEE 802 Technologies (P802E). P802E contains recommendations and checklists for developers of IEEE 802 technologies (Figure 1). The approach builds on the Section 7 questionnaire of RFC 6973, Privacy Considerations for Internet Protocols, adapted to the IEEE 802 environment, and considers harms and risks to privacy when developing network protocols.
Figure 1 – Overview of P802E applications (slide from the presentation by Jerome Henry)
The purpose of the P802E recommendation is ‘to promote a consistent approach by IEEE 802 protocol developers to mitigate privacy threats identified in the specified privacy threat model and provide a privacy guideline.’ In order to strike the right balance between functionality and privacy, the IEEE focuses on the context of device use. For instance, personal devices make it easier to identify the user through network traffic routing while shared devices generally do not. The rubric for developing standards therefore changes depending on how users will interface with the device.
IEEE 802 LAN standards specify the operation of media access control (MAC) methods and protocols that support frame-based network communication. MAC procedures and various protocol frame formats and fields can be used to identify personal devices, their attributes, and their use to support specific networking applications and activities. An adversary can use this information to obtain (location) information about an individual. Other possible threats in IEEE 802 LAN standards include flow identifiers, optional fields in the standard, network discovery flows and patterns, ranging exchanges, authentication flows, directed queries, frame timing, and frame structure. Figure 2 illustrates the threat of location services systematically tracking (mobile) access points.
Figure 2 – Example of the threat of location services systematically tracking (mobile) access points (slide from the presentation by Marit Hansen).
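One widely deployed mitigation in this space is MAC address randomization, in which a device substitutes a random, "locally administered" address for its factory-assigned one when probing for networks. The sketch below is a simple illustration of the underlying bit-level convention rather than anything drawn from the P802E text: in IEEE 802 addressing, the second-least-significant bit of the first octet marks an address as locally administered, which is how randomized addresses are distinguished from the globally unique, manufacturer-assigned ones that enable the tracking shown in Figure 2.

```python
def is_locally_administered(mac: str) -> bool:
    """Check whether a MAC address has the locally administered bit set.

    In IEEE 802 addressing, bit 0x02 of the first octet marks an address
    as locally administered. Randomized MAC addresses set this bit;
    globally unique, manufacturer-assigned addresses leave it clear and
    therefore act as stable identifiers of the kind Figure 2 warns about.
    """
    first_octet = int(mac.split(":")[0], 16)
    return bool(first_octet & 0x02)

# A randomized-style address (first octet 0xDA has the 0x02 bit set) ...
print(is_locally_administered("da:a1:19:22:33:44"))  # True
# ... versus a factory-style address (first octet 0x3C, bit clear).
print(is_locally_administered("3c:5a:b4:22:33:44"))  # False
```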
Speed to Market
As standards evolve, policymakers and industry must work together to shorten the time to market. This includes facilitating the adoption of privacy by design and establishing appropriate benchmarks for matching regulatory compliance with technological design. As it is, one major challenge facing public-private partnerships is how fast the technological landscape changes. Putting in place the infrastructure to adapt standards to novel privacy challenges becomes difficult as IoT devices proliferate throughout the economy and make tracking scenarios much larger than before.
Standardization and Certification
Many stakeholders in the broadband industry see the importance of implementing privacy policies throughout their services and recognize the demand from policymakers. Like standards bodies, network operators must also strike a balance between functionality and privacy-preserving practices. But this doesn’t have to be a zero-sum tradeoff. Organizations can push for policies that establish a compliance baseline that industry must meet. In turn, this enables standards bodies to have a better sense of how to build trust chains for authenticated communications in particular environments.
To this end, technical standardization is an important and valid tool for building compliance and achieving common knowledge and common understanding around emerging technologies. Furthermore, in the EU, codes of conduct and certification (Articles 40-43 GDPR) complement the toolkit. However, even though policymakers can establish benchmarks through regulations, it is still necessary to engage with industry and technical bodies in order to clarify ambiguities in the law. While standards can serve as benchmarks for DPAs and other relevant authorities, there needs to be a mechanism to ensure transparency in the auditing process and to validate that market players really respect the standard.
Lessons Learned from Smart Cities and Transportation
Current discussions around automotive and transportation use-cases illustrate some of these lessons. When designing automotive technology to connect cars to smart devices, engineers have had to adjust to changing regulatory and market environments. Back in the mid-2000s, car designers relied on device identifiers to connect mobile phones to cars. But now that the environment has changed, engineers have come up with new ways to ensure connectivity without having to rely on identifying a particular user to a device.
As standards around identifiers continue to change, so too will the technology in automobiles. Widespread adoption in the market of new technologies is not just a technical challenge but also an industry challenge. Industry must be ready to embrace the new technical features, which means the question becomes one of measuring the demand needed to change industry behavior.
Policymakers can contribute to this through implementing regulatory initiatives and engaging with industry and technical bodies. For instance, in the EU, any processing of data around wifi and device identifiers needs to comply with the GDPR. In part, this baseline establishes a legal basis for smart cities to process mobility data and can serve as a target for standards development bodies in their activities.
For instance, Dutch Railways faced issues around data processing for crowd management purposes. While facilitating public transportation qualifies as a legitimate interest to process data under the GDPR, railway operators must check whether the way they process crowd data is really the best option and valid under the circumstances. Dutch Railways could have enabled both Wi-Fi and Bluetooth tracking, but chose a method that was less privacy intrusive than the alternatives: it uses Wi-Fi tracking only in locations where absolutely necessary and traces commuters’ travel patterns with a method that masks passengers’ identities.
Dutch Railways does this by hashing the Wi-Fi MAC address of the traveler’s phone twice, through sensors in the railway station, and sending the result to a central server. Over the course of the commute, subsequent sensors send the hashed information to the same server, which allows the company to match the data and generate a mobility pattern while masking the identity of the commuter. The data is available for only one day before it is deleted, which minimizes the risk of aggregating large data sets over time.
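As a rough illustration of that double-hashing approach, here is a minimal sketch. It is our reconstruction for explanatory purposes, not Dutch Railways’ actual code; the hash function, the salt handling, and the assumption that all sensors share a daily salt are ours.

```python
import hashlib
import secrets

# One fresh random salt per day, shared by all sensors. Discarding
# yesterday's salt means today's pseudonyms cannot be linked to earlier
# observations, complementing the one-day deletion of the matched data.
DAILY_SALT = secrets.token_bytes(16)

def pseudonymize(mac_address: str) -> str:
    """Hash a phone's Wi-Fi MAC address twice with the daily salt.

    Sensors apply this before sending anything to the central server,
    so the server only ever sees a pseudonym that is stable for one day,
    never the raw MAC address.
    """
    first = hashlib.sha256(DAILY_SALT + mac_address.lower().encode()).digest()
    return hashlib.sha256(DAILY_SALT + first).hexdigest()

# Two sensors observing the same phone produce the same pseudonym, which
# lets the server stitch together a travel pattern without learning the
# traveler's identity.
assert pseudonymize("AA:BB:CC:DD:EE:FF") == pseudonymize("aa:bb:cc:dd:ee:ff")
```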
In addition, in the context of smart cities, it’s also important for policymakers and city coordinators to communicate effectively with the public. For instance, while there isn’t a clear legal standard for processing mobility data in the United States, public backlash against wifi tracking throughout cities serves as an incentive not only for policymakers to adopt more privacy-preserving tracking solutions but also for industry to enable those solutions through product design. Organizations therefore need to engage with the public in a meaningful way to solve problems effectively.
Figure 3 – Recommendations aimed at the use of MAC addresses in the context of Wi-Fi tracking (slide from the presentation by Marit Hansen).
Takeaways from the Panel Discussion
P802E defines a framework, a recommended practice for privacy considerations when designing or evaluating an IEEE 802 standard. P802E does not provide a standard-specific set of rules or recommendations. The goal of the framework is to encourage designing for privacy, limiting (unauthorized) personal information exposure as much as possible. P802E can also be used beyond 802 technologies, to inform how privacy may be affected by networking communications in the Data Link Layer (also known as Layer 2, or L2). The L2 layer is the second level in the seven-layer OSI reference model for network protocol design.
Different contexts (e.g., a hospital, public Wi-Fi tracking, an airport, a smart city transportation platform) imply different privacy requirements, which means that stakeholders need to transparently communicate the privacy policies and expectations for each vertical. Mobile users expect a harmonized experience across contexts, which creates challenges for public sector actors, private industry, certification bodies, and technical standards bodies to strike the right balance between functionality and privacy.
When developing standards, SDOs should explore how such standards fit in regulatory regimes and the specific rules and definitions of personal data. Policymakers in turn must search for synergies to promote compliance with various legislative and regulatory rules and engage with standards bodies to ensure that stakeholders are on the same page. Such engagement should reflect a broad, interconnected way of thinking and routinely employ use cases to find the appropriate technical solution. To this end, it is also important that organizations involved in these workstreams know and appreciate the same terminology and frameworks.
Following such a path could help shorten the time to market and ensure that technical standards and certification schemes are flexible and adaptive to the vast changes in technology that confront us.
To learn more about FPF in Europe, please visit fpf.org/eu. For information about Dataskydd.net, click here.
Event Recap: Using Corporate Data for Research – Lessons from an Award-Winning Project
Last week, FPF hosted a virtual event honoring the winners of the first-ever FPF Award for Research Data Stewardship: University of California, Irvine Professor of Cognitive Sciences Mark Steyvers and Lumos Labs, represented by General Manager Bob Schafer. In addition to the awardees, the event featured Daniel L. Goroff, Vice President and Program Director at the Alfred P. Sloan Foundation, which funded the award, as well as FPF CEO Jules Polonetsky and FPF Policy Counsel Dr. Sara Jordan.
During the event, the participants discussed the importance of promoting privacy-protective data sharing collaborations between companies and academic researchers, the specifics of the award-winning project and the results of the collaboration, the steps taken to protect privacy throughout the process, advice for academics interested in working with company data, and advice for companies interested in working with academic researchers.
The first-of-its-kind award recognizes a research partnership in which a company shared data with an academic institution in a privacy-protective manner, thereby driving the use of privately held data for academic research. When privately held data is responsibly shared with academic researchers, it can support significant progress in medicine, public health, education, social science, and other fields, but that data is often unavailable due to a range of concerns, including the need to protect individual privacy.
To learn more about the award, application process, and reviewers, check out this past year’s Call for Nominations. Keep an eye out for next year’s Award for Research Data Stewardship Call for Nominations, coming this fall. You will be able to find information about nominations and other FPF projects to promote the responsible sharing of corporate data for research at FPF.org/data.
FPF Comments on Draft Washington Privacy Act of 2021
Yesterday, on September 30, 2020, FPF submitted comments regarding the draft Washington Privacy Act of 2021. The draft was released by Senator Carlyle, the Chair of the Washington State Senate Committee on Environment, Energy, and Technology (EET) on September 9, 2020.
The new version closely resembles last year’s Second Substitute version of the Washington Privacy Act of 2020 (SSB 6281), with a few changes that reflect House amendments from the previous legislative session. In addition, the new draft WPA contains two new sections that would regulate the collection and use of COVID-19-related data by both public and private entities. It is anticipated that the Act will be officially introduced in Washington State at the beginning of 2021.
In the midst of a global pandemic and in the absence of a baseline federal privacy law, we commend the WA legislature’s continued commitment to privacy. We have written previously, and reiterate in these comments, that the protections in the Washington Privacy Act would be a significant achievement for US privacy legislation. However, we have observed deep polarization in legislative debates in recent years, both in Washington and elsewhere. Given the challenges our country faces, the need to reach consensus is more important than ever, and we view meaningful legal privacy protections as both necessary and achievable.
To that end, FPF’s comments identify a number of relevant points of debate between stakeholders, along with practical suggestions for how they might be addressed in the context of the WPA. These include practical recommendations for: exemptions, including for scientific research; enforcement by the Attorney General; and the limitations of the opt-in and opt-out regimes. We are also supportive of the strong collection and use safeguards that Parts 2 and 3 of the Act would place on data collected for the purposes of addressing the COVID-19 pandemic.
Read FPF’s comments regarding the Washington Privacy Act of 2021 here.
Read the full text of the Washington Privacy Act of 2021 here.
Read a useful overview of the Washington Privacy Act of 2021 released by Sen. Carlyle here.
Read about FPF’s participation in July’s Washington State Senate EET Committee Public Work Session on government uses of data and contact tracing here.
Read FPF’s earlier comparison of last year’s Washington Privacy Act of 2020, the CCPA, the CPRA, and the GDPR here.
Rob van Eijk Discusses Trends in European Privacy Discussions
We’re talking to FPF senior policy experts about their work on important privacy issues. Today, Rob van Eijk, FPF’s Managing Director for Europe, is sharing his perspective on FPF’s EU work, differences between U.S. and EU privacy frameworks, and more.
Prior to serving in his position as Managing Director for Europe at FPF, Rob worked at the Dutch Data Protection Authority (DPA) for nearly 10 years. He represented the Dutch DPA in international meetings and as a technical expert in court. He also represented the European Data Protection Authorities, assembled as the Article 29 Working Party, in the multi-stakeholder negotiations of the World Wide Web Consortium on Do Not Track. Rob is a privacy technologist with a PhD from Leiden Law School focusing on online advertising, specifically real-time bidding.
Tell us about yourself – what led you to be involved in privacy and FPF?
I first got into privacy at the end of 2009, at a time when the retention of data from Automatic Number Plate Recognition (ANPR) cameras in the Netherlands was being actively discussed. Back then, I was doing contract work in computer forensics and project management. This was about a year after I had sold my company, BLAEU Business Intelligence BV – where I took care of the computer operations of small and medium-sized enterprises – after running it successfully for nine years.
While working as a contractor, I found out that the Dutch Data Protection Authority was looking for a technologist. Within three weeks of applying, I was part of the “Internet Team” that was charged with formal investigations into compliance with the Privacy Directive 95/46/EC in the private sector. At the Dutch DPA, my role was to lead on-site inspections and collect evidence, and to explain technology to people in the organization. I had a lot of fun in that role and stayed at the Dutch DPA for nearly 10 years.
The Dutch DPA made it possible for me to do a doctoral thesis alongside my work. Eventually, after finishing my PhD research, Jules invited me to present the results in an FPF Masterclass. It turned out that Jules and FPF were looking for someone on the ground in Europe, which led to my current role as FPF’s Managing Director for Europe.
How would you describe your role at FPF?
I am managing director of FPF operations in Europe, where we’re working to build a data protection community. Most of the work today is to ensure that everyone – particularly the Data Protection Authorities, academics, and civil society groups – understands the added value of the neutral platform that the Future of Privacy Forum provides. FPF is a non-profit membership organization with many member companies that have unique privacy questions, whether related to ad-tech or to new laws and technologies being developed.
Another aspect of my role is ensuring that we are a respected voice in the media. We do a lot of media outreach and engagement – for instance, addressing questions around the implications of RFID implants on what it really means to be human. I also explain to what extent there are implications of certain laws or technologies on the rights and freedoms of the individual, while being mindful of the different legislative frameworks under which companies operate. I see my role as one that is intended to guide both the Future of Privacy Forum and our member organizations through the most important privacy issues and questions, cooperating with the academics and regulators, and facilitating a neutral space for interesting, topical discussions.
The legal framework for privacy in Europe is different from the U.S. framework. Could you explain some of those differences and their impacts?
In Europe, we have a human rights-based data protection framework, whereas US laws are based on the notion of informational privacy as defined by Westin. In the EU, privacy is treated as a fundamental human right, and that thinking trickles down to the way that we talk about the concepts of freedom, privacy, security, and autonomy.
Informational privacy is focused on information control. In the EU, control is centered, for example, on the protection of personal information and what is yours – protection of the integrity of your body, your phone, the technology that’s in your connected car. There’s a lot of data being generated today, and some of that data can be connected in such a way that it creates a comprehensive picture of your life, which needs protection. Thus, moving from the idea that privacy is a human right, we can create clear boundaries related to the context of information that must be protected, not just through principles like necessity or proportionality, but also in a societal context. The impact of certain types of contexts varies, as do the requirements for a legal basis: certain categories of data – like health data – require a high bar for informed consent, and other types of data – like biometric data – are prohibited from being processed unless there is an exemption, such as a clear law that enables certain processes.
What is the impact of different legal frameworks for a developing technology like artificial intelligence and machine learning?
AI and ML are interesting technological developments that show the implications of having different privacy frameworks in an interconnected world. Big questions around bias and discrimination in data via AI systems, and also the harms of these technologies, are top of mind in academic, policy, and economic discussions in Europe and reflect European priorities and thinking in each of those areas, but don’t necessarily reflect the thinking elsewhere. The outcomes of those discussions – both in Europe and elsewhere – will influence how AI/ML technology is developed and regulated in different parts of the world. It’s always important to keep in mind that technology is not developed in isolation here in Europe, so we are dependent on small groups of specialized companies that provide these technologies globally and must interact with a variety of jurisdictions and frameworks.
One of the consequences of having different legal frameworks around the world is the impact on innovation in certain markets. Different legal frameworks can create big complications in terms of compliance for companies and lead to different uses of technology. For instance, in the advertising space, we’ve seen that the information requirements have changed in relation to evolving privacy regulations in different regions, with different thresholds for consent. The consequence of those differences is that the EU experience of browsing websites is very different from the U.S. experience. In that way, it’s fascinating to see how the same advertising technology can lead to vastly different experiences. That is a concrete example of how technology shapes society based on cultural values around privacy.
The impact of different legal frameworks was placed squarely at the center of the privacy debate when the Court of Justice of the European Union handed down its judgment in the Schrems II case. An important question in this debate is: how do we strike the right balance between the security of a state and the protection of its citizens, the legitimate interest of companies and their customers in benefiting from big data, and the fundamental right to privacy and freedom of people?
You started as a tech expert and became a policy expert. What advantages does that give you? How is that helpful to the overall conversation about tech policy?
I studied electrical engineering because I really wanted to understand the world around us. Later, I got a master’s degree from the Leiden Institute of Advanced Computer Science. That technical background provided me with the ability to bridge hardware (understanding how information is collected at the hardware level) and software (specifically, how that information is translated to data by software), and then be able to follow the data flow to servers and platforms.
Being able to zoom in and out on the data helps in being able to be clear about policy questions and to hash out the real risks from a data privacy perspective. Then, once a consensus around the key risks and issues is developed, we can think about ways to mitigate those risks, not just in terms of minimization or prevention, but also understanding that certain risks can be positive and can create opportunities. From the policy perspective, it’s valuable to understand what a bottom-up, data-driven world actually means, what the data looks like, how software works, and how the components in software and hardware work.
What do you see happening next? Key topics you’ll be working on over the next year or so?
In Europe, we’re closely following the topics that are on the work program of the European Data Protection Board and the European Data Protection Supervisor in terms of guidance that they provide. We’re also closely following the 2020 work program of the European Commission.
One of those topics, cookies, is close to my heart. E-privacy has become an extremely important topic, as a number of different data-driven contexts are impacted by new privacy rules governing the use of connected technologies.
We also go beyond topics and issues that are connected to personal data – we address uses of machine-generated data that are not necessarily personal data in the legal sense. Therefore, it’s important that we track policy developments related to the free flow of non-personal data, which also happens to be an issue that is top of mind for the European Commission at the moment.
To learn more about FPF in Europe, please visit fpf.org/eu.
FPF Testifies at FTC Data Portability Workshop
Yesterday, on September 22, 2020, the Federal Trade Commission held a public workshop, “Data To Go,” examining the benefits and challenges of data portability frameworks for consumers and competition. As a panelist during the first discussion, FPF’s Gabriela Zanfir-Fortuna discussed: how data portability operates in different commercial sectors; lessons learned from the GDPR and other global laws; and observations on the dual nature of data portability, as both a means to facilitate competition and a right of individuals to exercise control over their data.
Watch the recording (FPF’s Gabriela Zanfir-Fortuna’s remarks begin at 01:16).
Read comments submitted from 25+ stakeholders in advance of the workshop (see the full comments here).
The day-long workshop featured a wide range of privacy advocates, academics, government regulators, economists, and other experts. Below we provide key highlights from the workshop’s four panels: (1) data portability initiatives in the European Union, California, and India; (2) financial and health portability regimes; (3) reconciling the benefits and risks of data portability; and (4) realizing data portability’s potential: material challenges and solutions.
Panel 1: Data Portability Initiatives in the European Union, California, and India
FPF’s Gabriela Zanfir-Fortuna served as a panelist during Panel 1 of the workshop, discussing lessons learned from the European Union’s General Data Protection Regulation (GDPR), the California Consumer Privacy Act (CCPA), and other data portability initiatives in India, Brazil, and Singapore. Other panelists included Inge Graef (Tilburg University), Rahul Matthan (Trilegal India), Stacey D. Schesser (Office of the California Attorney General), and Karolina Mojzesowicz (European Commission), and the panel was moderated by Guilherme Roschke (FTC).
Panelists agreed that the GDPR and the CCPA both conceptualize data portability as a right of the data subject, underpinned by the idea that individuals should have control over how personal data is collected and used. Panelists also agreed that portability plays an important role in competition, with Inge Graef noting its unique role in removing barriers to entry for data-driven startups, a role increasingly reflected in EU policy documents. Rahul Matthan shared details about the complex and innovative framework that is currently being set up in India to allow portability and interoperability in the financial services sector.
Gabriela Zanfir-Fortuna added that many complex questions remain for how to make portability workable in practice, including: authentication and verification; personal data of the portability requestor that also includes personal data of third parties (such as in photographs or conversations); risks and responsibilities when porting data to services with weaker security or privacy protections; and other downstream uses of data, as reflected by the debates at the intersection of the Payment Services Directive 2 and the GDPR.
Panelists also discussed and mostly agreed on key similarities and differences between the GDPR and the CCPA. The right of data portability is more limited and nuanced under the GDPR, excluding “inferences” from its scope. Unlike the GDPR, the CCPA also combines the rights of access and portability, requiring personal data to be provided in a portable format following an access request. One of the key points emphasized by both Gabriela and Karolina Mojzesowicz was that more than two years of experience with the GDPR shows that the right to data portability is not used to its full potential, with very few requests being made to organizations. This is an issue the European Commission intends to address, both by increasing awareness and through initiatives to make portability practical.
As CCPA enforcement begins, Stacey Schesser indicated that the Regulations adopted by the California Office of the Attorney General aimed to provide greater clarity on authentication and verification mechanisms, to ensure that the “right to know” will not lead to unauthorized access to data. In addition, she mentioned that California’s Ballot Initiative (Proposition 24) may change the legal requirements if passed, as it would no longer explicitly refer to data portability, but still require data in subject access requests to be made available in a machine-readable format (an implied right to portability).
Panel 2: Financial and Health Portability Regimes (Case Studies)
Panel 2, moderated by Katherine White (FTC), explored case studies from the financial and health care sectors, including new health IT rules from the U.S. Office of the National Coordinator for Health Information Technology (ONC) and the UK’s Open Banking Initiative. Panelists from both the financial and healthcare sectors discussed the growing role of data portability in each sector, agreeing that consumer trust remains an important underlying issue for both.
In the healthcare industry, panelists remarked on the trend of data portability being used to improve individuals’ access to their medical records. Dan Horbatt (Particle Health) discussed some of the technological and economic barriers to embedding data portability, and remarked that the biggest trend is toward more seamlessness in communicating patient permissions for collating medical records. Don Rucker (US Department of Health and Human Services) remarked that individual data portability has long been a goal of HIPAA. Following the recently mandated US healthcare interoperability rules, Rucker envisions the proliferation of new apps to facilitate portability in the next few years, as well as more opportunities for the Internet of Things (IoT). Rucker highlighted the ongoing need for standardized tools to facilitate interoperability and portability for medical records.
In the financial sector, panelists discussed Open Banking efforts in the United States and the UK. Open Banking refers to the use of open Application Programming Interfaces (APIs) to enable third-party financial service providers to access consumer transactions and other financial data from banks through new financial apps and services. According to Bill Roberts (UK Competition and Markets Authority), the UK’s Open Banking Rules were driven by a desire to address competition and promote an emerging fintech industry. Ongoing challenges remain with identification and authentication mechanisms, and Roberts noted that many banks are increasingly turning to biometric methods for authentication. Michael S. Barr (University of Michigan) observed that the UK, Singapore, India, and Australia have all made progress in Open Banking to improve user control over financial information and to increase market competition. Although the US lags behind the UK in implementing open banking rules, Barr believes there is huge potential for consumers and greater competition.
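To make the API-based model concrete, here is a minimal, hypothetical sketch of an account-information call in the style of the UK Open Banking read APIs. The host, paths, and response fields are illustrative assumptions, and a real integration would also involve an OAuth 2.0 consent flow and strong customer authentication.

```python
import requests

# Illustrative placeholders: a real integration uses a bank-specific host
# and an access token obtained through a customer-authorized OAuth 2.0
# consent flow, scoped to specific accounts and data types.
API_BASE = "https://api.examplebank.com/open-banking/aisp"
ACCESS_TOKEN = "REPLACE_WITH_CUSTOMER_CONSENTED_TOKEN"

def get_transactions(account_id: str) -> list:
    """Fetch one account's transactions via an account-information API.

    Unlike credential sharing ("screen scraping"), the third party never
    handles the customer's banking credentials: access is limited to what
    the customer consented to, and it can be cut off by revoking the token.
    """
    response = requests.get(
        f"{API_BASE}/accounts/{account_id}/transactions",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    response.raise_for_status()
    return response.json().get("Data", {}).get("Transaction", [])

for txn in get_transactions("acct-001"):
    print(txn.get("BookingDateTime"), txn.get("Amount"))
```

This token-scoped design is the security advantage panelists return to in Panel 4 when comparing APIs with screen scraping.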
Panel 3: Reconciling the Benefits and Risks of Data Portability
In Panel 3, moderated by Ryan Quillian (FTC), panelists discussed the benefits and risks of data portability with an eye toward the twin aims of protecting consumers and promoting competition. Panelists agreed that data portability serves a range of policy goals, including individual autonomy, access to data, and the broader societal promotion of consumer welfare, innovation, and competition, with individual panelists offering remarks on global initiatives (such as the Data Transfer Project), the sector-specific nature of portability, and particular risks arising from the lack of uniform regulation in the United States.
Ali Lange (Google) described the privacy and security protections in place for two existing data portability tools: Google Takeout, for individual exports and transfers of data across Google services; and the Data Transfer Project, an open-source initiative of large online platforms and service providers (including Apple, Microsoft, Google, Facebook, and Twitter) that facilitates service-to-service data transfers.
Several commenters offered perspectives on the sector-specific nature of portability. For example, Gabriel Nicholas (New York University) commented that the competitive effect of data portability is not uniform across sectors, arguing that the FTC should use alternative methods to promote competition. One key point he made was about “group portability” and how portability could be very useful as a group function or even a group right, for example allowing a group of friends to move from one platform to another. Hodan Omaar (Center for Data Innovation) explained that there is a greater need for data portability in sectors where there is a greater disparity between the goals of organizations that hold data and the intent of the data subjects.
Meanwhile, Pam Dixon (World Privacy Forum) outlined risks associated with onward transfers. Particularly in the healthcare sector, she observed that regulatory gaps exist in the United States for non-HIPAA covered entities. In contrast, she noted that the EU’s GDPR provides baseline data protection rules to prevent regulatory gaps. Peter Swire (Georgia Tech) argued in favor of keeping the theory of data portability in check with practical scenarios, pointing out that studying use cases in detail can be very helpful to understanding the limits and benefits of portability. For example, Professor Swire noted the concept of “multihoming” for data, a way to describe the fact that very often in practice portability is not about moving the data altogether from one provider to another, but simply about transferring a copy of the data to another service that the individual would like to try out. Thus, data portability might help create a place for users to have multiple “homes.”
Panel 4: Realizing Data Portability’s Potential: Material Challenges and Solutions
Panel 4, moderated by Jared Brown (FTC), considered the challenges of data security, privacy, standardization, and interoperability with a range of industry representatives (Mastercard, Digi.me), civil society (Mission:data Coalition), and consumer privacy advocates (Public Knowledge and Electronic Frontier Foundation).
Overall, panelists agreed on the need to promote trust among individuals, provide individuals with greater control over their data, and protect and use consumer data responsibly. According to Erika Brown Lee (Mastercard), a consumer-centric approach to data portability involves basing data transfers between services on consumer requests (i.e., consent) and reducing friction for consumers. Brown Lee also noted that security is the critical challenge for companies offering data portability, referring to verification and authentication in particular. Security risks also vary depending on how data portability is implemented, with panelists agreeing that utilizing APIs for data transfers is generally better from a security perspective than the widespread practice of credential sharing (“screen scraping”). Julian Ranger (Digi.me) noted that screen scraping has a greater potential for abuse, and Michael Murray (Mission:data Coalition) referred to screen scraping as expensive, buggy, and inconsistent.
Consumer privacy advocates emphasized the need for greater privacy protections for individuals and accountability for companies. Sara Collins (Public Knowledge) stressed the need for comprehensive federal privacy legislation to close regulatory gaps and set minimum standards. In addition, Bennett Cyphers (EFF) stated that, to solve many of the wider competition issues in the technology sector, regulation in areas other than data portability could incentivize data sharing. Discussing how responsibility and liability for downstream data should be allocated in the data portability context, Cyphers stated that liability should rest with the actor responsible for wrongdoing. However, in cases where data is shared without the consumer’s knowledge, he argued that liability ought to rest with the data transferrer.
Moving Forward
Many complex questions remain in terms of how to make data portability workable for organizations to implement in practice. In his keynote remarks for the FTC workshop, Peter Swire (FPF Senior Fellow) discussed his recently published article, “Portability and Other Transfers Impact Assessment (PORT-IA).” The Impact Assessment aims to provide a framework for multi-disciplinary experts to use in a particular data transfer context to assess issues of data portability, including what Swire refers to as “Other Required Transfers” (e.g., a transfer of an entire database). Due to the sectoral nature of the issues that arise involving data portability, there is unlikely to be a one-size-fits-all solution. However, as legal regimes and policymakers around the world increasingly conceptualize and implement portability and interoperability regimes, data portability tools and commercial practices will continue to develop in the data ecosystem.
Thank you to Kai Koppoe, Hunter Dorwart, and Veronica Alix for their contributions to this blog.
The First National Model Student Data Privacy Agreement Launches
Protections for student data privacy took an important step forward this summer when the Student Data Privacy Consortium (SDPC) released the first model National Data Privacy Agreement (NDPA) for school districts to use with their technology service providers. Ever since education technology (edtech) emerged as a key tool in classrooms, both schools and edtech companies have struggled to create data privacy agreements (DPAs) that adequately protect student data and meet both schools’ and providers’ needs. DPAs provide crucial protections for student data by limiting its use and sharing. A key challenge in that process is that US federal student privacy law and many state laws require specific contractual clauses or protections. The new NDPA addresses this challenge by streamlining the education contracting process and, in the SDPC’s words, establishing “common expectations between schools/districts and marketplace providers.”
In this blog, we outline some of the major contracting challenges, the SDPC’s creation of the model contract, and the ways in which education stakeholders can use the NDPA to reduce the burden of contracting.
Data Privacy Agreements: Challenges for Schools and Edtech Companies
Schools and districts typically use hundreds of companies to provide services every school year. Edtech companies perform widely varying tasks for schools and districts, such as data storage, educational games, learning management systems, attendance tracking, and many other school functions. The privacy protections that each company must implement can vary based on the type and sensitivity of student data they hold and how it is collected, used, or shared. In addition, as 43 states have passed more than 130 student privacy laws in the past five years, districts and edtech companies must incorporate significant legal obligations into their agreements or contracts. Districts also want to ensure that companies limit their student data collection and use.
However, contracting individually with each service provider to ensure this protection is often extremely difficult for both districts and companies. Each DPA might require different levels of sharing, control, and protections regarding student data. And few districts can hire teams of technologists and lawyers to understand and negotiate student data privacy with each of their service providers. As a result, districts across the country have sought ways to ensure trust and facilitate the DPA process.
Edtech companies face similar challenges as they bring important services to schools and districts. Because contracts involving student data must align with federal and state laws and address each district’s needs and concerns, edtech companies find it challenging and expensive to negotiate DPAs with school districts at scale. Often, schools and districts require DPAs that are unique to their institution but might contain requirements that are not appropriate for every edtech company; for example, a company that deletes data after it is no longer needed (to improve data security) would be unable to comply with a contractual requirement to return all student data after the company provides its service. Larger edtech companies might have teams of lawyers who can work with clients on these issues, but many smaller companies without legal departments may sign contracts they know they cannot observe because they lack the resources or leverage to incorporate language that fits the realities of their business.
These challenges have forced both school districts and companies to dedicate significant time and resources to DPAs each year. To alleviate schools’ contracting challenges and strengthen their knowledge and bargaining power with edtech companies, a group of Massachusetts school districts created the SDPC in 2015. The organization now has school district alliances in 29 states.
How the SDPC Developed the NDPA
SDPC’s alliances have created influential state-wide agreements in several states, such as California and Massachusetts. In recent years, other state alliances have increasingly adapted versions of the California and Massachusetts agreements in their own states to ease the burdens of contracting with their service providers. As a result of the growing popularity of the state-level DPAs and calls to establish baseline agreements for its alliances, SDPC formed a working group to create a national set of standards that align as much as possible with current laws, requirements, and edtech business practices.
For more than a year, the SDPC working group, made up of attorneys, district officials, company representatives, and nonprofit organizations (including FPF), met to create the NDPA. The goal was to improve the language from the state agreements and to create a balanced starting point for negotiations between schools and companies nationwide. The result is a contract that will likely save time and money for schools, districts, and edtech companies, ultimately benefiting students by allowing schools and districts to allocate more resources to learning and less to negotiating.
The working group sought both to make substantive improvements and to create a structure that makes the contract easier to use than past state-level versions. The group addressed provisions regarding data breaches, sub-processor restrictions, advertising limits, de-identified data use, and other issues, reflecting multiple compromises between districts and companies. The working group also added sections that allow districts and edtech companies to easily include information that was challenging to capture in past DPAs, such as descriptions of companies’ services, terms that address state law, and changes to standard terms. Over the next year, SDPC’s state-level alliances will create state-specific clauses allowing them to incorporate unique requirements from their state laws into the NDPA.
Ongoing Revision and Stakeholder Participation
The SDPC’s NDPA contains terms that align with most US state and federal student privacy laws. Nonetheless, not every provision in the agreement will be perfect for every school, district, or company. Districts and companies concerned about specific provisions will still benefit from using the NDPA because it will set a common starting point for districts’ and companies’ discussions of data privacy.
While a significant step forward, the NDPA has room for improvement. No matter how well vetted, a resource such as this can cause unintended consequences, and stakeholders should not hesitate to identify and communicate them. For example, transactional challenges remain regarding how the NDPA will fit within the structure of a service agreement between schools and companies, especially since the NDPA has some overlapping and, therefore, redundant definitions: “third party,” “operator,” and “provider,” for instance, all refer to the company signing the contract.
There are also substantive provisions that may cause issues down the road: the audit clause, for example, could allow multiple districts to simultaneously audit one company. Although the NDPA improves prior language about breach notifications, it does not explicitly define a data breach, thus introducing potential ambiguity about when a notification is required. Furthermore, the NDPA might impact the national edtech market by creating a barrier to entry for smaller companies, which often have little opportunity to negotiate inapplicable or unnecessarily burdensome provisions. SDPC has said that it will continue to develop the NDPA, so feedback about the agreement’s challenges and strengths will be necessary and useful.
Adopting the NDPA will benefit schools, districts, and companies because it establishes a common baseline for these stakeholders to begin student data privacy negotiations. Even if it does not apply perfectly to everyone, the agreement will save contracting resources for all education stakeholders. This efficiency, in turn, will allow schools and districts to better protect student data as they provide important services and education for American students.
Data Protection Expert Agnes Bundy Scanlan and Fundraising and Philanthropy Leader Elaine Laughlin Join Board of Directors
Agnes Bundy Scanlan is a global leader in governance, law, regulatory and compliance risk, and data and information security. She is also an experienced legal counsel and advisor to industry and government. She has built a sterling reputation for being proactive, accessible, and responsive to governance and regulatory changes, and is recognized within the corporate world as an expert and leader in these fields. Her background includes contributions in the business community as well as on Capitol Hill and in the Executive Branch.
“Agnes is widely respected in the data protection field for her legal knowledge, integrity, and practical guidance,” said FPF CEO Jules Polonetsky. “Her experience as one of the founders of the professional data protection field will greatly benefit our board and our community.”
Agnes is currently the president of the Cambridge Group, LLC, which provides regulatory risk consulting, operational expertise on consumer banking regulations, interim staff augmentation, training guidance, and more. She also serves as an Independent Director for Truist, where she is a member of the Board Governance and Nominating, Risk, and Trust Committees.
Along with her roles at the Cambridge Group, LLC and Truist, Agnes also serves on the Advisory Board of the Tech, Law, and Security Program at American University Washington College of Law. There, she tackles the challenges and opportunities posed by emerging technology to ensure that technology supports the functioning of effective democracies and core democratic values, including privacy, security, freedom of speech, and the discernment of truth.
Prior to her time at the Cambridge Group, Ms. Bundy Scanlan served as a senior adviser at Treliant Risk Advisors from 2012 to 2015, where she counseled financial services firms on strategy, governance, regulatory, and risk management matters.
She holds a J.D. from Georgetown University Law Center and a B.A. from Smith College. She is a member of the Bar of the Supreme Court of the United States, the Bar of the Commonwealth of Massachusetts, the Bar of the Commonwealth of Pennsylvania, and the Bar of the Superior Court of the District of Columbia. She is also a former chair of the International Association of Privacy Professionals.
Elaine Laughlin is a proven leader in fundraising and philanthropy who has held leadership roles at several public broadcasting outlets over the course of her career. During her time at WETA, Elaine served as Vice President of Foundation and Government Development and was instrumental in the production of programs such as PBS NewsHour, Washington Week, the documentaries of Ken Burns, Henry Louis Gates Jr.’s Finding Your Roots, and many other public broadcasting favorites.
Elaine is currently the Director of Development at WSBE, southeastern New England’s PBS station. She leads development for the station and is committed to delivering programs and services that educate, inform, enrich, inspire, and entertain viewers of all ages.
“We’re already tapping Elaine’s expertise as FPF continues to expand and reach new audiences,” said Polonetsky. “We’re excited to have her knowledge about how growing organizations can deliver a range of impactful programs.”
She holds a B.A. in English from Wellesley College and a Master’s in Communication from Cornell University.
Congratulations to both Agnes and Elaine on joining the Board of Directors!