Data is increasingly the lifeblood of commerce, research, and government. Data analysis is providing new insights into our health, making transportation safer, and increasing the usefulness of online services. It is a key to developing medical breakthroughs, educating students during the pandemic, and combating inequity and discrimination.
But if society is going to enjoy the benefits of data, we need to empower individuals and ensure that businesses respect privacy, safeguard data, and earn trust. That’s our message for Data Privacy Day, celebrated yearly on January 28. To bring the message to a broad audience, we’re pleased to be partnering with Snapchat to provide a privacy-themed Snap filter for Data Privacy Day 2021.
The Future of Privacy Forum (FPF) is a non-profit organization that brings together privacy scholars, researchers, advocates, and industry leaders around the world to explore solutions and advance principled data practices in support of emerging technologies.
If you’re interested in exploring the issues that are driving the future of privacy, sign up for our monthly briefing or check out one of our virtual events. You’ll find some of the smartest minds in privacy discussing how best to ensure society benefits from the insights gained from data while respecting individual choice and autonomy.
Why are we and many others around the world marking Data Privacy Day on January 28? It’s the anniversary of an important global data protection treaty known as Convention 108, a standard that has influenced the development of data protection laws around the world. We are hoping that 2021 will be the year the United States steps up and joins the many nations that have passed a comprehensive data protection law!
FPF in 2020: Adjusting to the Unexpected
With 2020 fast coming to a close, we wanted to take a moment to reflect on a year that forced us to re-focus our priorities, along with much of our lives, while continuing to produce engaging events, thought-provoking analysis, and insightful reports.
Considering Privacy & Ethics at the Dawn of a New Decade
Early in the year, our eyes were on the future – at least through the rest of the decade – and on how privacy and ethical considerations would shape our lives and emerging technologies over the next ten years.
In February, we hosted the 10th annual Privacy Papers for Policymakers event, which recognizes the year’s leading privacy research and analytical work relevant to policymakers in the United States Congress, federal agencies, and international data protection authorities. We were honored to be joined by FTC Commissioner Christine S. Wilson, who keynoted the event, along with leading privacy scholars and policy and regulatory staff.
“Are corporations having too much power over individuals because of how much data they have? Are foreign countries interfering in our elections? Are automated decisions being made where I’ll be turned down for healthcare, I’ll be turned down for insurance, my probation will be extended? These are not [only] privacy issues, right? These are issues of power. These are issues of human rights at the end of the day.”
Adjusting to a One-in-a-Century Pandemic
By March, it was clear that the COVID-19 pandemic would impact every aspect of our lives, and we moved nimbly to respond and re-assess our immediate priorities. FPF launched the Privacy and Pandemics series, a collection of resources published throughout the year that explores the challenges the pandemic poses to existing ethical, privacy, and data protection frameworks. The series seeks to provide information and guidance to governments, companies, academics, and civil society organizations interested in responsible data sharing to support the public health response. Some of the initial materials published in the spring as part of this series included:
A Closer Look at Location Data: Privacy & Pandemics. In light of COVID-19, interest grew in harnessing location data held by major tech companies to track individuals affected by the virus, better understand the effectiveness of social distancing, and send alerts to individuals who might be affected. FPF addressed a range of privacy and ethics concerns in a brief explainer guide covering (1) what location data is, (2) who holds it, and (3) how it is collected.
In April, FPF Senior Counsel Stacey Gray provided the Senate Committee on Commerce, Science, and Transportation with written testimony, including recommendations based on how experts in the U.S. and around the world are currently mitigating the risks of using data to combat the COVID-19 pandemic.
Experts have used machine learning technologies to study the virus, test potential treatments, diagnose individuals, analyze the public health impacts, and more. In early May, FPF Policy Counsel Dr. Sara Jordan and FPF Senior Counsel Brenda Leong published a resource covering leading efforts, data protection and ethical issues related to machine learning and COVID-19.
As part of our ongoing Privacy and Pandemics series, FPF, Highmark Health, and Carnegie Mellon University’s CyLab Security and Privacy Institute, hosted a virtual symposium that took an in-depth look at the role of biometrics and privacy in the COVID-19 era. During the virtual symposium, expert discussants and presenters examined the impact of biometrics in the ongoing fight against the novel coronavirus. Discussions about law and policy were enhanced by demonstrations of the latest facial recognition and sensing technology and privacy controls from researchers at CMU’s CyLab.
On October 27 and 28, FPF hosted a workshop titled Privacy & Pandemics: Responsible Uses of Technology and Health Data During Times of Crisis. Dr. Lauren Gardner, creator of Johns Hopkins University’s COVID-19 Dashboard, and UC Berkeley data analytics researcher Dr. Katherine Yelick were keynote speakers. The workshop – held in collaboration with the National Science Foundation, Duke Sanford School of Public Policy, SFI ADAPT Research Centre, Dublin City University, OneTrust, and the Intel Corporation – also featured wide-ranging conversations with participants from the fields of data and computer science, public health, law and policy. Following the workshop, FPF prepared a report for the National Science Foundation to help speed the transition of research into practice to address this challenge of national importance.
Global Expertise & Leadership
FPF’s international work continued to expand in 2020 as policymakers around the world focused on ways to improve privacy frameworks. More than 120 countries have enacted a privacy or data protection law, and FPF both closely followed and advised on significant developments in the European Union, Latin America, and Asia.
We were proud to announce our new partnership with Dublin City University (DCU), which will lead to joint conferences and workshops, collaborative research projects, joint resources for policymakers, and applications for research opportunities. DCU is home to some of the leading AI-focused research and scholarship programs in Ireland. DCU is a lead university for the Science Foundation Ireland ADAPT program and hosts the consortium leadership for the INSIGHT research centre, two of the largest government-funded AI and tech-focused development programs. Notably, the collaboration has already resulted in a joint webinar, The Independent and Effective DPO: Legal and Policy Perspectives, and we’re excited about further collaboration in 2021 and beyond.
Following the Prime Minister of Israel’s announcement that the government planned to use technology to address the spread of COVID-19, Limor Shmerling Magazanik, Managing Director of the Israel Tech Policy Institute, published recommendations to ensure a balance between civilian freedoms and public health. Specifically, her recommendations centered around ensuring transparency, limits on the length of time that data is held, requiring a clear purpose for data collection, and robust security.
The Schrems II decision from the Court of Justice of the European Union held serious consequences for data flows from the EU to the United States, as well as to most other countries in the world. In advance of the decision, FPF published a guide, What to Expect from the Court of Justice of the EU in the Schrems II Decision This Week, by FPF’s Dr. Gabriela Zanfir-Fortuna. FPF also conducted a study of the companies enrolled in the cross-border privacy program known as Privacy Shield, finding that 259 European-headquartered companies are active Privacy Shield participants.
Leveraging our growing focus on the international privacy landscape, and as part of our growing work related to education privacy, we published a report titled The General Data Protection Regulation: An Analysis and Guidance for US Higher Education Institutions, authored by FPF Senior Counsel Dr. Gabriela Zanfir-Fortuna. The report contains analysis and guidance to assist U.S.-based higher education institutions and their edtech service providers in assessing their compliance with the European Union’s General Data Protection Regulation.
We hosted several events and roundtables in Europe. On December 2, 2020, the fourth iteration of the Brussels Privacy Symposium, Research and the Protection of Personal Data Under the GDPR, took place as a virtual international meeting where industry privacy leaders, academic researchers, and regulators discussed the present and future of data protection in the context of scientific data-based research and in the age of COVID. The virtual event is the latest aspect of an ongoing partnership between FPF and Vrije Universiteit Brussel (VUB). Keynote speakers were Malte Beyer-Katzenberger, Policy Officer at the European Commission; Dr. Wojciech Wiewiórowski, the European Data Protection Supervisor; and Microsoft’s Cornelia Kutterer. Their presentations sparked engaging conversations on the complex interactions between data protection and research, as well as the ways in which processing sensitive data can present privacy risks but can also unearth covert bias and discrimination.
Scholarship & Analysis on Impactful Topics
The core of our work is providing insightful analysis on prevailing privacy issues. FPF convenes industry experts, academics, consumer advocates, and other thought leaders to explore the challenges posed by technological innovation, and develop privacy protections, ethical norms, and workable business practices. In 2020 – through events, awards, infographic guides, papers, studies, or briefings – FPF provided thoughtful leadership on issues ranging from corporate-academic data sharing to encryption.
In mid-May, the Future of Privacy Forum announced the winners for the first-ever FPF Award for Research Data Stewardship: Professor Mark Steyvers, University of California, Irvine Department of Cognitive Sciences, and Lumos Labs. The first-of-its-kind award recognizes a privacy protective research collaboration between a company and academic researchers, based on the notion that when privately held data is responsibly shared with academic researchers, it can support significant progress in medicine, public health, education, social science, and other fields. In October, FPF hosted a virtual event honoring the winners, featuring – in addition to the awardees – Daniel L. Goroff, Vice President and Program Director at the Alfred P. Sloan Foundation, which funded the award, as well as FPF CEO Jules Polonetsky and FPF Policy Counsel Dr. Sara Jordan.
FPF published an interactive visual guide, Strong Data Encryption Protects Everyone, illustrating how strong encryption protects individuals, enterprises, and the government. The guide also highlights key risks that arise when crypto safeguards are undermined – risks that can expose sensitive health and financial records, undermine the security of critical infrastructure, and enable interception of officials’ confidential communications.
Over the summer, we published interviews with senior FPF policy experts about their work on important privacy issues. As part of this series of internal interviews, we spoke with FPF Health Policy Counsel Dr. Rachele Hendricks-Sturrup, FPF Director of Technology and Privacy Research Christy Harris, FPF Managing Director for Europe Rob van Eijk, and FPF Policy Counsel Chelsey Colbert.
FPF Policy Fellow Casey Waughn, supported by Anisha Reddy and Juliana Cotto from FPF, and Antwan Perry, Donna Harris-Aikens, and Justin Thompson at the National Education Association, released new recommendations for the use of video conferencing platforms in online learning. The recommendations ask schools and districts to reconsider requiring students to have their cameras turned on during distance learning, since such requirements create unique privacy and equity risks for students, including increased data collection, an implied lack of trust, and the conflating of students’ school and home lives.
Following the 2020 election, FPF hosted several events looking ahead to the policy implications of a new Administration and Congress in 2021, including a roundtable discussion where Jules was joined by Jonathan Baron, Principal of Baron Public Affairs and a leading policy and political risk strategist, as well as FPF’s Global and Europe leads, Dr. Gabriela Zanfir-Fortuna and Rob van Eijk. In addition, Jules, FPF Senior Fellow Peter Swire, VP of Policy John Verdi, and Senior Counsel Stacey Gray held a briefing with members of the media to discuss what the Biden administration, the FTC, and the states are likely to accomplish on privacy in the coming year. IAPP published an article summarizing the briefing.
Linking Equity & Fairness with Privacy
Alongside a pandemic that forced us to shift our priorities, 2020 saw a needed national reckoning with issues related to diversity and equity. From racial justice and the LGBTQ+ community to child rights, FPF took conscious steps to reflect on, understand, and address essential questions related to equity and fairness in the context of privacy.
The data protection community has particular challenges as we grapple with the many ways that data can be used unfairly. In response, our team has focused on listening and learning from leaders with diverse life and professional experiences to help shape more careful thinking about data and discrimination. As part of that project, we published remarks on diversity and inclusion from Macy’s Chief Privacy Officer and FPF Advisory Board member Michael McCullough delivered at the WireWheel Spokes 2020 conference. We also discussed Ruha Benjamin’s Race After Technology: Abolitionist Tools for the New Jim Code as part of our ongoing book club series and were honored to be joined by the author for the discussion.
LGBTQ+ rights are, and have always been, linked with privacy. Over the years, privacy-invasive laws, practices, and norms have been used to oppress LGBTQ+ individuals by criminalizing and stigmatizing individuals on the basis of their sexual behavior, sexuality, and gender expression. In honor of October as LGBTQ+ History Month, FPF and LGBT Tech explored three of the most significant privacy invasions impacting the LGBTQ+ community in modern U.S. history: anti-sodomy laws; the “Lavender Scare” beginning in the 1950s; and privacy invasions during the HIV/AIDS epidemic. These examples and many more were discussed as part of a LinkedIn Live event on International Human Rights Day, featuring LGBT Tech Executive Director Christopher Wood, FPF Founder and Board Chair Christopher Wolf, LGBT Tech Deputy Director and General Counsel Carlos Gutierrez, FPF Policy Counsel Dr. Sara Jordan, and FPF Christopher Wolf Diversity Law Fellow Katelyn Ringrose.
FPF submitted feedback and comments to the United Nations Children’s Fund (UNICEF) on the Draft Policy Guidance on Artificial Intelligence (AI) for Children, which seeks “to promote children’s rights in government and private sector AI policies and practices, and to raise awareness of how AI systems can uphold or undermine children’s rights.” FPF encouraged UNICEF to adopt an approach that accounts for the diversity of childhood experiences across countries and contexts. Earlier in October, FPF also submitted comments to the United Nations Office of the High Commissioner for Human Rights Special Rapporteur on the right to privacy to inform the Special Rapporteur’s upcoming report on the privacy rights of children. FPF will continue to provide expertise and insight on child and student privacy, AI, and ethics to agencies, governments, and corporations to promote the best interests of children.
This post is by no means an exhaustive list of our most important work in 2020, but we hope it gives you a sense of the scope of our impact. On behalf of everyone at FPF, best wishes for 2021!
FPF Health and AI & Ethics Policy Counsels Present a Scientific Position at ICML 2020 and at 2020 CCSQ World Usability Day
Drs. Hendricks-Sturrup and Jordan presented their position that patient-reported outcomes (PRO) data requires special privacy and security considerations, given the nature of the data, when used within on-device or federated machine learning constructs as well as in the development of artificial intelligence platforms. Patient-reported outcomes are a raw form of patient expression and feedback: they help clinicians, researchers, medical device and drug manufacturers, and the governmental stakeholders overseeing device and drug development, distribution, and safety to monitor, understand, and document, in a readable format, patients’ symptoms, preferences, complaints, and experiences following a clinical intervention. Gathering and using such data requires careful attention to security, data architecture, data use, and machine-readable consent tools and privacy policies.
Even so, on-device patient-reported outcome measurement tools, such as patient surveys within third-party mobile apps that use machine learning, may employ the best machine-readable privacy policies or consent mechanisms yet ultimately leave key components of privacy protection up to the patient-user. Keeping data in users’ hands opens those users up to unanticipated attack vectors from adversaries seeking to extract valuable machine learning models or to uncover data about a specific patient.
Drs. Hendricks-Sturrup and Jordan recommended that developers of patient-reported outcome measurement systems leveraging federated learning architectures:
Intentionally design user device security, as well as security for the transmission of data (raw or processed) or model gradients, to the highest level of protection that does not degrade essential performance for critical health and safety monitoring procedures (e.g., remote monitoring for clinical trials, post-market drug safety surveillance, hospital performance scores);
Ensure that models are not compromised and that valuable machine learning investment is not lost to competitors;
Design systems to operate atop a federated machine learning architecture, both when model components are sent and when gradients are received, to ensure the privacy of users’ data; and
Design learning algorithms with algorithmic privacy techniques, such as differential privacy, which are essential to secure valuable and sensitive PRO data.
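To give a flavor of the differential privacy technique named in the last recommendation, the sketch below applies the standard Laplace mechanism to an aggregate of bounded PRO scores. The code is illustrative only and not drawn from the paper; the function names, score scale, and clipping bounds are assumptions for the example.

```python
import random


def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise as the difference of two
    independent exponential draws with mean `scale`."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)


def private_mean(scores, lo, hi, epsilon):
    """Release the mean of PRO scores with epsilon-differential privacy.

    Scores are clipped to [lo, hi] so the sensitivity of the mean is
    bounded by (hi - lo) / n; Laplace noise calibrated to that
    sensitivity is then added (the basic Laplace mechanism).
    """
    clipped = [min(max(s, lo), hi) for s in scores]
    true_mean = sum(clipped) / len(clipped)
    sensitivity = (hi - lo) / len(clipped)
    return true_mean + laplace_noise(sensitivity / epsilon)


# Example: a noisy aggregate of symptom-severity scores on a 0-10 scale.
noisy = private_mean([3, 5, 7, 9], lo=0, hi=10, epsilon=1.0)
```

Smaller epsilon values add more noise and give stronger privacy; the clipping step is what keeps the sensitivity, and hence the noise scale, bounded even if a device reports an out-of-range value.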
Drs. Hendricks-Sturrup and Jordan’s paper, poster, and presentation can be found at these links:
To learn more about FPF’s Health & Genetics and AI & Ethics initiatives, contact Drs. Hendricks-Sturrup and Jordan, respectively, at: [email protected] and [email protected].
Workshop Report: Privacy & Pandemics – Responsible Use of Data During Times of Crisis
In October 2020, the Future of Privacy Forum (FPF) convened a virtual workshop, “Privacy and Pandemics: Responsible Uses of Technology and Health Data During Times of Crisis,” bringing together invited computer science, privacy law, public policy, social science, and health information experts from around the world. Participants examined the benefits, risks, and strategies of collecting and using data in support of public health initiatives responding to COVID-19, with an eye toward future public health crises. With support from the National Science Foundation, Intel Corporation, Duke Sanford School of Public Policy, and Dublin City University’s SFI ADAPT Research Centre, the workshop identified research priorities to improve data governance systems and structures in the context of the COVID-19 pandemic.
Drawing on the expertise of workshop participant submissions and session discussions, FPF prepared a workshop report which was submitted to the National Science Foundation for use in planning the Convergence Accelerator 2021 Workshops. This NSF program aims to speed the transition of convergence research into practice to address grand challenges of national importance. The final submitted workshop report is also available on our website.
Based on an analysis of the expert positions reflected in the workshop, FPF recommends that NSF consider the following roadmap for research, practice improvements, and the development of privacy-preserving products and services to inform responses to the COVID-19 crisis and to prepare for future pandemics or other public crises:
Support the refinement and application of existing privacy-preserving and privacy-enhancing technologies that can support public health goals while mitigating privacy risks, including: decentralized contact tracing, homomorphic encryption, and differential privacy;
Support the development of emerging privacy-enhancing technologies that hold promise in the public health sphere, including: synthetic data, controlled access environments, digital twins, and simulations;
Support cross-disciplinary research into privacy-protective approaches to key emerging technologies, including: Wireless Sensor Networks and data processing strategies for on-device and/or centralized analysis of personal health information;
Explore mechanisms that balance researchers’ need for increased access to data, which allows them to understand the differential impacts of crises on certain communities, against the risk that privacy-enhancing technologies obscure critical community characteristics;
Convene cross-disciplinary experts to create and refine guidance for implementation of privacy protections suited to crisis situations;
Identify top-priority updates to laws and regulations pertaining to public health;
Explore mechanisms that promote data interoperability while protecting privacy;
Promote the development of promising de-identification technologies and mitigation strategies to address re-identification risks;
Promote practical, implementable ethical frameworks that go beyond the FAIR principles; and
Identify practical lessons learned during the COVID-19 pandemic regarding publication ethics and norms for research in a time of crisis that can apply to future crises.
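One of the roadmap items above concerns de-identification technologies and re-identification risk. A common yardstick for that risk is k-anonymity: the size of the smallest group of records sharing a combination of quasi-identifiers. The sketch below is illustrative only (the field names and records are invented for the example), not taken from the workshop report.

```python
from collections import Counter


def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity level of a record set: the smallest number
    of records sharing any one combination of quasi-identifier values.
    Higher k means re-identification by linkage is harder."""
    groups = Counter(
        tuple(record[q] for q in quasi_identifiers) for record in records
    )
    return min(groups.values())


# Example: ZIP code and age band act as quasi-identifiers for test results.
records = [
    {"zip": "20001", "age_band": "30-39", "result": "positive"},
    {"zip": "20001", "age_band": "30-39", "result": "negative"},
    {"zip": "20002", "age_band": "40-49", "result": "positive"},
]
# The lone (20002, 40-49) record makes this dataset only 1-anonymous.
k = k_anonymity(records, ["zip", "age_band"])
```

In practice, generalizing quasi-identifiers (for instance, coarsening ZIP codes to a region) raises k at the cost of analytic precision, which is exactly the access-versus-obscuring tension the roadmap asks researchers to explore.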