FPF in 2020: Adjusting to the Unexpected
With 2020 fast coming to a close, we wanted to take a moment to reflect on a year that forced us to refocus our priorities, along with much of our lives, while we continued to produce engaging events, thought-provoking analysis, and insightful reports.
Considering Privacy & Ethics at the Dawn of a New Decade
Early in the year, our eyes were on the future – at least through the rest of the decade – and how privacy and ethical considerations would impact our lives and new and upcoming technologies over the next ten years.
The Privacy 2020: 10 Privacy Risks and 10 Privacy Enhancing Technologies to Watch in the Next Decade white paper, co-authored by FPF CEO Jules Polonetsky and hacklawyER Founder Elizabeth Renieris, helped corporate officers, nonprofit leaders, and policymakers better understand the privacy risks that will grow to prominence during the 2020s, as well as rising technologies that will be used to help manage privacy through the decade.
In February, we hosted the 10th annual Privacy Papers for Policymakers event, which recognizes the year's leading privacy research and analytical work relevant to policymakers in the United States Congress, federal agencies, and international data protection authorities. We were honored to be joined by FTC Commissioner Christine S. Wilson, who keynoted the event, and by leading privacy scholars and policy and regulatory staff.
FPF CEO Jules Polonetsky delivered a keynote address at RSA Conference 2020 in San Francisco, Navigating Privacy in a Data-Driven World: Treating Privacy as a Human Right. In his speech, Jules outlined the limitations of consumer protection laws in protecting individuals’ privacy and explored how to best safeguard data to protect human rights.
“Are corporations having too much power over individuals because of how much data they have? Are foreign countries interfering in our elections? Are automated decisions being made where I’ll be turned down for healthcare, I’ll be turned down for insurance, my probation will be extended? These are not [only] privacy issues, right? These are issues of power. These are issues of human rights at the end of the day.”
Adjusting to a One-in-a-Century Pandemic
By March, it was clear that the COVID-19 pandemic would impact every aspect of our lives, and we moved nimbly to respond and re-assess our immediate priorities. FPF launched the Privacy and Pandemics series, a collection of resources published throughout the year that explores the challenges the COVID-19 pandemic poses to existing ethical, privacy, and data protection frameworks. The series provides information and guidance to governments, companies, academics, and civil society organizations interested in responsible data sharing to support the public health response. Some of the initial materials published in the spring as part of this series included:
- A Closer Look at Location Data: Privacy & Pandemics. In light of COVID-19, interest grew in harnessing location data held by major tech companies to track individuals affected by the virus, better understand the effectiveness of social distancing, and send alerts to individuals who might be affected. FPF addressed a range of privacy and ethics concerns in a brief explainer guide covering (1) what location data is, (2) who holds it, and (3) how it is collected.
- FAQs: Disclosing Student Health Information During the COVID-19 Pandemic. FPF and AASA, The School Superintendents Association, released a white paper offering guidance to help K-12 and higher education administrators and educators protect student privacy during the COVID-19 pandemic.
- Privacy & Pandemics: The Role of Mobile Apps (Chart). Multiple apps and software development kits (SDK) were deployed to help both privacy and public entities tackle the COVID-19 pandemic. In order to better understand these technologies, FPF created a comparison chart to contrast the objectives and methods of specific SDK apps.
In April, FPF Senior Counsel Stacey Gray provided the Senate Committee on Commerce, Science, and Transportation with written testimony, including recommendations based on how experts in the U.S. and around the world were mitigating the risks of using data to combat the COVID-19 pandemic.
Experts have used machine learning technologies to study the virus, test potential treatments, diagnose individuals, analyze the public health impacts, and more. In early May, FPF Policy Counsel Dr. Sara Jordan and FPF Senior Counsel Brenda Leong published a resource covering leading efforts and the data protection and ethical issues related to machine learning and COVID-19.
As part of our ongoing Privacy and Pandemics series, FPF, Highmark Health, and Carnegie Mellon University's CyLab Security and Privacy Institute hosted a virtual symposium that took an in-depth look at the role of biometrics and privacy in the COVID-19 era. During the symposium, expert discussants and presenters examined the impact of biometrics in the ongoing fight against the novel coronavirus. Discussions about law and policy were enhanced by demonstrations of the latest facial recognition and sensing technology and privacy controls from researchers at CMU's CyLab.
On October 27 and 28, FPF hosted a workshop titled Privacy & Pandemics: Responsible Uses of Technology and Health Data During Times of Crisis. Dr. Lauren Gardner, creator of Johns Hopkins University's COVID-19 Dashboard, and UC Berkeley data analytics researcher Dr. Katherine Yelick were keynote speakers. The workshop – held in collaboration with the National Science Foundation, Duke Sanford School of Public Policy, SFI ADAPT Research Centre, Dublin City University, OneTrust, and the Intel Corporation – also featured wide-ranging conversations with participants from the fields of data and computer science, public health, and law and policy. Following the workshop, FPF prepared a report for the National Science Foundation to help speed the transition of research into practice to address this challenge of national importance.
Global Expertise & Leadership
FPF's international work continued to expand in 2020 as policymakers around the world focused on ways to improve privacy frameworks. More than 120 countries have enacted a privacy or data protection law, and FPF both closely followed and advised on significant developments in the European Union, Latin America, and Asia.
We were proud to announce our new partnership with Dublin City University (DCU), which will lead to joint conferences and workshops, collaborative research projects, joint resources for policymakers, and applications for research opportunities. DCU is home to some of the leading AI-focused research and scholarship programs in Ireland: it is a lead university for the Science Foundation Ireland ADAPT program and hosts the consortium leadership for the INSIGHT research centre, two of the largest government-funded AI and tech-focused development programs. Notably, the collaboration has already resulted in a joint webinar, The Independent and Effective DPO: Legal and Policy Perspectives, and we're excited about further collaboration in 2021 and beyond.
Following the Prime Minister of Israel’s announcement that the government planned to use technology to address the spread of COVID-19, Limor Shmerling Magazanik, Managing Director of the Israel Tech Policy Institute, published recommendations to ensure a balance between civilian freedoms and public health. Specifically, her recommendations centered around ensuring transparency, limits on the length of time that data is held, requiring a clear purpose for data collection, and robust security.
The Schrems II decision from the Court of Justice of the European Union had serious consequences for data flows from the EU to the United States, as well as to most other countries in the world. In advance of the decision, FPF published a guide, What to Expect from the Court of Justice of the EU in the Schrems II Decision This Week, by FPF's Dr. Gabriela Zanfir-Fortuna. FPF also conducted a study of the companies enrolled in the cross-border privacy program known as Privacy Shield, finding that 259 European-headquartered companies are active Privacy Shield participants.
We released many papers and blog posts analyzing privacy legislation in the EU, Brazil, South Korea, Singapore, India, Canada, New Zealand, and elsewhere. One example was the white paper published in May titled New Decade, New Priorities: A summary of twelve European Data Protection Authorities' strategic and operational plans for 2020 and beyond. The paper provides guidance on the priorities and focus areas that European Data Protection Authorities (DPAs) consider top concerns for the 2020s and beyond.
Leveraging our growing focus on the international privacy landscape, and as part of our growing work related to education privacy, we published a report titled The General Data Protection Regulation: An Analysis and Guidance for US Higher Education Institutions, authored by FPF Senior Counsel Dr. Gabriela Zanfir-Fortuna. The report contains analysis and guidance to assist U.S.-based higher education institutions and their edtech service providers in assessing their compliance with the European Union’s General Data Protection Regulation.
We hosted several events and roundtables in Europe. On December 2, 2020, the fourth iteration of the Brussels Privacy Symposium, Research and the Protection of Personal Data Under the GDPR, took place as a virtual international meeting where industry privacy leaders, academic researchers, and regulators discussed the present and future of data protection in the context of scientific data-based research and in the age of COVID. The virtual event is the latest aspect of an ongoing partnership between FPF and Vrije Universiteit Brussel (VUB). Keynote speakers were Malte Beyer-Katzenberger, Policy Officer at the European Commission; Dr. Wojciech Wiewiórowski, the European Data Protection Supervisor; and Microsoft's Cornelia Kutterer. Their presentations sparked engaging conversations on the complex interactions between data protection and research, as well as the ways in which processing sensitive data can both present privacy risks and unearth covert bias and discrimination.
Scholarship & Analysis on Impactful Topics
The core of our work is providing insightful analysis on prevailing privacy issues. FPF convenes industry experts, academics, consumer advocates, and other thought leaders to explore the challenges posed by technological innovation and to develop privacy protections, ethical norms, and workable business practices. In 2020 – through events, awards, infographic guides, papers, studies, and briefings – FPF provided thoughtful leadership on issues ranging from corporate-academic data sharing to encryption.
In mid-May, the Future of Privacy Forum announced the winners for the first-ever FPF Award for Research Data Stewardship: Professor Mark Steyvers, University of California, Irvine Department of Cognitive Sciences, and Lumos Labs. The first-of-its-kind award recognizes a privacy protective research collaboration between a company and academic researchers, based on the notion that when privately held data is responsibly shared with academic researchers, it can support significant progress in medicine, public health, education, social science, and other fields. In October, FPF hosted a virtual event honoring the winners, featuring – in addition to the awardees – Daniel L. Goroff, Vice President and Program Director at the Alfred P. Sloan Foundation, which funded the award, as well as FPF CEO Jules Polonetsky and FPF Policy Counsel Dr. Sara Jordan.
Expanding upon its industry-leading best practices, in July FPF published Consumer Genetic Testing Companies & The Role of Transparency Reports in Revealing Government Requests for User Data, examining how leading consumer genetic testing companies require valid legal processes before disclosing consumer genetic information to the government, and how companies publish transparency reports around such disclosures.
FPF published an interactive visual guide, Strong Data Encryption Protects Everyone, illustrating how strong encryption protects individuals, enterprises, and the government. The guide also highlights key risks that arise when crypto safeguards are undermined – risks that can expose sensitive health and financial records, undermine the security of critical infrastructure, and enable interception of officials’ confidential communications.
Over the summer, we published interviews with senior FPF policy experts about their work on important privacy issues. As part of this series of internal interviews, we spoke with FPF Health Policy Counsel Dr. Rachele Hendricks-Sturrup, FPF Director of Technology and Privacy Research Christy Harris, FPF Managing Director for Europe Rob van Eijk, and FPF Policy Counsel Chelsey Colbert.
FPF Policy Fellow Casey Waughn, supported by Anisha Reddy and Juliana Cotto from FPF, and Antwan Perry, Donna Harris-Aikens, and Justin Thompson at the National Education Association, released new recommendations for the use of video conferencing platforms in online learning. The recommendations ask schools and districts to reconsider requiring students to have their cameras turned on during distance learning, because such requirements create unique privacy and equity risks for students, including increased data collection, an implied lack of trust, and the conflation of students' school and home lives.
Following the 2020 election, FPF hosted several events looking ahead to the policy implications of a new Administration and Congress in 2021, including a roundtable discussion where Jules was joined by Jonathan Baron, Principal of Baron Public Affairs and a leading policy and political risk strategist, as well as FPF's Global and Europe leads, Dr. Gabriela Zanfir-Fortuna and Rob van Eijk. In addition, Jules, FPF Senior Fellow Peter Swire, VP of Policy John Verdi, and Senior Counsel Stacey Gray held a briefing with members of the media to discuss what the Biden administration, the FTC, and the states are expected to accomplish on privacy in the coming year. IAPP published an article summarizing the briefing.
Linking Equity & Fairness with Privacy
Alongside a pandemic that forced us to shift our priorities, 2020 saw a needed national reckoning with issues related to diversity and equity. From racial justice and the LGBTQ+ community to child rights, FPF took conscious steps to reflect on, understand, and address essential questions related to equity and fairness in the context of privacy.
The data protection community has particular challenges as we grapple with the many ways that data can be used unfairly. In response, our team has focused on listening and learning from leaders with diverse life and professional experiences to help shape more careful thinking about data and discrimination. As part of that project, we published remarks on diversity and inclusion delivered by Macy's Chief Privacy Officer and FPF Advisory Board member Michael McCullough at the WireWheel Spokes 2020 conference. We also discussed Ruha Benjamin's Race After Technology: Abolitionist Tools for the New Jim Code as part of our ongoing book club series and were honored to be joined by the author for the discussion.
LGBTQ+ rights are, and have always been, linked with privacy. Over the years, privacy-invasive laws, practices, and norms have been used to oppress LGBTQ+ individuals by criminalizing and stigmatizing individuals on the basis of their sexual behavior, sexuality, and gender expression. In honor of October as LGBTQ+ History Month, FPF and LGBT Tech explored three of the most significant privacy invasions impacting the LGBTQ+ community in modern U.S. history: anti-sodomy laws; the “Lavender Scare” beginning in the 1950s; and privacy invasions during the HIV/AIDS epidemic. These examples and many more were discussed as part of a LinkedIn Live event on International Human Rights Day, featuring LGBT Tech Executive Director Christopher Wood, FPF Founder and Board Chair Christopher Wolf, LGBT Tech Deputy Director and General Counsel Carlos Gutierrez, FPF Policy Counsel Dr. Sara Jordan, and FPF Christopher Wolf Diversity Law Fellow Katelyn Ringrose.
FPF submitted feedback and comments to the United Nations Children’s Fund (UNICEF) on the Draft Policy Guidance on Artificial Intelligence (AI) for Children, which seeks “to promote children’s rights in government and private sector AI policies and practices, and to raise awareness of how AI systems can uphold or undermine children’s rights.” FPF encouraged UNICEF to adopt an approach that accounts for the diversity of childhood experiences across countries and contexts. Earlier in October, FPF also submitted comments to the United Nations Office of the High Commissioner for Human Rights Special Rapporteur on the right to privacy to inform the Special Rapporteur’s upcoming report on the privacy rights of children. FPF will continue to provide expertise and insight on child and student privacy, AI, and ethics to agencies, governments, and corporations to promote the best interests of children.
This post is by no means an exhaustive list of our most important work in 2020, but we hope it gives you a sense of the scope of our impact. On behalf of everyone at FPF, best wishes for 2021!