Snap into Data Privacy Day

Data is increasingly the lifeblood of commerce, research, and government. Data analysis is providing new insights into our health, making transportation safer, and increasing the usefulness of online services. It is a key to developing medical breakthroughs, educating students during the pandemic, and combating inequity and discrimination. 

But if society is going to enjoy the benefits of data, we need to empower individuals and ensure that businesses respect privacy, safeguard data, and earn trust. That’s our message for Data Privacy Day, celebrated yearly on January 28. To bring the message to a broad audience, we’re pleased to be partnering with Snapchat to provide a privacy-themed Snap filter for Data Privacy Day 2021.

Global Privacy Day Snapcode
Scan here to access the filter – it’s only available on Jan. 28, 2021! 

The Future of Privacy Forum (FPF) is a non-profit organization that brings together privacy scholars, researchers, advocates, and industry leaders around the world to explore solutions and advance principled data practices in support of emerging technologies. 

We also help individuals think about privacy choices. We’ve explored how to shop for an extended reality gaming system, the privacy considerations that go into purchasing a genetic test, and whether it’s a good idea to require students to turn on their video feeds during virtual classes.

We analyze the legal and policy issues that impact consumer privacy now and in the future. If you want to get a sense of those topics, check out our infographics on connected cars, artificial intelligence, and encryption. Or explore our wealth of resources regarding data protection for students, parents, educators, and more.

If you’re interested in exploring the issues that are driving the future of privacy, sign up for our monthly briefing or check out one of our virtual events. You’ll find some of the smartest minds in privacy, discussing how best to ensure society benefits from the insights gained from data, while respecting individual choice and autonomy. 

P.S.

Why are we and many others around the world marking Data Privacy Day on January 28? It’s the anniversary of an important global data protection treaty known as Convention 108, a standard that has influenced the development of data protection laws around the world. We are hoping that 2021 will be the year the United States steps up and joins the many nations that have passed a comprehensive data protection law! 

FPF in 2020: Adjusting to the Unexpected

With 2020 fast coming to a close, we wanted to take a moment to reflect on a year that forced us to re-focus our priorities, along with much of our lives, while continuing to produce engaging events, thought-provoking analysis, and insightful reports. 

Considering Privacy & Ethics at the Dawn of a New Decade

Early in the year, our eyes were on the future – at least through the rest of the decade – and how privacy and ethical considerations would impact our lives and new and upcoming technologies over the next ten years. 

The Privacy 2020: 10 Privacy Risks and 10 Privacy Enhancing Technologies to Watch in the Next Decade white paper, co-authored by FPF CEO Jules Polonetsky and hacklawyER Founder Elizabeth Renieris, helped corporate officers, nonprofit leaders, and policymakers better understand the privacy risks that will grow to prominence during the 2020s, as well as rising technologies that will be used to help manage privacy through the decade.

In February, we hosted the 10th-annual Privacy Papers for Policymakers event, which recognizes the year’s leading privacy research and analytical work relevant to policymakers in the United States Congress, federal agencies, and international data protection authorities. We were honored to be joined by FTC Commissioner Christine S. Wilson, who keynoted the event, and by leading privacy scholars and policy and regulatory staff. 

FPF CEO Jules Polonetsky delivered a keynote address at RSA Conference 2020 in San Francisco, Navigating Privacy in a Data-Driven World: Treating Privacy as a Human Right. In his speech, Jules outlined the limitations of consumer protection laws in protecting individuals’ privacy and explored how to best safeguard data to protect human rights. 

“Are corporations having too much power over individuals because of how much data they have? Are foreign countries interfering in our elections? Are automated decisions being made where I’ll be turned down for healthcare, I’ll be turned down for insurance, my probation will be extended? These are not [only] privacy issues, right? These are issues of power. These are issues of human rights at the end of the day.”

Adjusting to a Once-in-a-Century Pandemic

By March, it was clear that the COVID-19 pandemic would impact every aspect of our lives, and we moved nimbly to respond and re-assess our immediate priorities. FPF launched the Privacy and Pandemics series, a collection of resources published throughout the year that explores the challenges the COVID-19 pandemic poses to existing ethical, privacy, and data protection frameworks. The series seeks to provide information and guidance to governments, companies, academics, and civil society organizations interested in responsible data sharing to support the public health response. Some of the initial materials published in the spring as part of this series included: 

In April, FPF Senior Counsel Stacey Gray provided the Senate Committee on Commerce, Science, and Transportation with written testimony, including recommendations based on how experts in the U.S. and around the world are currently mitigating the risks of using data to combat the COVID-19 pandemic. 

Experts have used machine learning technologies to study the virus, test potential treatments, diagnose individuals, analyze the public health impacts, and more. In early May, FPF Policy Counsel Dr. Sara Jordan and FPF Senior Counsel Brenda Leong published a resource covering leading efforts, as well as the data protection and ethical issues, related to machine learning and COVID-19. 

As part of our ongoing Privacy and Pandemics series, FPF, Highmark Health, and Carnegie Mellon University’s CyLab Security and Privacy Institute hosted a virtual symposium that took an in-depth look at the role of biometrics and privacy in the COVID-19 era. During the virtual symposium, expert discussants and presenters examined the impact of biometrics in the ongoing fight against the novel coronavirus. Discussions about law and policy were enhanced by demonstrations of the latest facial recognition and sensing technology and privacy controls from researchers at CMU’s CyLab. 

On October 27 and 28, FPF hosted a workshop titled Privacy & Pandemics: Responsible Uses of Technology and Health Data During Times of Crisis. Dr. Lauren Gardner, creator of Johns Hopkins University’s COVID-19 Dashboard, and UC Berkeley data analytics researcher Dr. Katherine Yelick were keynote speakers. The workshop – held in collaboration with the National Science Foundation, Duke Sanford School of Public Policy, SFI ADAPT Research Centre, Dublin City University, OneTrust, and the Intel Corporation – also featured wide-ranging conversations with participants from the fields of data and computer science, public health, law and policy. Following the workshop, FPF prepared a report for the National Science Foundation to help speed the transition of research into practice to address this challenge of national importance. 

Global Expertise & Leadership

FPF’s international work continued to expand in 2020, as policymakers around the world are focused on ways to improve privacy frameworks. More than 120 countries have enacted a privacy or data protection law, and FPF both closely followed and advised upon significant developments in the European Union, Latin America, and Asia. 

We were proud to announce our new partnership with Dublin City University (DCU), which will lead to joint conferences and workshops, collaborative research projects, joint resources for policymakers, and applications for research opportunities. DCU is home to some of the leading AI-focused research and scholarship programs in Ireland. DCU is a lead university for the Science Foundation Ireland ADAPT program and hosts the consortium leadership for the INSIGHT research centre, two of the largest government-funded AI and tech-focused development programs. Notably, the collaboration has already resulted in a joint webinar, The Independent and Effective DPO: Legal and Policy Perspectives, and we’re excited about further collaboration in 2021 and beyond.

Following the Prime Minister of Israel’s announcement that the government planned to use technology to address the spread of COVID-19, Limor Shmerling Magazanik, Managing Director of the Israel Tech Policy Institute, published recommendations to ensure a balance between civilian freedoms and public health. Specifically, her recommendations centered around ensuring transparency, limits on the length of time that data is held, requiring a clear purpose for data collection, and robust security. 

The Schrems II decision from the Court of Justice of the European Union held serious consequences for data flows from the EU to the United States, as well as to most other countries in the world. In advance of the decision, FPF published a guide, What to Expect from the Court of Justice of the EU in the Schrems II Decision This Week, by FPF’s Dr. Gabriela Zanfir-Fortuna. FPF also conducted a study of the companies enrolled in the cross-border privacy program called Privacy Shield, finding that 259 European-headquartered companies are active Privacy Shield participants. 

We released many papers and blog posts analyzing privacy legislation in the EU, Brazil, South Korea, Singapore, India, Canada, New Zealand, and elsewhere. One example was the white paper published in May titled New Decade, New Priorities: A summary of twelve European Data Protection Authorities’ strategic and operational plans for 2020 and beyond. The paper provides guidance on the priorities and focus areas considered top concerns among European Data Protection Authorities (DPAs) for the 2020s and beyond. 

Leveraging our growing focus on the international privacy landscape, and as part of our growing work related to education privacy, we published a report titled The General Data Protection Regulation: An Analysis and Guidance for US Higher Education Institutions, authored by FPF Senior Counsel Dr. Gabriela Zanfir-Fortuna. The report contains analysis and guidance to assist U.S.-based higher education institutions and their edtech service providers in assessing their compliance with the European Union’s General Data Protection Regulation. 

We hosted several events and roundtables in Europe. On December 2, 2020, the fourth iteration of the Brussels Privacy Symposium, Research and the Protection of Personal Data Under the GDPR, took place as a virtual international meeting where industry privacy leaders, academic researchers, and regulators discussed the present and future of data protection in the context of scientific data-based research and in the age of COVID-19. The virtual event is the latest aspect of an ongoing partnership between FPF and Vrije Universiteit Brussel (VUB). Keynote speakers were Malte Beyer-Katzenberger, Policy Officer at the European Commission; Dr. Wojciech Wiewiórowski, the European Data Protection Supervisor; and Microsoft’s Cornelia Kutterer. Their presentations sparked engaging conversations on the complex interactions between data protection and research, as well as the ways in which processing of sensitive data can present privacy risks but also unearth covert bias and discrimination. 

Scholarship & Analysis on Impactful Topics

The core of our work is providing insightful analysis on prevailing privacy issues. FPF convenes industry experts, academics, consumer advocates, and other thought leaders to explore the challenges posed by technological innovation, and develop privacy protections, ethical norms, and workable business practices. In 2020 – through events, awards, infographic guides, papers, studies, or briefings – FPF provided thoughtful leadership on issues ranging from corporate-academic data sharing to encryption. 

In mid-May, the Future of Privacy Forum announced the winners of the first-ever FPF Award for Research Data Stewardship: Professor Mark Steyvers, University of California, Irvine Department of Cognitive Sciences, and Lumos Labs. The first-of-its-kind award recognizes a privacy-protective research collaboration between a company and academic researchers, based on the notion that when privately held data is responsibly shared with academic researchers, it can support significant progress in medicine, public health, education, social science, and other fields. In October, FPF hosted a virtual event honoring the winners, featuring – in addition to the awardees – Daniel L. Goroff, Vice President and Program Director at the Alfred P. Sloan Foundation, which funded the award, as well as FPF CEO Jules Polonetsky and FPF Policy Counsel Dr. Sara Jordan.  

Expanding upon its industry-leading best practices, in July FPF published Consumer Genetic Testing Companies & The Role of Transparency Reports in Revealing Government Requests for User Data, examining how leading consumer genetic testing companies require valid legal processes before disclosing consumer genetic information to the government, and how companies publish transparency reports around such disclosures. 

FPF published an interactive visual guide, Strong Data Encryption Protects Everyone, illustrating how strong encryption protects individuals, enterprises, and the government. The guide also highlights key risks that arise when crypto safeguards are undermined – risks that can expose sensitive health and financial records, undermine the security of critical infrastructure, and enable interception of officials’ confidential communications. 

Over the summer, we published interviews with senior FPF policy experts about their work on important privacy issues. As part of this series of internal interviews, we spoke with FPF Health Policy Counsel Dr. Rachele Hendricks-Sturrup, FPF Director of Technology and Privacy Research Christy Harris, FPF Managing Director for Europe Rob van Eijk, and FPF Policy Counsel Chelsey Colbert.

FPF Policy Fellow Casey Waughn, supported by Anisha Reddy and Juliana Cotto from FPF, and Antwan Perry, Donna Harris-Aikens, and Justin Thompson at the National Education Association, released new recommendations for the use of video conferencing platforms in online learning. The recommendations ask schools and districts to reconsider requiring students to have their cameras turned on during distance learning, because these requirements create unique privacy and equity risks for students, including increased data collection, an implied lack of trust, and the conflation of students’ school and home lives.

Following the 2020 election, FPF hosted several events looking ahead to the policy implications of a new Administration and Congress in 2021, including a roundtable discussion where Jules was joined by Jonathan Baron, Principal of Baron Public Affairs and a leading policy and political risk strategist, as well as FPF’s Global and Europe leads, Dr. Gabriela Zanfir-Fortuna and Rob van Eijk. In addition, Jules; FPF Senior Fellow Peter Swire; VP of Policy John Verdi; and Senior Counsel Stacey Gray held a briefing with members of the media to discuss expectations for what the Biden administration, the FTC, and the states will accomplish on privacy in the coming year. IAPP published an article summarizing the briefing.

Linking Equity & Fairness with Privacy

Alongside a pandemic that forced us to shift our priorities, 2020 saw a needed national reckoning with issues related to diversity and equity. From racial justice and the LGBTQ+ community to child rights, FPF took conscious steps to reflect on, understand, and address essential questions related to equity and fairness in the context of privacy. 

The data protection community has particular challenges as we grapple with the many ways that data can be used unfairly. In response, our team has focused on listening and learning from leaders with diverse life and professional experiences to help shape more careful thinking about data and discrimination. As part of that project, we published remarks on diversity and inclusion from Macy’s Chief Privacy Officer and FPF Advisory Board member Michael McCullough delivered at the WireWheel Spokes 2020 conference. We also discussed Ruha Benjamin’s Race After Technology: Abolitionist Tools for the New Jim Code as part of our ongoing book club series and were honored to be joined by the author for the discussion. 

LGBTQ+ rights are, and have always been, linked with privacy. Over the years, privacy-invasive laws, practices, and norms have been used to oppress LGBTQ+ individuals by criminalizing and stigmatizing individuals on the basis of their sexual behavior, sexuality, and gender expression. In honor of October as LGBTQ+ History Month, FPF and LGBT Tech explored three of the most significant privacy invasions impacting the LGBTQ+ community in modern U.S. history: anti-sodomy laws; the “Lavender Scare” beginning in the 1950s; and privacy invasions during the HIV/AIDS epidemic. These examples and many more were discussed as part of a LinkedIn Live event on International Human Rights Day, featuring LGBT Tech Executive Director Christopher Wood, FPF Founder and Board Chair Christopher Wolf, LGBT Tech Deputy Director and General Counsel Carlos Gutierrez, FPF Policy Counsel Dr. Sara Jordan, and FPF Christopher Wolf Diversity Law Fellow Katelyn Ringrose. 

FPF submitted feedback and comments to the United Nations Children’s Fund (UNICEF) on the Draft Policy Guidance on Artificial Intelligence (AI) for Children, which seeks “to promote children’s rights in government and private sector AI policies and practices, and to raise awareness of how AI systems can uphold or undermine children’s rights.” FPF encouraged UNICEF to adopt an approach that accounts for the diversity of childhood experiences across countries and contexts. Earlier in October, FPF also submitted comments to the United Nations Office of the High Commissioner for Human Rights Special Rapporteur on the right to privacy to inform the Special Rapporteur’s upcoming report on the privacy rights of children. FPF will continue to provide expertise and insight on child and student privacy, AI, and ethics to agencies, governments, and corporations to promote the best interests of children. 

This post is by no means an exhaustive list of our most important work in 2020, but we hope it gives you a sense of the scope of our impact. On behalf of everyone at FPF, best wishes for 2021!  

FPF Health and AI & Ethics Policy Counsels Present a Scientific Position at ICML 2020 and at 2020 CCSQ World Usability Day

On November 12, 2020, FPF Policy Counsels Drs. Rachele Hendricks-Sturrup and Sara Jordan presented privacy-by-design alongside human-centered design concepts during the 2020 CCSQ World Usability Day virtual conference. This presentation followed Drs. Hendricks-Sturrup’s and Jordan’s July 2020 scientific position paper presented at the International Conference on Machine Learning (ICML) 2020, entitled “Patient-Reported Outcomes: A Privacy-Centric and Federated Approach to Machine Learning.”

Drs. Hendricks-Sturrup and Jordan presented their position that patient-reported outcomes (PRO) data requires special privacy and security considerations, due to the nature of the data, when used within on-device or federated machine learning constructs as well as in the development of artificial intelligence platforms. Patient-reported outcomes are a raw form of patient expression and feedback; they help clinicians, researchers, medical device and drug manufacturers, and the governmental stakeholders overseeing medical device and drug development, distribution, and safety to monitor, understand, and document, in a readable format, patients’ symptoms, preferences, complaints, and experiences following a clinical intervention. Gathering and using such data requires careful attention to security, data architecture, data use, and machine-readable consent tools and privacy policies.

Even so, on-device patient-reported outcome measurement tools, like patient surveys within third-party mobile apps that use machine learning, may employ the best machine-readable privacy policies or consent mechanisms, yet still leave key components of privacy protection up to the patient-user. Keeping data in the hands of users opens those users up to unanticipated vectors of attack from adversaries striving to extract valuable machine learning models or seeking to uncover data about a specific patient. 

Drs. Hendricks-Sturrup and Jordan recommended that developers of patient-reported outcome measurement systems leveraging federated learning architectures:

  1. Intentionally design user device security, as well as security for transmission of either data (raw or processed) or model gradients, to the highest level of protections that do not degrade essential performance for critical health and safety monitoring procedures (e.g. remote monitoring for clinical trials, post-market drug safety surveillance, hospital performance scores, etc.);
  2. Ensure that models are not compromised and that valuable machine learning spending is not lost to competitors; 
  3. Design systems to operate atop a federated machine learning architecture, both when model components are sent or gradients received, to ensure privacy of users’ data; and  
  4. Design learning algorithms with algorithmic privacy techniques, such as differential privacy, which is essential to securing valuable and sensitive PRO data (a minimal sketch of this idea follows below). 
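
To make the differential privacy recommendation in item 4 concrete, here is a minimal, hypothetical sketch (not drawn from the ICML paper; the function names and parameters are our own) of how a device might release only a noisy, epsilon-differentially-private aggregate of local PRO survey answers rather than the raw responses:

```python
import math
import random

# Hypothetical illustration: each device holds local patient-reported
# outcome (PRO) survey answers and shares only a differentially private
# aggregate, rather than raw responses, with a central server.

def laplace_noise(scale: float) -> float:
    """Sample Laplace(0, scale) noise via inverse-CDF sampling."""
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))

def private_symptom_rate(responses: list, epsilon: float) -> float:
    """Return an epsilon-DP estimate of the share of positive (1) responses.

    Each response is 0 or 1, so one patient changes the count by at most 1
    (sensitivity = 1); Laplace noise with scale 1/epsilon yields epsilon-DP.
    """
    noisy_count = sum(responses) + laplace_noise(1.0 / epsilon)
    return min(max(noisy_count / max(len(responses), 1), 0.0), 1.0)

if __name__ == "__main__":
    local_answers = [1, 0, 1, 1, 0, 0, 1, 0]  # toy on-device PRO answers
    print(private_symptom_rate(local_answers, epsilon=1.0))
```

In a full federated learning deployment, the same technique would typically be applied to clipped model updates (gradients) rather than survey counts, but the principle is the same: calibrated noise bounds what any single patient's data can reveal.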

Drs. Hendricks-Sturrup’s and Jordan’s paper, poster, and presentation can be found at these links:

ICML paper

ICML poster

CCSQ World Usability Day recording

CCSQ World Usability Day slides

To learn more about FPF’s Health & Genetics and AI & Ethics initiatives, contact Drs. Hendricks-Sturrup and Jordan, respectively, at: [email protected] and [email protected]

Workshop Report: Privacy & Pandemics – Responsible Use of Data During Times of Crisis

In October 2020, the Future of Privacy Forum (FPF) convened a virtual workshop entitled “Privacy and Pandemics: Responsible Uses of Technology and Health Data During Times of Crisis,” bringing together invited computer science, privacy law, public policy, social science, and health information experts from around the world. Participants examined the benefits, risks, and strategies for the collection and use of data in support of public health initiatives in response to COVID-19 and in consideration of future public health crises. With support from the National Science Foundation, Intel Corporation, Duke Sanford School of Public Policy, and Dublin City University’s SFI ADAPT Research Centre, the workshop identified research priorities to improve data governance systems and structures in the context of the COVID-19 pandemic. 

To learn more about FPF’s work related to privacy and pandemics, please visit the Privacy & Pandemics page.

Drawing on the expertise of workshop participant submissions and session discussions, FPF prepared a workshop report which was submitted to the National Science Foundation for use in planning the Convergence Accelerator 2021 Workshops. This NSF program aims to speed the transition of convergence research into practice to address grand challenges of national importance. The final submitted workshop report is also available on our website.

Based on analysis of the expert positions reflected in the workshop, FPF recommends that NSF consider a roadmap for research, practice improvements, and development of privacy-preserving products and services to inform responses to the COVID-19 crisis and to prepare for future pandemics or other public crises; the full roadmap is detailed in the workshop report.

To learn more about the Privacy & Pandemics conference, including information about the topics, participants, sessions, and presentations, and to read the final workshop report, head to the event page.

The European Commission Considers Amending the General Data Protection Regulation to Make Digital Age of Consent Consistent

The European Commission published a Communication on its mandated two-year evaluation of the General Data Protection Regulation (GDPR) on June 24, 2020, in which it discusses, as a future policy development, “the possible harmonisation of the age of children consent in relation to information society services.” Notably, harmonizing the age of consent for children across the European Union is one of only two areas in the GDPR that the Commission is considering amending after further review of practice and case law. Currently, the GDPR allows individual Member States some flexibility in determining the national age of digital consent for children, which can be set anywhere between 13 and 16. However, upon the two-year review, the Commission expressed concerns that the variation in ages across the EU results in a level of uncertainty for information society services (any economic activities taking place online) and may hamper “cross-border business, innovation, in particular as regards new technological developments and cybersecurity solutions.”

“For the effective functioning of the internal market and to avoid unnecessary burden on companies, it is also essential that national legislation does not go beyond the margins set by the GDPR or introduces additional requirements when there is no margin,” stated the Commission in its report. Some believe stringent child privacy requirements can push companies to abandon the development of online services for children to avoid legal risks and technical burdens, creating a void that may be filled by companies from countries with lax child privacy protections. In addition to the GDPR’s varying ages of digital consent, there are also differing interpretations of the obligations on information society services regarding children. For example, the United Kingdom’s proposed Age Appropriate Design Code defines a child as a person under the age of 18 and lays out additional requirements for information society services to build in privacy by design to better protect children online.

Prior to the GDPR, European data protection law did not include special protections for children, instead providing the same privacy protections across all age groups. The GDPR recognized that children are particularly vulnerable to harm and exploitation online and included provisions extending a higher level of protection to children. However, a universal consensus on the age of a child does not exist, and the flexibility provided by the GDPR creates a fragmented landscape of ages requiring parental consent across the EU. Complying with different ages of consent is relatively straightforward in the physical world, where activities are generally limited within national boundaries; but because online services operate across borders, the lack of consistent ages is a significant barrier for companies. Information society service providers are obliged to verify the age of a user and their nationality, and to confirm the age of consent for children in that Member State, before allowing access to their services. This burden may pose a competitive disadvantage for companies operating in the EU or result in measures depriving children and teens of the benefits of using these services, as companies choose either to invest significant resources in age verification and parental consent mechanisms or to abandon the market for children and age-gate their services instead.
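
To illustrate the compliance burden described above, the sketch below is purely illustrative: the country codes, ages, and function names are our own examples, and the ages shown are not an authoritative or current list. It shows the kind of per-Member-State age-of-consent check an information society service would have to perform before relying on a child’s own consent:

```python
from datetime import date
from typing import Optional

# Illustrative only: Member States may set the digital age of consent
# anywhere between 13 and 16 under the GDPR; the values below are
# examples, not an authoritative or current list.
DIGITAL_CONSENT_AGE = {
    "IE": 16,  # Ireland
    "DE": 16,  # Germany
    "FR": 15,  # France
    "ES": 14,  # Spain
    "BE": 13,  # Belgium
}
DEFAULT_CONSENT_AGE = 16  # GDPR default where national law sets no lower age

def needs_parental_consent(birth_date: date, member_state: str,
                           today: Optional[date] = None) -> bool:
    """True if the user is below the digital age of consent for the given
    Member State, so the service must obtain verifiable parental consent."""
    today = today or date.today()
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    return age < DIGITAL_CONSENT_AGE.get(member_state, DEFAULT_CONSENT_AGE)

# A 14-year-old would need parental consent in Ireland but not in Belgium.
print(needs_parental_consent(date(2010, 6, 1), "IE", today=date(2024, 9, 1)))  # True
print(needs_parental_consent(date(2010, 6, 1), "BE", today=date(2024, 9, 1)))  # False
```

Even this toy check presumes the service can reliably establish the user’s age and Member State in the first place, which is precisely the verification burden the Commission’s review highlights.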

The Commission also initiated a pilot project to create an infrastructure for implementing child rights and protection mechanisms online, which is scheduled to commence on January 1, 2021. The project aims to map existing age-verification and parental consent mechanisms both in the EU and abroad and assess the comprehensive mapping results to create “an interoperable infrastructure for child online protection including in particular age-verification and obtaining parental consent of users of video-sharing platforms or other online services.”

Currently, Member States require or recommend varying age verification and parental consent mechanisms. In addition to the UK’s Age Appropriate Design Code, the German youth protection law requires businesses to use scheduling restrictions to ensure that content harmful to children is not available during the day when children are online; to use technical methods to keep children from accessing inappropriate content, such as sending adults a PIN after age verification; or to use age labeling that youth protection software, downloaded by parents on their children’s devices, can read. However, the efficacy of these methods is unclear and unproven. As such, a sweeping review of existing methods may reveal best practices to be widely adopted within the EU and serve as a model for other countries, including the United States.