FPF Releases New Report on GDPR Guidance for US Higher Education Institutions

Today, FPF released The General Data Protection Regulation: Analysis and Guidance for US Higher Education Institutions by Senior Counsel Dr. Gabriela Zanfir-Fortuna. The new report contains analysis and guidance to assist United States-based higher education institutions and their edtech service providers in assessing their compliance with the European Union’s General Data Protection Regulation (GDPR).

The GDPR, which went into effect two years ago this week, on May 25, 2018, grants individuals certain rights to control how their personal data is collected and used, and imposes steep penalties on organizations found to be noncompliant. When the GDPR came into effect, there was limited guidance available, and few decisions, to help US higher education institutions and edtech companies understand their obligations. Now, two years into the regulation’s implementation, there is significant guidance that can be analyzed and applied. The law applies to most U.S.-based higher education institutions and edtech companies, as these typically have some form of interaction with EU residents, whether students, faculty, or alumni, or through study abroad programs and other initiatives.

“With this report, we hope to support higher education institutions and edtech companies in solidifying trust in the way they handle the personal data of students, prospective students, and faculty, by implementing data practices that fully take into account privacy and data protection requirements,” said Dr. Zanfir-Fortuna, the report’s author and Senior Counsel at the Future of Privacy Forum.

Dr. Zanfir-Fortuna has written extensively about GDPR, including in two blog posts from fall 2019, “10 Reasons Why the GDPR is the Opposite of a Notice and Consent Type of Law,” and “Key Findings from the Latest Right to Be Forgotten Cases” as well as an in-depth report, produced by FPF in partnership with Nymity, on the use of “legitimate interests” as a lawful ground for data processing under GDPR. She is also a co-author of the GDPR Commentary published this year by Oxford University Press. Additionally, FPF published a comparison of the key differences between GDPR and the California Consumer Privacy Act (CCPA) to support organizations navigating compliance with both laws.

Amelia Vance, FPF’s Director of Youth & Education Privacy, cautioned that many U.S.-based institutions remain unprepared, despite the high stakes.

“As higher education institutions around the country navigate this unprecedented time, including a rapid transition to online learning and administration, it is critical to remain vigilant about data protection and privacy requirements,” said Vance. “An effective compliance program requires continuous attention and evolution. The consequences of losing sight of that now – potentially millions of dollars under GDPR – are significant, even during these uncertain times.”

The report includes a 10-step checklist with instructions for executing an effective GDPR compliance program. It is designed to assist both organizations with established compliance programs seeking to update or refresh their understanding of their obligations under GDPR and those still building or sustaining a compliance structure and seeking more in-depth guidance.

About FPF

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts and includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. For more information, visit www.fpf.org.

Media Contact

[email protected]

Tech Talk with the Regulators – Understanding Anonymization Under the GDPR

The General Data Protection Regulation (GDPR) has existed for four years and has been in force for two. How can anonymization techniques under the GDPR help Data Protection Officers (DPOs) assess innovation? I hosted a webinar with Truata featuring experts from DPAs in Italy, Ireland, and the UK to learn more about their perspectives.

The recording is available here (link to the webinar).

‘A revision of the 2014 opinion on anonymization techniques is in the working program of the EDPB’

In 2014, the European data protection authorities, assembled in the Article 29 Working Party, provided guidance in their opinion on anonymization techniques. Giuseppe D’Acquisto, Senior Technology Advisor at the Italian Data Protection Authority, said that some adjustments to the 2014 guidance are needed because there are unexplored aspects of anonymization in the GDPR: “A revision of the 2014 opinion is in the working program of the EDPB.”

Ultan O’Carroll, Deputy Commissioner for Technology and Operational Performance at the Data Protection Commission in Ireland, said: “The 2014 opinion is still as valid as it ever was, if not more so.”

O’Carroll: “The principles and rights that we talked about in the 2014 opinion, and the risks that were identified – all of which have materialized in the GDPR – are particular about the importance of singling out, for instance. So, the guidance continues to be relevant and impactful, and outlines considerations for data controllers as they use and attempt to anonymize data.”

‘Unexplored aspects of anonymization in the GDPR’

D’Acquisto gave three examples of where, in his view, the use of Privacy Enhancing Technologies (PETs) could play a role.

  1. On legitimate interest as a legal ground: “Anonymization techniques can become an element in the balancing test when you want to invoke legitimate interest.”
  2. On public interest as a legal ground: “Public interest is an opportunity when used in combination with national law.” He called on national legislators to explore the possibility of including the use of privacy-enhancing safeguards in laws.
  3. On the secondary, (in)compatible use of personal data for further processing: “Rethinking the 2014 opinion is useful to explore new opportunities for data controllers.”

D’Acquisto’s last remark came against the background of Recital 50 of the GDPR, which clarifies Article 6 of the GDPR on the lawfulness of processing. Recital 50 states that the processing of personal data for purposes other than those for which the personal data were initially collected should be allowed only where the processing is compatible with the purposes for which the personal data were initially collected. In order to ascertain whether a purpose of further processing is compatible with the purpose for which the personal data were initially collected, the controller should take into account the existence of appropriate safeguards in both the original and intended further processing operations. D’Acquisto stressed that value could be added to data in the interest of the public when applying anonymization techniques as safeguards for our rights and freedoms.

‘Time to focus on privacy risk management’

Simon McDougall, Executive Director for Technology Policy and Innovation at the Information Commissioner’s Office in the UK, said that it is time to focus on privacy risk management: “There is a tension between risk management and hard science. We can now quantify re-identification risk, for example. The problem is that most people do not understand risk. They struggle with the concept of residual risk and the question of what risk to accept.”
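To make concrete what “quantifying re-identification risk” can look like in practice, the sketch below shows one widely used metric, k-anonymity (the size of the smallest group of records sharing the same quasi-identifiers), alongside a simple uniqueness rate. It is an illustrative example only, not something presented in the webinar, and the column names are hypothetical.

```python
# Illustrative sketch (not from the webinar): one simple way to quantify
# re-identification risk is to measure how many records share the same
# combination of quasi-identifiers (k-anonymity). Column names are hypothetical.
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the smallest group size k over all quasi-identifier combinations."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return min(groups.values())

def uniqueness_rate(records, quasi_identifiers):
    """Share of records that are unique on the quasi-identifiers (higher means riskier)."""
    groups = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    unique = sum(1 for size in groups.values() if size == 1)
    return unique / len(records)

sample = [
    {"zip": "20001", "age_band": "30-39", "sex": "F"},
    {"zip": "20001", "age_band": "30-39", "sex": "F"},
    {"zip": "20002", "age_band": "40-49", "sex": "M"},
]
print(k_anonymity(sample, ["zip", "age_band", "sex"]))      # 1: at least one record is unique
print(uniqueness_rate(sample, ["zip", "age_band", "sex"]))  # 0.33...
```

Metrics like these only describe the dataset in isolation; the residual risk McDougall refers to also depends on what other data an attacker can link to it.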

McDougall stressed that the challenge today is about communication: “We now have a better understanding of how to manage (residual) risks and bring them back to an acceptable level.”

He also explained the benefits of a layered approach to privacy risk management, rather than a focus on a single technology. “Think of it as a Swiss cheese notion of [stacked] risk management measures,” McDougall said. “A layered approach to control prevents risk from passing through various risk mitigation measures because the holes do not line up.”

While it is important to keep up with the cutting edge of anonymization technologies in order to understand the full scope of possibilities, McDougall implored the audience to “look at the basics, not just the cutting-edge material.”

‘Legal and technical competences are complementary to each other’

From the discussion, it became clear that Data Protection Officers (DPOs) have an important role to play in moving the use of anonymization techniques forward. The broader questions around innovation, sharing of data, and repurposing of data have become particularly important in the context of COVID-19. Accordingly, each of the experts offered advice for DPOs in light of the developments in anonymization technologies.

D’Acquisto suggested that DPOs should not rely on either legal or technical competence alone. “Both competencies are important in order to tackle the complex aspects of anonymization techniques,” he said. “If we look at one, we miss the opportunities of the other. Each is complementary to the other. A holistic approach is needed with legal safeguards, technical safeguards, and a path toward compliance.”

‘DPOs: do not go alone; get help’

O’Carroll added that it is essential that DPOs not act in isolation. “DPOs need to get access to scientists and to organizational people, but also to expert advice in terms of social science, cognitive science, interface design, or mathematics, for example. Do not go alone; get help,” he said. “And be sure that what you’re presenting is robust. Take your time to do that. It’s not worth carrying forward without that because you’ll be asked questions that you may not think about.”

In closing, McDougall remarked that “DPOs should think about themselves as intelligent customers. Anonymization technologies are very complex. A DPO should be able to have conversations with experts. Instead of thinking this is all incredibly complicated, they should try to understand what the risks are for the individual and the organization. But it is possible to follow, to understand the principles, to understand that the levels of risk and the technology itself are changing all the time. It is possible to keep up with it so you can then have the conversation with the right expert.”

 

To learn more about FPF in Europe, please visit fpf.org/eu.

 

Apple & Google Update Terms for COVID-19 Apps

While the legislative process for COVID-related data advances, the business and platform standards for apps are falling into place.

Rules for All COVID Apps

In response to the coronavirus pandemic, developers are rushing to create applications to support healthcare systems, spread awareness in the community, guide epidemiological research, and mitigate the spread of the virus. Apple and Google have reemphasized their existing policies on app development, data protection, and appropriate content, and have bolstered those requirements with specifications directed at COVID-19 apps.

The new Bluetooth Exposure APIs created by Apple and Google have received a significant amount of attention. Because of the expanded access to Bluetooth scanning these APIs enable, both companies have limited use of the APIs to apps provided by health departments. Apple and Google require that the apps be voluntary, not collect location information, and gather only very limited additional user information. For more information about the Apple/Google Exposure Notification API, see the details provided by Apple and Google.

However, new App Store and Google Play terms updated by the companies also set new rules for ALL COVID-related apps. 

Apple’s COVID-19 App Requirements

In a March 14 post, Apple explained that it would evaluate COVID-19 apps critically, ensuring that data sources are reputable. 

A COVID-19-specific app will only be considered if developers represent recognized entities such as “government organizations, health-focused NGOs, companies with clear credentials in health issues, and medical or educational institutions. Any entertainment or game apps with COVID-19 as their theme will be prohibited.” 

In addition to these new requirements, Apple’s updated policy also describes relevant limitations on certain types of apps. 

  1. According to a new section 5.1.1.ix., apps in “highly-regulated” fields, such as healthcare, financial services, and air travel, need to be submitted by a “legal entity that provides the services, and not by an individual developer.” 
  2. Safety consistently remains a top priority for Apple’s restrictions. Section 1.4.1 prohibits medical apps from providing inaccurate data or information. Apple’s team will utilize greater scrutiny when evaluating apps that could be used for diagnosing or treating patients. Developers must be reachable by users to respond effectively to questions and support issues.
  3. Privacy policies in the App Store call for transparency about all data collection, retention, and sharing. If the app collects health-related data, this personal data may not be used for advertising, nor can the app store this information in iCloud. However, this information may be used to improve health management or support relevant research if the user (or guardian of a minor) consents. Consent requirements are as follows:
    1. nature, purpose, and duration of the research
    2. procedures, risks, and benefits to the participant
    3. information about confidentiality and handling of data (including any sharing with third parties)
    4. a point of contact for participant questions
    5. the withdrawal process
  4. Apps conducting health research must obtain permission from an independent ethics review board. 

As with any application available in the App Store, new COVID-19-related apps remain subject to many preexisting restrictions. 

Google’s COVID-19 App Requirements

Google also posted updated guidelines for COVID apps, which it defined as follows:

Apps that are subject to these requirements include, but may not be limited to:

  1. Apps that use, approximate or leverage coronavirus, COVID-19, pandemic, or related keywords in their Google Play Store listing metadata. 
  2. Apps that provide medical, treatment, vaccine, testing, or other related information specifically for COVID-19.
  3. Apps that support COVID-19-related response, containment, research, or education/training efforts.
  4. Apps that support services used to respond specifically to COVID-19, for example, apps that provide social support (food stamps, payment), healthcare, loans, etc., specifically in response to COVID-19.

The new Google Play rules state that only the following categories of apps are eligible to use COVID-19 or other related keywords and marketing in their Google Play Store app listing:

  1. Official governmental apps, which connect users to authoritative information and services. 
  2. Apps published by, or in direct affiliation with:
    1. a healthcare system or provider (e.g. CVS Health, UK National Health Service, UnitedHealth Group, Kaiser Permanente, French national healthcare system, Netcare (South Africa), One Medical, etc.); 
    2. a nationally recognized medical or epidemiological research organization deeply rooted in medical research (including nationally recognized medical schools). The medical or epidemiological research organization (or government research body) should have approval from a registered governing body (for example, an Institutional Review Board in the US or the National Health Service (NHS) in the UK); in case of dispute, endorsement by a local or national government or a verifiable healthcare non-governmental organization (NGO) will be required; or apps directly endorsed by an official. The app must be directly published by, or in direct partnership with, one of these entities (e.g., the authorizing institution or organization is referenced, with full permission, in the app’s title, logo, or Google Play Store description). Endorsement by a non-government entity alone does not meet the qualification (e.g., an app endorsed by staff at a medical school would not qualify if that app is not published by or in direct partnership with the medical school).

Apps specifically created in response to COVID-19 may not access personal and sensitive data that is not required to directly support pandemic response efforts and epidemiological research. The privacy policy must outline this data use. COVID-19 apps that handle personal user data must also strictly comply with other existing Google Play Policy requirements.

Google also reiterated its restrictions on content regarding sensitive events, misrepresentation, and deceptive behavior. COVID-19 apps cannot contain unverifiable information that counteracts efforts at community education and relief.

New Infographic Illustrates Key Aspects of Location Data

Today, the Future of Privacy Forum (FPF) published an infographic, “The World of Geolocation Data,” that outlines how location data is generated from mobile devices, who has access to it, and factors to consider in evaluating privacy risks. Data from our mobile devices, including smartphones and fitness trackers, can serve as a proxy for where we are located over time, revealing intimate information about individuals and groups.

“During the COVID-19 pandemic, many are interested in employing both location data and proximity signals from smartphones to track the spread of the virus and measure adherence to social distancing guidelines,” said Stacey Gray, FPF Senior Counsel. “We’re helping policymakers and public health officials understand location data so they can make proactive, knowledgeable choices about the use of this sensitive information.”

The infographic shows how mobile devices interpret signals from Wi-Fi and Bluetooth networks, cell towers, and GPS satellites to pinpoint their location, as well as how that data is analyzed by the mobile operating system to provide a precise measurement to mobile apps upon request. The graphic describes the different entities that are able to access, use, or share various types of location data, including cell phone carriers, mobile apps and app partners, and downstream recipients. Finally, the graphic describes the factors that make location data more or less risky, including persistence and frequency, precision, accuracy, known or sensitive locations, and the use of de-identifying technologies.

Stacey Gray, Senior Counsel at FPF and the author of the infographic, will host a webinar to help policymakers better understand the complicated ecosystem for device location data on Tuesday, June 2nd at 12 PM EDT. The webinar will include an expanded discussion of the infographic, will answer questions about evaluating and mitigating risks in real-world location datasets, and will feature technical and legal experts, including Shane Wiley, CPO of Cuebiq; Kara Selke, VP of Commercial Development and Privacy at Streetlight Data; as well as Chelsey Colbert, Policy Counsel at FPF and Dr. Rob van Eijk, FPF’s Managing Director for Europe. To register for the event, click here.

Other recently-published resources from FPF related to privacy and the coronavirus pandemic include: 

The full list of FPF’s privacy and pandemics resources can be accessed on the FPF website at fpf.org/privacy-and-pandemics.

About FPF

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting fpf.org. 

Understanding the "World of Geolocation Data"

How is location data generated from mobile devices, who gets access to it, and how? As debates continue over companies and public health authorities using device data to address the current global pandemic, it is more important than ever for policymakers and regulators to understand the practical basics of how mobile operating systems work, how apps request access to information, and how location datasets can be more or less risky or revealing for individuals and groups. Today, the Future of Privacy Forum released a new infographic, “The World of Geolocation Data,” that explores these issues.

In this infographic, we demonstrate how mobile devices, such as smartphones, interpret signals from their surroundings – including GPS satellites, cell towers, Wi-Fi networks, and Bluetooth – to generate a precise location measurement (latitude and longitude). This measurement is provided by the mobile operating system to mobile apps through a Location Services API when they request it and receive the user’s permission. As a result, apps must comply with the technical and policy controls set by the mobile operating systems, such as App Store Policies.
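As a purely hypothetical sketch of that flow (the class and method names below are invented for illustration; they are not a real mobile operating system API), the permission-gated request described above looks roughly like this:

```python
# Hypothetical pseudo-API illustrating the flow described above. The names
# "LocationServices", "request_permission", and "current_fix" are invented
# for illustration and do not correspond to a real platform API.
from dataclasses import dataclass

@dataclass
class Fix:
    latitude: float
    longitude: float

class LocationServices:
    """Stand-in for the OS location service that fuses GPS, Wi-Fi, cell, and Bluetooth signals."""
    def request_permission(self, app_name: str) -> bool:
        # In a real system the OS would prompt the user; here we simply grant it.
        return True

    def current_fix(self) -> Fix:
        return Fix(latitude=38.907192, longitude=-77.036873)

def get_location(app_name: str, services: LocationServices):
    # Apps receive a fix only after the OS-mediated permission check succeeds.
    if not services.request_permission(app_name):
        return None
    fix = services.current_fix()
    return fix.latitude, fix.longitude

print(get_location("example-app", LocationServices()))
```

The point of the sketch is simply that the operating system, not the app, computes the fix and decides whether to hand it over, which is why platform-level controls matter so much.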

Many different entities (including, but not limited to, mobile apps) provide location features or use location data for a variety of other purposes. Different entities are subject to different restrictions, such as public commitments, privacy policies, contracts and licensing agreements, user controls, app store policies, and sector-specific laws (such as telecommunications laws for mobile carriers). In addition, broadly applicable privacy and consumer protection laws, such as the California Consumer Privacy Act or the Federal Trade Commission Act (FTC Act), will generally apply to all commercial entities.

Finally, in addition to legal and policy controls, location datasets can be technically modified to further mitigate risks to individuals and groups. Some of those practical mitigation steps might include:
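One such step, for example, is coarsening coordinate precision before a dataset is shared. The following minimal sketch is purely illustrative and is not drawn from the infographic:

```python
# Illustrative only: coarsening latitude/longitude precision before a dataset
# is shared reduces (but does not eliminate) re-identification risk.
# Rounding to 2 decimal places corresponds to roughly 1 km at the equator.
def coarsen_point(lat, lon, decimals=2):
    """Round a coordinate pair onto a coarser grid."""
    return round(lat, decimals), round(lon, decimals)

print(coarsen_point(38.907192, -77.036873))  # (38.91, -77.04), near Washington, DC
```

Coarsening is only one layer; persistence, frequency, and linkability of the records still have to be addressed separately.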

Future of Privacy Forum Partners with Dublin City University

Today, the Future of Privacy Forum (FPF) and Dublin City University (DCU) announced a new partnership that will see them host joint conferences and workshops, collaborate on research projects, develop resources for policymakers, and pursue applications for research opportunities together over the next three years.

“Partnering with DCU will allow us to collaborate with some of the world’s leading experts on AI and other innovative technologies to ensure data protection, privacy and ethics remain a priority for research and new products,” said Jules Polonetsky, CEO of the Future of Privacy Forum. “FPF is expanding its presence in Ireland because individuals in the US and EU share common values about both privacy and data protection challenges as well as the opportunities data enables to make our lives better.”

DCU is home to some of the leading AI-focused research and scholarship programs in Ireland. DCU is a lead university for the Science Foundation Ireland ADAPT program and hosts the consortium leadership for the INSIGHT research centre, two of the largest government-funded AI and tech-focused development programs.

“Our partnership with the Future of Privacy Forum will be a valuable asset as DCU helps craft the strategy to keep Ireland a global leader in developing artificial intelligence and other technologies,” said Professor Lisa Looney, Executive Dean of the Faculty of Engineering and Computing at DCU. “Leaders in government and in industry respect FPF for its expertise on the best approaches to balance individual privacy and the benefits of new technology applications.”

FPF will be partnering with DCU on a proposal for an SFI Industry-Academia project on data governance with tech platforms and SFI research centers across Ireland. FPF and the Faculty of Engineering and Computing also plan to engage in joint research via EU funding, student projects, and national funding such as the SFI ADAPT and INSIGHT research centers. The Faculty of Engineering and Computing launched a campus-wide Ethics and Privacy Week event this year and will work with FPF to make it an annual event and extend its reach to undergraduates across all disciplines as well as the DCU research community.

FPF has built strong partnerships across Europe through its convenings and trainings for policymakers and regulators. To learn more about FPF’s EU work, head to fpf.org/eu.

CONTACT

Nat Wood

[email protected]

(410) 507-7898

About FPF

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting fpf.org. 

About Dublin City University

DCU is Ireland’s fastest growing university. It has seen its student population increase by 50% in the past five years, to over 17,500 students. It has forged a reputation as Ireland’s university of enterprise, through its strong, active links with academic, research, and industry partners both at home and overseas.

DCU has five faculties incorporating 23 schools spread across three academic campuses, located in Glasnevin and Drumcondra, on the Northside of Dublin City. 

DCU develops highly sought-after, well-rounded graduates who are ready for the workforce and eager to apply their knowledge and skills in a broad range of settings. For more information, please visit www.dcu.ie.

FPF CEO: Will I Install an Exposure Notification App? Thoughts on the Apple-Google API

As a privacy expert, if my local health department develops a mobile app for people with a COVID diagnosis to alert anyone they were near, will I use it?

Yes, I will. And I will urge friends, neighbors, and colleagues to download such an app. I have an immuno-compromised family member in my household. I am also lucky to live one block from my senior citizen in-laws. If a health department app can inform me that I am possibly at risk, I can take measures to keep them safe from me. I want that app to be built with privacy protections in place, collecting only the data needed and deleting it as soon as possible. Today, Apple and Google launched new capabilities for health department apps, with strict technical privacy restrictions that aim to give these apps the ability to scan for nearby devices and to delete data within 30 days.

In my home state of Maryland, Governor Hogan is seeking to quadruple current staffing to 1,000 state employees and outside contractors supporting manual contact tracing, but hiring and training will take time. Contact tracing relies on interviewing people about who they may have come into contact with recently and then painstakingly finding the contact information needed to reach every one of those potentially exposed individuals. It also relies on people accurately remembering all of their interactions. Can you remember the people you stood next to in the long line at the grocery store last week?

Should my health department offer an app to supplement this process? I hope they will look closely at the way apps have been used by health departments for exposure notification around the world and decide whether it would be a useful supplement to the human contact tracing effort they are setting up.

In an ideal world, we would have a national response that deployed hundreds of thousands of human contact tracers, so that use of an app would be a very minor supplemental option. Exposure notification apps would be tested for efficacy in a carefully controlled study. The CDC would be working with the WHO to advise based on the results of studies of the app efforts in Singapore, Israel, Hong Kong, South Korea, and elsewhere. We might learn whether they are helpful and what data they need. Do health department apps need precise location, despite the risks of revealing the private activities of individuals? Can the apps rely solely on information from Bluetooth about proximity to nearby phones to be effective? Are the apps effective if they are voluntary and work in a decentralized manner? What is the risk of abuse of data collected in countries without strong data protection legislation or countries with dangerous human rights records? But we do not live in a perfect world, and timely preventive measures can save lives today.

I realize that the data may be imprecise, untested, imperfect. I will look to my reasonably competent health department for guidance. I realize I am privileged in this regard. If I get an alert, I can work from home and be paid. I can err on the side of safety out of caution. Many cannot. I realize that not everyone has a smartphone, so this is not a service that everyone can benefit from, but it is one of the most widely adopted technologies in the world. I hope we can find ways to ensure everyone can have access and that we can address economic and racial disparities.

I vote, donate, and actively campaign for candidates who I hope will work to make society more just. I have served in government at the city, state, and federal levels, and I have been both elected and appointed to office. But in an imperfect world and during an emergency, we all need to make the most ethical decision with the facts at hand. Relying on such apps is, in my view, a potentially helpful supplemental safety measure that fills a gap created by the current challenges.

Let’s turn to what Apple and Google should be doing to support local health departments. First, let’s note that Apple and Google didn’t invent the idea of using a phone for exposure notification or contact tracing during this pandemic. Health departments in countries that moved quickly to respond to the outbreak commissioned apps that used mobile phone location services, and sometimes Bluetooth capabilities, and promoted them to their local populations. But it turns out that, due to privacy settings and power limitations, mobile phones aren’t the most effective tool for the highly precise information collection needed for tracing. These privacy protections have been baked deep into device operating systems, thanks to years of work to prevent misuse by human rights-abusing governments, stalkers, and criminals, and by advertisers and marketers.

Another problem the Google-Apple API will solve is that existing exposure notification apps are often not interoperable with each other. If a person downloads an app from one public health authority but then comes into contact with a user of an app from another jurisdiction, the apps often will not recognize one another. All apps using the Apple-Google API, however, will recognize one another. This kind of scalability is essential to enable effective notifications and, in turn, to begin enabling society to cautiously reopen.

These are the limitations public health authorities face in developing apps. The apps launched to date have usually relied on asking users to opt in to sharing their location, and precise location data can reveal intimate information: where you’re going, where you’ve been, your character, interests, habits, religion, and political inclinations.

So health departments began looking to Google and Apple for better access to the limited Bluetooth APIs currently available. Remarkably, for two competitors who rarely cooperate, Apple and Google partnered on providing a new API that allows background sending and receiving of rotating Bluetooth identifiers. This gives apps access to new information that they couldn’t get before, but with limits on how it can be accessed or used. Only health departments will be approved to use this new API, to limit the sending of fake signals. Health departments are not sent information about individual users, as the app and device handle the communications locally.
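To illustrate the rotating-identifier idea in general terms, the sketch below is a greatly simplified, hypothetical example; it does not reproduce the actual Apple-Google cryptographic specification. A device derives short-lived identifiers from a per-day secret and rotates what it broadcasts, so that matching can happen on the device rather than in a central database.

```python
# Hypothetical, simplified illustration of rotating Bluetooth identifiers.
# This is NOT the actual Apple-Google Exposure Notification cryptography.
import hmac, hashlib, os

def daily_key():
    """A random per-day secret that stays on the device."""
    return os.urandom(16)

def rolling_identifier(day_key, interval_number):
    """Derive a short identifier for one ~15-minute broadcast interval."""
    msg = interval_number.to_bytes(4, "big")
    return hmac.new(day_key, msg, hashlib.sha256).digest()[:16]

# A device broadcasts a different identifier each interval; nearby devices
# store what they hear. If a user later tests positive and consents, the
# day keys are published and other devices re-derive identifiers locally
# to check for matches, so no central list of who met whom is created.
key = daily_key()
heard = {rolling_identifier(key, 37)}                     # what a nearby phone recorded
matches = any(rolling_identifier(key, n) in heard for n in range(96))
print(matches)  # True
```

The privacy property this design aims for is that the broadcast identifiers are meaningless on their own and only become linkable when a diagnosed user chooses to publish their keys.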

Apple and Google did not create an app. It’s an API, a technical method for apps to get information off the device. Public health authorities will create the apps that use this information, and they will be responsible for how it is communicated, how users receive alerts, and what those alerts say. Public health authorities will have options to determine who should be alerted, based on the Bluetooth signal strength and duration of proximity that trigger an alert.
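As a rough, hypothetical sketch of what such a configuration could look like (the parameter names and thresholds below are made up for illustration and are not the API’s actual settings), an authority might alert only contacts whose encounters were both close enough and long enough:

```python
# Hypothetical illustration: a health authority might alert only users whose
# recorded encounters suggest sustained close contact. The thresholds below
# are invented for illustration and are not the API's actual parameters.
def should_alert(encounters, max_attenuation_db=55, min_minutes=15):
    """Alert if any single encounter was both close (low attenuation) and long."""
    return any(
        e["attenuation_db"] <= max_attenuation_db and e["minutes"] >= min_minutes
        for e in encounters
    )

encounters = [
    {"attenuation_db": 70, "minutes": 30},  # far away for a long time: no alert
    {"attenuation_db": 50, "minutes": 20},  # close and sustained: alert
]
print(should_alert(encounters))  # True
```

Where exactly those thresholds sit is a public health judgment, which is why the companies leave it to the authorities building the apps.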

Now here is where it gets complicated. Some health departments want to use the new API and also collect location data, creating a risk that users can be identified. Some health departments want to create centralized databases to help them track and analyze the data collected. These health departments want Google and Apple to change their APIs and terms of use for the apps to allow collection of more personal data from users. But any changes made to the API or terms will affect users in every country in the world, creating risks that governments could misuse the API for law enforcement or for human rights abuses. Some privacy advocates think that even the current Bluetooth-only apps can create a security risk. Some think that local democratic governments should set the privacy rules, not tech companies. Most average users will have a difficult time understanding the important differences between location and proximity. There is some truth to every one of these points, and no option is free of downsides.

But if you are like me, and you want to protect those around you by being able to get and share these alerts with minimal risk to privacy, health department apps that use the new API should be able to provide an additional tool in the effort to reopen society as we fight the pandemic.

For more privacy and data protection resources related to COVID-19, click here.

FPF Honors UC-Irvine/Lumos Labs Partnership with First-Ever Award for Research Data Stewardship

Click here to view the Call for Nominations for the 2021 FPF Award for Research Data Stewardship.

Click here to watch a recording of the 2020 FPF Award for Research Data Stewardship virtual awards event.

University of California Irvine (UCI) Professor of Cognitive Science Mark Steyvers and Lumos Labs – the parent company behind Lumosity, a popular online brain-training game website – are the winners of the first-ever Award for Research Data Stewardship from the Future of Privacy Forum (FPF). The award-winning collaboration between Professor Steyvers and Lumos Labs employed privacy techniques to transform data on user play into innovative cognitive science research. The annual FPF Award for Research Data Stewardship is supported by the Alfred P. Sloan Foundation, a not-for-profit grantmaking institution that supports high-quality, impartial scientific research and institutions. 

Lumosity supplied de-identified data on users’ response time and accuracy from one Lumosity game to researchers interested in identifying how people flexibly and efficiently adapt their behavior in response to changing contexts, otherwise known as task switching. To ensure that the data sharing project minimized potential privacy risks, both parties took a number of steps, including: 

“Independent research on consumer data collected by private companies holds the keys to addressing many of the challenges facing our society today, but it must be done in a way that protects individual privacy,” said Jules Polonetsky, CEO of the Future of Privacy Forum. “The COVID-19 pandemic has highlighted the urgency of promoting privacy-protective means of conducting research. That’s exactly what we’re doing by honoring Professor Steyvers and Lumos Labs as the winners of the Award for Research Data Stewardship.”

Nominees for the Award for Research Data Stewardship were judged on their adherence to privacy protection in the data sharing process, the quality of the data handling process, and the company’s commitment to supporting academic research. Nominations were reviewed by a jury of academic and industry thought leaders, including representatives from FPF, leading foundations, academia, and industry. Establishing data protections for corporate-academic data sharing is increasingly important as governments, healthcare institutions, and researchers aim to obtain and deploy consumer data to track the spread of the coronavirus, deliver emergency supplies, target travel restrictions and quarantines, and develop vaccines and cures. 

The partnership between Lumos Labs and Professor Steyvers was created through the Human Cognition Project (HCP), an online platform built to facilitate large-scale, collaborative research studies led by independent academic and clinical researchers. Over the last decade, the HCP has supported over 100 collaborators from universities and organizations, resulting in more than 40 peer-reviewed publications. 

“The Human Cognition Project as a whole, and the collaboration with Professor Steyvers in particular, demonstrates our commitment to sharing our data with academic researchers in a manner that respects individual privacy,” said Bob Schafer, General Manager of Lumos Labs. “Protecting the individual privacy of our users while using data and research to make the world a better place is at the heart of what we do at Lumos Labs.”

“The research collaboration with Lumos Labs enabled me to access the right data, without fear of compromising individual privacy,” said Mark Steyvers, Professor at University of California Irvine. “Through the Human Cognition Project, I was able to access large-scale data sets that enabled more extensive and precise investigations of human learning than is typically achievable conducting tests in a laboratory.” 

The partnership resulted in the publication of research in a leading journal that advances the research field’s understanding of an important cognitive function – task switching – and the impact of practice. The partnership has also provided resources and tools to the larger research community to promote transparency and reproducibility of results and has democratized this type of “big data” approach to the cognitive sciences. 

In addition to the award winners, FPF announced several nominated projects that earned honorable mentions, including: 

Learn more about the project, including best practices for future data sharing collaborations, on the FPF website.

CONTACT

Nat Wood

[email protected]

(410) 507-7898

Newly Released COVID-19 Privacy Bills Would Regulate Pandemic-Related Data

By Pollyanna Sanderson (Policy Counsel), Stacey Gray (Senior Policy Counsel) & Katelyn Ringrose (Christopher Wolf Diversity Law Fellow)

Yesterday afternoon, leading House and Senate Democrats introduced the Public Health Emergency Privacy Act. The Democratic-led bill, which was introduced by Senators Blumenthal and Warner, as well as Representatives Eshoo, Schakowsky and DelBene, follows the May 10th introduction of a similar COVID-19 data protection bill by leading Senate Republicans. Although the bills are similarly broad in scope and substantively robust, they contain a few important differences. 

Both the Democratic-led and the Republican-led COVID-19 privacy bills introduced so far are motivated by an urgent need to build public trust in the use of personal data to address the current pandemic. For example, recent research shows a marked lack of trust among the American population when it comes to their digital privacy amid the COVID-19 pandemic.

Below, we summarize the Public Health Emergency Privacy Act’s (1) scope of covered data and entities; (2) legal requirements; and (3) a few key differences from its Republican counterpart. 

BROAD SCOPE OF COVERED DATA

The Democratic-led Public Health Emergency Privacy Act would create new substantive obligations for a broad range of covered entities processing data to address COVID-19 – both public and private, including non-profits and employers with respect to data collected about their employees. 

The Act would apply to:

LEGAL REQUIREMENTS

The Act contains a variety of blanket prohibitions (such as a prohibition on using COVID-19 data for commercial purposes), as well as a few affirmative obligations (such as reporting) on companies, non-profits, and other covered entities.

Covered entities would be prohibited from:

Covered entities would be required to: 

The Act includes a broad research exemption for public health or scientific research associated with COVID-19 when such research is carried out by a public health authority, nonprofit organization, or institution of higher education. Furthermore, the Act would not prohibit research, development, manufacturing, or distribution of COVID-19-related drugs or vaccines.

The Act does not preempt state laws, and includes a private right of action with tiered remedies according to whether the violation is negligent ($100-$1,000) or reckless, willful, or intentional ($500-$5,000).

COMPARISON TO SENATE REPUBLICANS’ COVID-19 PRIVACY BILL

Last week, Senator Roger Wicker, the Republican Chairman of the Senate Commerce Committee, introduced a similarly broad privacy bill with leading Senate Republicans, the COVID-19 Consumer Data Protection Act of 2020.

The two bills contain many similarities, including a requirement that covered entities obtain “affirmative express consent” to collect or process COVID-19 data, a requirement for recurring deletion, and a data minimization requirement that data should not be collected beyond what is necessary and proportionate to public health needs. 

We observe a few key differences between the Republican-led bill and this week’s Democratic-led bill:

As noted, there are some significant differences between these two proposals. We expect additional bills to emerge as more legislators put forward ideas to address COVID data issues, including some that may be more narrowly tailored to specific use cases. And, as the HR Policy Association recently pointed out, hundreds of existing local labor and employment laws and regulations are already applicable to COVID-related activities.

In an op-ed this week calling for legislation, Commissioner Christine Wilson quoted the words of Samuel Johnson: “When a man knows he is to be hanged in a fortnight, it concentrates his mind wonderfully.” We hope the pressure to pass legislation during this crisis can bridge the political divides in Congress, but we also hope legislators appreciate the ongoing urgency of broad comprehensive data protection legislation.

FPF Charts DPAs’ Priorities and Focus Areas for the Next Decade

The Future of Privacy Forum (FPF) today released a white paper, New Decade, New Priorities: A summary of twelve European Data Protection Authorities’ strategic and operational plans for 2020 and beyond, which identifies the priorities and focus areas considered top concerns among European Data Protection Authorities (DPAs) for the 2020s and beyond.

DPAs across the European Union (EU) are in a unique position to shape the future of digital services and how they impact individuals and societies both through their outstanding enforcement powers and through their policymaking. To address the complexities of digital services and individual rights in the new decade and beyond, several DPAs have published strategic and operational plans, and have set new data protection policy goals to meet these challenges head-on. 

Co-authors Charlotte Kress, Rob van Eijk, and Gabriela Zanfir-Fortuna of FPF reviewed twelve publicly available strategic plans, roadmaps, and outlines to identify the top priorities and focus areas of DPAs during the coming decade and beyond. The authors also reviewed recently-released DPA guidance regarding COVID-19.

Their findings indicate that both the local DPAs and the EDPB are concentrating on guidelines for the consistent application of the GDPR, which aligns with ongoing harmonization efforts across the EU and the European Economic Area (EEA), aiming to:

  1. clarify how (relatively) recent technologies and business practices should operate under the GDPR;
  2. prepare for the implications and proliferation of newer technologies, such as artificial intelligence and automated decision-making; and
  3. protect those most vulnerable to the risks of data use practices such as data profiling.

National DPAs identified key topic areas as focus points for enforcement actions arising from DPAs’ “own motion,” such as advertising & marketing, health, and banking & finance. In addition, DPAs’ strategies most commonly enumerated policy-related topics such as artificial intelligence and children & youth privacy.

The summary of findings is a vital resource for understanding how European data protection and privacy law, enforcement, and policy will take shape in the years to come. The inclusion of COVID-related strategies and priorities provides a holistic view of what has become the new, unexpected focus area of DPAs across the continent.   

Read the Full Report Here