South Korea: The First Case Where the Personal Information Protection Act was Applied to an AI System

As AI regulation is being considered in the European Union, privacy commissioners and data protection authorities around the world are starting to apply existing comprehensive data protection laws to AI systems and the ways they process personal information. On April 28th, the South Korean Personal Information Protection Commission (PIPC) imposed sanctions and a fine of KRW 103.3 million (USD 92,900) on ScatterLab, Inc., developer of the chatbot “Iruda,” for eight violations of the Personal Information Protection Act (PIPA). This is the first time the PIPC has sanctioned an AI technology company for indiscriminate personal information processing.

“Iruda” caused considerable controversy in South Korea in early January after complaints that the chatbot used vulgar and discriminatory language, including racist, homophobic, and ableist remarks, in conversations with users. The chatbot, which assumed the persona of a 20-year-old college student named “Iruda” (Lee Luda), attracted more than 750,000 users on Facebook Messenger less than a month after release. The media reports prompted the PIPC to launch an official investigation on January 12th, soliciting input from industry, legal, academic, and civil society groups on personal information processing and on legal and technical perspectives on AI development and services.

PIPC’s investigation found that ScatterLab used messages from KakaoTalk, a popular South Korean messaging app, collected by its apps “Text At” and “Science of Love” between February 2020 and January 2021 to develop and operate its AI chatbot “Iruda.” Around 9.4 billion KakaoTalk messages from 600,000 users were employed in training algorithms to develop the “Iruda” AI model, without any effort by ScatterLab to delete or encrypt users’ personal information, including their names, mobile phone numbers, and addresses. Additionally, 100 million KakaoTalk messages from women in their twenties were added to the response database, with “Iruda” programmed to select and respond with one of these messages.
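To make concrete the kind of deletion or masking the PIPC found lacking, here is a minimal, purely illustrative sketch of redacting direct identifiers from chat messages before they are used as training data. The regex pattern, the placeholder tokens, and the name list are assumptions for this example only; they are not drawn from ScatterLab’s systems, and robust de-identification of real Korean-language chat logs would require far more thorough, audited techniques.

```python
import re

# Hypothetical pattern for Korean mobile numbers such as 010-1234-5678.
PHONE_RE = re.compile(r"01[016789]-?\d{3,4}-?\d{4}")

# Hypothetical set of user names collected at sign-up.
KNOWN_NAMES = {"지수", "민준"}

def redact(message: str) -> str:
    """Replace phone numbers and known names with placeholder tokens."""
    message = PHONE_RE.sub("[PHONE]", message)
    for name in KNOWN_NAMES:
        message = message.replace(name, "[NAME]")
    return message

print(redact("민준아 내 번호 010-1234-5678로 연락해"))
# -> "[NAME]아 내 번호 [PHONE]로 연락해"
```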

With regard to ScatterLab employing users’ KakaoTalk messages to develop and operate “Iruda,” the PIPC found that including a “New Service Development” clause in the terms to log into the apps “Text At” and “Science of Love” did not amount to users’ “explicit consent.” The description of “New Service Development” was determined to be insufficient for users to anticipate that their KakaoTalk messages would be used to develop and operate “Iruda.” Therefore, the PIPC determined that ScatterLab processed users’ personal information beyond the purpose of collection.

In addition, ScatterLab posted its AI models on the code sharing and collaboration platform GitHub from October 2019 to January 2021, which included 1,431 KakaoTalk messages revealing 22 names (excluding last names), 34 locations (excluding districts and neighborhoods), gender, and relationships (friends or romantic partners) of users. This was found to be in violation of PIPA Article 28-2(2), which states, “A personal information controller shall not include information that may be used to identify a certain individual when providing pseudonymized information to a third party.”

ScatterLab also faced accusations of collecting personal information of over 200,000 children under the age of 14 without parental consent in the development and operation of its app services, “Text At,” “Science of Love,” and “Iruda,” as its services did not require age verification prior to subscribing.

PIPC Chairman Jong-in Yoon highlighted the complexity of the case at hand and the reasons why extensive public consultation took place as part of the proceedings: “Even the experts did not agree so there was more intense debate than ever before and the ‘Iruda’ case was decided after very careful review.” He explained, “This case is meaningful in that it has made clear that companies are prohibited from indiscriminately using personal information collected for specific services without clearly informing and obtaining explicit consent from data subjects.” Chairman Yoon added, “We hope that the results of this case will guide AI technology companies in setting the right direction for the processing of personal information and provide an opportunity for companies to strengthen their self-management and supervision.”

PIPC plans to be active in supporting compliant AI systems

PIPC also stated that it seeks to help AI technology companies improve their privacy capabilities by presenting AI developers and operators with a “Self-Checklist for Personal Information Protection of AI Services” and by supporting on-site consulting. PIPC plans to actively support AI technology companies in developing AI and data-based industries while protecting people’s personal information.

ScatterLab responded to the decision, “We feel a heavy sense of social responsibility as an AI tech company regarding the necessity to engage in proper personal information processing in the course of developing related technologies and services,” and stated that, “Upon the PIPC’s decision, we will not only actively implement the corrective actions put forth by the PIPC but also work to comply with the law and industry guidelines related to personal information processing.”

Talking to Kids About Privacy: Advice from a Panel of International Experts

Now more than ever, as kids spend much of their lives online to learn, explore, play, and connect, it is essential to ensure their knowledge and understanding of online safety and privacy keeps pace. On May 13th, the Future of Privacy Forum and Common Sense assembled a panel of youth privacy experts from around the world for a webinar, “Talking to Kids about Privacy,” exploring both the importance of and approaches to talking to kids about privacy. Watch a recording of the webinar here.

The virtual discussion, moderated by FPF’s Amelia Vance and Jasmine Park, aimed to provide parents and educators with tools and resources to facilitate productive conversations with kids of all ages. The panelists were Rob Girling, Co-Founder of strategy and design firm Artefact; Sonia Livingstone, Professor of Social Psychology at the London School of Economics and Political Science (LSE); Kelly Mendoza, Vice President of Education Programs at Common Sense; Anna Morgan, Head of Legal and Deputy Commissioner of the Irish Data Protection Commission (DPC); and Daniel Solove, Professor of Law at George Washington University Law School and Founder of TeachPrivacy.

The first thing that parents and educators need to know? “Contrary to popular opinion, kids really care about their privacy in their personal lives, and especially now, in their digital lives,” shared panelist Sonia Livingstone. “When they understand how their data is being kept, shared, monetized and so forth, they are outraged.” To help inform youth, Livingstone curated an online toolkit with young people to answer frequently asked privacy questions that emerged from her research. 

And a close second: their views about privacy are closely shaped by their environment. “How children understand privacy is in some ways colored by the part of the world they come from and the culture and ideas about family and ideas about institutions that they can trust, and especially how far the digital world has already become something they rely upon,” Livingstone added.

Kelly Mendoza encouraged audience members to start having conversations about privacy with kids at a young age, and to get beyond the common but too simple advice to not share personal information online. Common Sense’s Digital Citizenship Curriculum provides free lesson plans, organized by grade and topic, that address timely topics and prepare students to take ownership of their digital lives.

In her remarks, she also emphasized the important role that schools play in educating parents about privacy. “It’s important that schools and educators and parents work together because really we’re finding that schools can play a really powerful role in educating parents,” Mendoza said. “Schools need to do a better job of communicating – what tools are they using? How are they rated and reviewed? What are the privacy risks? And why are they using this technology?” A useful starting point for schools and parents is Common Sense’s Family Engagement Resources Toolkit, which includes tips, activities, and other resources.

Several panelists emphasized the critical role schools play in educating students about privacy. To do so effectively, schools must engage and educate teachers to ensure they are informed and equipped to have meaningful conversations about privacy with their students.

Anna Morgan provided a model for engaging children in informing data protection policies through classroom-based lesson plans. Recognizing that the General Data Protection Regulation (GDPR) and data protection law are complex, the DPC provided teachers with a quick-start guide offering background knowledge, enabling them to engage in discussions with children about their data protection rights and entitlements.

Privacy can be a difficult concept to explain, and there’s nothing quite like a creative demonstration to bring privacy concerns to life. One example: the DPC created a fictitious app to solicit children’s reactions to the use of their personal data. Through their consultation, Morgan shared that 60 percent of the children surveyed believed that their personal data should not be used to serve them with targeted advertising, finding it scary and creepy to have ads following them around. A full report from the consultation can be found here.

Daniel Solove also highlighted the need for educational systems to teach privacy. “Children today are growing up in a world where massive quantities of personal information are being gathered from them. They’re growing up in a world where they’re more under surveillance than any other generation. There’s more information about them online than any other generation. And the ability for them to put information online and get it out to the world is also unprecedented,” Solove noted. “So I think it’s very important that they learn about these things, and as a first step, they need to appreciate and understand the value of privacy and why it matters.”

One way for kids to learn about privacy is through storytelling. Solove recently authored a new children’s book about privacy titled THE EYEMONGER, and shared his motivations for writing the book with the audience. “There really wasn’t anything out there that explained to children what privacy was, why we should care about it, or really any of the issues that are involved in this space, so that prompted me to try to do something about it.” He also compiled a list of resources to accompany the book and help educators and parents teach privacy to children.

Building on the thread of creating outside-the-box interactive experiences to help kids understand privacy, Rob Girling shared with the audience a game called The Most Likely Machine, developed by Artefact Group to help preteens understand algorithms. Girling saw a need to teach algorithmic literacy given algorithms’ impact on children’s lives, from college and job application decisions to search engine results. For Girling, “It’s just starting to introduce the idea that underneath algorithms are human biases and data that is often biased. That’s the key learning we want kids to take away.”

Each of the panelists shared a number of terrific resources and recommendations for parents and educators, which we have listed and linked to below, along with a few of our own.

Watch the webinar in full here, and we hope you will use and share some of the excellent resources referenced below.

Rob’s Recommended Resources

Sonia’s Recommended Resources

Kelly’s Recommended Resources

Anna’s Recommended Resources

Dan’s Recommended Resources

Additional Future of Privacy Forum Resources of note:

FPF Ethical Data Use Committee will Support Research Relying on Private Sector Data

FPF has launched an independent ethical review committee to provide oversight for research projects that rely upon the sharing of corporate data with researchers. Whether researchers are studying the impact of platforms on society, supporting evidence-based policymaking, or understanding issues from COVID to climate change, personal data held by companies is increasingly essential to advancing scientific knowledge.

Companies want to be able to cooperate with researchers to use data and machine learning tools to drive innovation and investment, while ensuring compliance with data protection rules and ethical guidelines. To accomplish this, some companies are ramping up their internal ethical knowledge base and staff. However, reviewing high-risk, high-reward analytics projects in-house can be expensive, complex, and prone to accusations of favoritism or ethics-washing. Traditional academic Institutional Review Boards (IRBs) may consider corporate data previously collected for business uses to be out of scope of their review, creating a gap for independent expert ethical review.

Many of the projects that seek to expand human knowledge rely on insights derived from combinations of data and use of machine learning or other advanced data analysis techniques. Sharing data for research drives innovation but it may also create novel risks that must be responsibly considered.

The FPF Ethical Data Use Committee (EDUC) provides companies and their research partners with ethics review as a service. The EDUC will provide an independent expert review of proposed research data uses to help companies limit the risks of unintended outcomes or data-based discrimination. The committee will also help researchers ensure the ethical alignment of their uses of secondary data. As part of the review, the committee will provide specific recommendations for companies and researchers to implement that could mitigate the identified risks of individual and group or social harms. These reviews are particularly useful for many uses of data, including machine learning-based research, models, and systems.

The Committee – designed and developed with the generous support of Schmidt Futures and building on previous FPF work funded by the Alfred P. Sloan Foundation and the National Science Foundation – will include experts from a range of disciplines, including academic researchers, ethicists, technologists, privacy professionals, lawyers, and others. Before serving terms on the Committee, members will complete training on data protection and privacy, AI and analytics, applied ethics, and other topics beyond their own areas of expertise. Technical specialists will also be tapped for guidance on specific topic areas as required.

At this time, the Ethical Data Use Committee is preparing for final user-preference pilot testing. We are soliciting partners who aspire to be the first to use this system under cost conditions that will not be available once the review committee becomes fully operational. Companies and researchers participating in this final testing phase can do so confidentially and at no cost if they provide feedback on the process.

If you have a project that you think should be reviewed by the Ethical Data Use Committee or if you would like to recommend yourself or someone else as a member for the inaugural review term, please contact Dr. Sara Jordan at [email protected].

FPF Report Outlines Opportunities to Mitigate the Privacy Risks of AR & VR Technologies

A new report from the Future of Privacy Forum (FPF), Augmented Reality + Virtual Reality: Privacy & Autonomy Considerations in Emerging, Immersive Digital Worlds, provides recommendations to address the privacy risks of augmented reality (AR) and virtual reality (VR) technologies. The vast amount of sensitive personal information collected by AR and VR technologies creates serious risks to consumers that could undermine the adoption of these platforms and limit their utility.


“XR technologies are rapidly being adopted by consumers and increasingly being used for work and for education. It’s essential that guidelines are set to ensure privacy and safety while business models are being established,” said FPF CEO Jules Polonetsky. 

The report considers current and future use cases for XR technology, and provides recommendations for how platforms, manufacturers, developers, experience providers, researchers, and policymakers should implement XR responsibly, including: 

“XR technologies provide substantial benefits to individuals and society, with existing and potential future applications across education, gaming, architectural design, healthcare, and much more,” said FPF Policy Counsel and paper author Jeremy Greenberg. “XR technology systems often rely on biometric identifiers and measurements, real-time location tracking, and precise maps of the physical world. The collection of such sensitive personal information creates privacy risks that must be considered by stakeholders across the XR landscape in order to ensure this immersive technology is implemented responsibly.” 

The release of the report kicks off FPF’s XR Week of activities, happening from April 19th to 23rd. XR Week will explore key elements of the report in greater detail, including the differences between various immersive technologies, their use cases, important privacy and ethical questions surrounding XR technologies, compliance challenges associated with XR technologies, and how XR technology will continue to evolve. 

FPF’s featured XR Week event, AR + VR: Privacy & Autonomy Considerations for Immersive Digital Worlds, will include a conversation between FPF Policy Counsel Jeremy Greenberg and Facebook Reality Labs Director of Policy James Hairston, followed by a panel discussion with Magic Leap Senior Vice President Ana Lang, Common Sense Media Director of Platform Accountability and State Advocacy Joe Jerome, and behavioral scientist Jessica Outlaw. 

To register and learn more about FPF’s other XR Week events, read this blog post.

Supporting Responsible Research and Data Protection

Scientific research is often dependent on access to personal information, whether collected directly from individuals or collected for a real-world use and then accessed for research. For research to be trusted, processing of personal information must be lawful, ethical and subject to privacy and security protections. Supporting responsible research is a priority for FPF:

Access to Corporate Data & Ethical Review

Data held by companies is useful for researchers striving to discover new scientific insights and expand human knowledge. When corporations open their data stores and responsibly share this data with university researchers, they can support progress in medicine, public health, education, social sciences, computer science, and many other fields.

But access to the data needed is often unavailable due to a range of barriers – including the need to connect with appropriate partners, protect privacy, address commercial concerns, maintain ethical standards, and comply with legal obligations.

Issuing best practices and contract guidelines for companies sharing data with researchers. The Best Practices for Sharing Data with Academic Researchers were developed by the FPF Corporate Academic Data Stewardship Research Alliance, a group of more than two dozen companies and organizations. The best practices favor academic independence and freedom over tightly controlled research, and encourage broad publication and dissemination of research results, while protecting the privacy of individual research subjects. Specific best practices include having a written data sharing agreement, practicing data minimization, and developing a common understanding of relevant de-identification techniques, among many others. In addition, FPF published Contract Guidelines for Data Sharing Agreements Between Companies and Academic Researchers. The guidelines cover best practices and sample language that can be used in contracts with companies that supply data to researchers for academic or scientific research purposes. FPF’s Corporate Academic Data Stewardship Research Alliance and these resources, including FPF’s report, Understanding Corporate Data Sharing Decisions, were supported by the Alfred P. Sloan Foundation.

Establishing the Ethical Data Use Committee (EDUC). Through the generous support of Schmidt Futures, FPF is preparing to launch an independent ethical review panel to evaluate the risks and benefits of organizations’ data sharing projects with academic researchers. The Ethical Data Use Committee will conduct prospective reviews of research projects using data not explicitly gathered for research purposes, such as data shared by companies with academic researchers. The EDUC is designed to complement the rest of the research review process. The purpose of the EDUC review is to offer organizations recommendations to improve the privacy, security, and ethical profile of research data that is not subject to review by other components of the research review infrastructure, such as Institutional Review Boards or Institutional Biosafety Committees. 

This work builds on FPF’s project, Beyond IRBs: Designing Ethical Review Processes for Big Data Research, supported by the Alfred P. Sloan Foundation and U.S. National Science Foundation, which brought together government, industry, civil society, and researchers in law, ethics, and computer science to consider ethical review mechanisms for data collected in corporate, non-profit, and other non-academic settings.

Building Communities of Practice

Honoring effective data-sharing partnerships for research and sharing best practices. The FPF Award for Research Data Stewardship is a first-of-its-kind award recognizing a research partnership in which a company has shared data with an academic institution in a responsible, privacy-protective manner. The 2020 award-winning partnership was between University of California, Irvine, Professor of Cognitive Science Dr. Mark Steyvers and Lumos Labs. In an FPF virtual event on September 22, 2020, Professor Steyvers and Bob Schafer, General Manager at Lumosity, discussed their award-winning collaboration and lessons learned for future data sharing partnerships between companies and academic researchers. The annual FPF Award for Research Data Stewardship is supported by the Alfred P. Sloan Foundation.

FPF has continued this award, is currently reviewing submissions, and looks forward to announcing a 2021 winner in early summer.

Bringing the best academic privacy research into practice. Through its Applied Privacy Research Coordination Network, a project supported by the U.S. National Science Foundation, FPF introduces academic researchers to industry practitioners to develop working partnership opportunities and share best practices. This project builds on FPF’s first NSF-supported Research Coordination Network established to foster industry-academic collaboration on priority research issues identified in the National Privacy Research Strategy (NPRS) and inform the public debate on privacy. These projects have provided ongoing support to FPF’s Privacy Papers for Policymakers program which brings academic expertise to members of Congress and leaders of executive agencies and their staffs to better inform policy approaches to data protection issues.

Providing governments and researchers tools and guidance for evidence-based policymaking. Integrated Data Systems (IDS) use data that government agencies routinely collect in the course of delivering public services to shape local policy and practice. FPF and Actionable Intelligence for Social Policy (AISP) created the Nothing to Hide: Tools for Talking (and Listening) About Data Privacy for Integrated Data Systems toolkit to provide stakeholders with tools to lead privacy-sensitive, inclusive government IDS efforts. In addition, FPF worked with the Administrative Data Research Facilities Network (ADRF) to develop a guide for researchers and practitioners who want to share administrative data for evidence-based policy and social science research. FPF’s paper Privacy Protective Research: Facilitating Ethically Responsible Access to Administrative Data, published in The Annals of Political and Social Science, Vol. 675 (2018), outlines the infrastructures that will need to be built to make sure data providers and empirical researchers can best serve national policy needs. FPF’s work on administrative data research was made possible by the support of the Alfred P. Sloan Foundation.

Exploring Legal Structures and Policies to Support Processing Personal Data for Research

Hosting expert discussions about processing personal data for research under the GDPR. The topic of the Brussels Privacy Symposium 2020, organized by FPF and the Brussels Privacy Hub of Vrije Universiteit Brussel (VUB), was “Research and the Protection of Personal Data Under the GDPR.” The symposium, which brought together a mix of industry practitioners, academic researchers, policymakers, and international data protection regulators, focused on striking a balance during the Covid-19 pandemic between the utility of research, on one hand, and the rights to privacy and data protection on the other. Panelists discussed strategies to mitigate risks to data protection in scientific research, including vulnerabilities related to AI and machine learning systems; consent structures; and the role of international frameworks and cross-border data flows. In a closing keynote, European Data Protection Supervisor Wojciech Wiewiórowski discussed the need to intensify the dialogue between Data Protection Authorities and ethical review boards to develop a common understanding of what qualifies as scientific research, and on codes of conduct for it. 

Examining country-level legal frameworks for secondary uses of healthcare data. On January 19-20, 2021, the Israel Tech Policy Institute (ITPI), an FPF affiliate based in Israel, co-hosted a virtual workshop in collaboration with the Organization for Economic Cooperation and Development (OECD) and the Israel Ministry of Health (IMoH), titled “Supporting Health Innovation with Fair Information Practice Principles.” The workshop furthered international dialogue on issues critical for the successful use of health data for the benefit of the public, focusing on the implementation of privacy protection principles and the challenges that arise in the process. The discussion included lessons learned during Covid-19. It provided an opportunity for delegates of the OECD Health group (HCQO) and the OECD Data Governance and Privacy in the Digital Economy group (DGP), together with experts in these fields, to discuss progress made toward implementing the 2017 OECD Recommendation on Health Data Governance, and to contribute to the ongoing review of the 2013 OECD Privacy Guidelines. Specific topics discussed included: 

The workshop was attended by delegates from approximately 40 governments from all over the world, as well as industry and academia participants.  

In conjunction with the OECD event, FPF and the Israel Tech Policy Institute have conducted a study (to be published soon) on the laws underpinning secondary uses of healthcare data for research purposes in eight countries: Australia, England, Finland, France, India, Ireland, Israel, and the U.S. We found broad commonalities across legal systems and regimes, which permit secondary use of healthcare data for research purposes under certain conditions, such as review by ethics boards, proper de-identification, and other administrative, technical, and contractual safeguards. Still, differences and ambiguities remain around specific situations, such as the use of ‘consent’ or other legal bases allowing data processing, the level of anonymization and de-identification employed and how it is regarded in different countries, and the variety of approaches to transborder data flows and data localization requirements.

Guidance to government, companies and civil society on responsible data sharing in a public health crisis. FPF launched its Privacy & Pandemics series immediately after the COVID-19 pandemic began to provide information and guidance to governments, companies, academics and civil society on responsible data sharing to support public health. As a featured part of the series, FPF’s Corporate Data Sharing Workshop on March 26, 2020 convened ethicists, academic researchers, government officials and corporate leaders to discuss best practices and policy recommendations for responsible data sharing. FPF’s international tech & data conference in October 2020, presented in collaboration with the US National Science Foundation, Duke Sanford School of Public Policy, SFI ADAPT Research Centre, Dublin City University, and Intel Corporation, produced a roadmap for research, practice improvements, and development of privacy-preserving products and services to further inform responses to COVID-19 and prepare for future pandemics and crises.

Summarizing U.S. federal and state laws that apply to health data research. As a resource for policymakers, researchers, and ethicists, FPF canvassed federal and state laws and regulations regarding health data research. Regulations like the Common Rule include a wide range of protections, but only apply to certain situations, while other safeguards are triggered by high-stakes research or particularly sensitive categories of data or vulnerable research subjects.

Educating policymakers on the value of data for research and strategies for oversight. FPF has shared model bill language with lawmakers developing comprehensive privacy laws in California, Washington, and Virginia to encourage them to both protect data-driven research and create oversight by requiring it to be approved, monitored, and governed by an independent oversight entity.

Exploring how the GDPR can work for health scientific research. On October 22, 2018, FPF, together with the European Federation of Pharmaceutical Industries and Associations (EFPIA) and the Centre for Information Policy Leadership (CIPL), hosted a workshop in Brussels, “Can GDPR Work for Health Scientific Research?,” to discuss the processing of personal data for health scientific research purposes under the European Union’s General Data Protection Regulation (GDPR). The workshop identified several challenges that researchers are facing when trying to comply with the GDPR, such as identifying the appropriate lawful ground for processing personal data for clinical trials and for secondary use of health data for health scientific research purposes, the relationship between the EU Clinical Trials Regulation and the GDPR, and the lack of clarity surrounding institutional responsibility and the role of ethical committees. 

Providing guidance to US-based higher education institutions on how to align their research and educational activities to the GDPR. In May 2020, FPF released “The General Data Protection Regulation: Analysis and Guidance for US Higher Education Institutions.” The report includes a 10-step checklist with instructions for executing an effective GDPR compliance program. Many of the case studies and examples used in the report focus on academic research. It is designed to assist both organizations with established compliance programs seeking to update or refresh their understanding of their obligations under GDPR, as well as those that are still in the process of creating or sustaining a compliance structure and seeking more in-depth guidance.

Advancing tools to support responsible research in artificial intelligence. To facilitate discussions around bias in artificial intelligence, FPF produced a framework to identify, articulate, and categorize the types of harm that may result from automated decision-making; see Unfairness by Algorithm: Distilling the Harms of Automated Decision-Making (December 2017). FPF has recently provided resources and guidance to state policymakers on this topic.

Sharing methods and techniques for de-identification. FPF is recognized for its signature expertise in de-identification, publishing A Visual Guide to De-Identification (April 2016), as well as law review articles like Shades of Gray: Seeing the Full Spectrum of Practical Data De-identification, 56 Santa Clara L. Rev. 593 (2016).
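As a rough illustration of the “spectrum” of practical de-identification that these publications describe, the sketch below shows one intermediate point: replacing a direct identifier with a salted hash and generalizing quasi-identifiers. The field names, the salt handling, and the generalization choices are hypothetical and are not taken from the cited papers.

```python
import hashlib

# A single, hypothetical raw record held by a company.
record = {"user_id": "u-4821", "zip": "94105", "age": 34, "visits": 7}

def pseudonymize(rec: dict, salt: str) -> dict:
    """Swap the direct identifier for a keyed hash and coarsen quasi-identifiers."""
    token = hashlib.sha256((salt + rec["user_id"]).encode()).hexdigest()[:12]
    return {
        "user_token": token,                         # re-linkable only by whoever holds the salt
        "zip3": rec["zip"][:3],                      # generalize ZIP code to three digits
        "age_band": f"{(rec['age'] // 10) * 10}s",   # 34 -> "30s"
        "visits": rec["visits"],
    }

print(pseudonymize(record, salt="keep-this-secret"))
```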

Facilitating ethically responsible access to administrative data for privacy protective research. A paper titled Privacy Protective Research: Facilitating Ethically Responsible Access to Administrative Data was featured at a Bill & Melinda Gates Foundation-funded workshop, along with other white papers written by researchers and practitioners, to help inform the development of a roadmap identifying the data infrastructures that need to be built to ensure that data providers and empirical researchers can best serve national policy needs. The paper – by FPF CEO Jules Polonetsky, FPF Senior Fellow Omer Tene, and Alfred P. Sloan Foundation Vice President and Program Director Daniel Goroff – provides strategies for organizations to minimize risks of re-identification and privacy violations for individual data subjects.

Manipulative UX Design & the Role of Regulation: Event Highlights


On March 24, FPF hosted “Dark Patterns”: Manipulative UX Design and the Role of Regulation. So-called “dark patterns” are user interface design choices that benefit an online service by coercing, manipulating, or deceiving users into making unintended or potentially harmful decisions. The event provided a critical examination of the ways in which manipulative interfaces can limit consumer choice and explored how regulation of manipulative designs continues to expand – from California’s recent Attorney General regulations, to the California Privacy Rights Act, to other state and federal privacy bills. Participants also discussed whether truly neutral design is ever possible, and the differences between acceptable persuasion (such as in advertising) and manipulation, coercion, and deception. 

The event, moderated by FPF Senior Counsel Stacey Gray, began with a survey of legislative proposals that would regulate manipulative user interface design choices. Gray highlighted several prominent state privacy laws, including the California Consumer Privacy Act (CCPA) and California Privacy Rights Act (CPRA), which define and address dark patterns in certain contexts, such as in the California Attorney General’s regulations for the design of “opt-out of sale” mechanisms for personal data collection and use. Gray also addressed relevant legislative proposals at the state and federal level – the Washington Privacy Act (SB 5062), CA SB 980, and the SAFE DATA Act (S. 4626) – that explicitly define or create regulations around manipulative design choices. Finally, Gray explained that manipulative design is an “ongoing focus” of the Federal Trade Commission, citing past enforcement actions related to manipulative user interface design choices and referencing the FTC’s upcoming April 29 workshop, “Bringing Dark Patterns to Light.”

Dr. Jennifer King, Privacy and Data Policy Fellow at the Stanford Institute for Human-Centered Artificial Intelligence, provided the keynote presentation, which defined dark patterns and examined the contexts they target, how they work, common types and examples, and key considerations for policymakers and regulators. Specifically, Dr. King recommended that lawmakers consider the following questions: 

Following Dr. King’s address, the event moved to a panel discussion with Mihir Kshirsagar, Clinic Lead for Princeton’s Center for Information Technology Policy, Tanya Forsheit, Chair of the Privacy & Data Security Group at Frankfurt Kurnit Klein + Selz, as well as Gray and Dr. King. Together, the panel considered manipulative design from legal, policy, and technology perspectives, providing insightful answers to questions from the audience. 

Gray closed the event by noting that manipulative design will continue to be a focus for FPF, previewing future convenings on manipulative design under EU and global law and in specific contexts, such as in online products and services for children and teens. 

To learn more, watch the recording of the event.

Event Report: Brussels Privacy Symposium 2020 – Research and the Protection of Personal Data Under the GDPR

On December 2, 2020, the Future of Privacy Forum (FPF) and the Brussels Privacy Hub of Vrije Universiteit Brussel (VUB) hosted the Brussels Privacy Symposium 2020: Research and the Protection of Personal Data Under the GDPR. The event, convened by FPF CEO Jules Polonetsky and Dr. Christopher Kuner, Co-Chair of the Brussels Privacy Hub, brought together industry privacy leaders, academic researchers, and regulators to discuss data protection in the context of scientific research under the European Union’s General Data Protection Regulation (GDPR) from various policy and technical perspectives. A new report from FPF’s Caroline Hopland, Hunter Dorwart, Dr. Gabriela Zanfir-Fortuna, and Dr. Rob van Eijk, as well as Dr. Gianclaudio Malgieri, Associate Professor at the EDHEC Augmented Law Institute, summarizes and offers context to the discussions at the event.

The 2020 Brussels Privacy Symposium was the fourth annual academic program jointly presented by VUB and FPF. Notably, the panelists emphasized risks and vulnerabilities with respect to data protection in the scientific research context, highlighting issues with consent structures and with artificial intelligence (AI) and machine learning systems during the Covid-19 pandemic, as well as the difficulty of defining sensitive data, the privacy enhancing technologies applied to research datasets and how they may affect efforts to identify bias, and the role of international frameworks and cross-border data flows in facilitating or hindering research outcomes. 

The Symposium also brought into focus recent developments in EU policymaking that may have significant effects on processing personal data for research purposes. One of the relevant legislative proposals recently introduced by the European Commission is the Data Governance Act (DGA), which “aims to foster the availability of data for use by increasing trust in data intermediaries and by strengthening data-sharing mechanisms across the EU.” It also proposes to promote “data altruism,” allowing researchers access to larger datasets for their research. Overall, the Symposium focused on striking a balance between utility of research and privacy and data protection. 

The keynote speakers included: 

The first panel explored Complex Interactions: the GDPR, Data Protection and Research, and was moderated by Dr. Gianclaudio Malgieri, Associate Professor EDHEC Augmented Law Institute (Lille) and Affiliated Researcher LSTS VUB. Speakers on the panel included: 

The second panel discussed Using Sensitive Data in Research to Counter (Hidden) Bias and Discrimination, and was moderated by Dr. Gabriela Zanfir-Fortuna, Senior Counsel FPF and Affiliated Researcher LSTS VUB. Speakers included: 

To learn more, read the report.

If you have any questions about the Report, contact Dr. Gabriela Zanfir-Fortuna at [email protected] or Dr. Rob van Eijk at [email protected].

Understanding Interconnected Local and Global Data Flows

International data flows have been top of mind in the past year for digital rights advocates, companies, and regulators, particularly international transfers following the Schrems II judgment of the Court of Justice of the EU last July. As data protection authorities assess how to use technical safeguards and contractual measures to support data flows while ensuring the protection of the rights and freedoms of individuals, it is essential to understand the interconnectedness of today’s highly digitized environment and globalized relationships, so that guidance can be as effective as possible.

Here, we explore the complexity of international data flows in two distinct contexts that affect people’s daily lives regardless of where they live, especially during a pandemic that has moved much of daily life online: (I) how they shop (retail) and (II) how they engage with education services (education technology, or EdTech). We provide an infographic for each, with notes to help readers understand the actors involved and the complexities of the data flows between them, recognizing that the systems being used and the actors involved are very often established in different jurisdictions.

Click here to download the 4-page (PDF) Infographic.


I. Understanding Retail Data Flows

The first infographic presents a highly simplified visual for a retailer. Data flows are complex for even small and medium-sized organizations, with partners and vendors commonly located in multiple jurisdictions. A retailer is likely to use a number of different cloud-based service providers to support consumer transactions. Each of these service providers may be located in a single jurisdiction, yet collectively they are geographically dispersed. These service providers often use other service providers, and are themselves geographically distributed and interconnected.

One of the essential services provided to a retailer is payment processing, which will involve:

Click here to download the 4-page (PDF) Infographic.


II. Understanding EdTech Data Flows

The second infographic presents a highly simplified visual of the data flows for education technology for schools and universities. Cloud-based services support a wide range of programs used by teachers, students, and administrators in this sector.

Schools and universities increasingly rely on EdTech applications to help educate their students. This includes online classroom/video call collaboration tools, applications to inform parents and students about important developments, learning management systems and learning content providers. Most of these providers rely on a global network of subsidiaries to support, maintain and secure their product 24/7 as well as on other service providers that deliver hosting and other specialist services. While applications and personal data of students are often hosted regionally, these subsidiaries and vendors will require access to the data for the delivery of the service.

Universities and schools will also often rely on (cloud-based) vendors to fulfill their tasks. For example:

For further information or to provide comments or suggestions, please contact Dr. Rob van Eijk ([email protected]) or Dr. Gabriela Zanfir-Fortuna ([email protected]).

The full list of FPF’s infographics can be accessed on the FPF website at fpf.org/publication-type/infographic/.

To learn more about FPF in Europe, please visit fpf.org/eu.

Acting FTC Chairwoman Slaughter Highlights Priorities in Privacy Papers for Policymakers Event Keynote


The Future of Privacy Forum’s 11th-annual Privacy Papers for Policymakers event – the first event in the series to take place virtually – was a success! This year’s event featured a keynote speech by Acting FTC Chairwoman Rebecca Kelly Slaughter and facilitated discussions between the winning authors – Amy B. Cyphert, Clarisse Girot, Brittan Heller, Tiffany C. Li, Kenneth Propp, Peter Swire, and Lauren H. Scholz – and leaders from the academic, industry, and policy landscape, including Elana Zeide, Anupam Chander, Joan O’Hara, Jared Bomberg, Alex Joel, and Syd Terry. 

Acting FTC Chairwoman Rebecca Kelly Slaughter provided the keynote address at PPPM 2021.

In her keynote address, which was also her first major speech as acting chair of the Federal Trade Commission, Acting FTC Chairwoman Slaughter outlined three of her major privacy-related priorities for the Commission: 

1. Making enforcement more efficient and effective. Acting Chairwoman Slaughter observed how the COVID-19 pandemic has “only amplified the need for strong legislation at the federal level, as it has pushed more of our work, our children’s education, and even our personal interactions online, exacerbating data risks.” She explained that the FTC would need to “think creatively” to make enforcement efforts more effective without the benefit of a federal privacy law. She posed the following guiding questions to center the discussion around enforcement: 

She specifically flagged two types of relief that she believes the FTC is well-positioned to seek and achieve: meaningful disgorgement and effective consumer notice. She also stated that the FTC needs to “think carefully about the overlap between our work in data privacy and in competition,” arguing that the FTC’s dual missions to protect privacy and maintain competitive markets “can and should be complementary” as the Commission looks at problems that arise in digital markets through both privacy and competition lenses. 

2. Protecting privacy during the pandemic. Acting FTC Chairwoman Slaughter acknowledged that responding to COVID-19 requires an “all-hands approach” and that the FTC has several important roles to play as part of the solution, including addressing issues ranging from “COVID-related scams to privacy and security issues to an economic crisis.” She identified three key focus areas: 

3. Racial equity concerns in data use and abuse. Acting FTC Chairwoman Slaughter asked: “How can we at the FTC engage in the ongoing nationwide work of righting the wrongs of four hundred years of racial injustice?” She identified four key areas where the FTC could approach racial equity from the consumer protection angle: 

You can read Acting FTC Chairwoman Slaughter’s full remarks at PPPM 2021 on the FTC website.

Following Acting Chairwoman Slaughter’s keynote address, the event turned to moderated discussions between the authors of the award-winning papers and leaders from the academic, industry, and policy landscapes. Click the links below to read each of the winning papers, or see the 2021 PPPM Digest for summaries of the papers and more information about the authors and judges. 

Thank you to Acting FTC Chairwoman Slaughter and to Honorary Co-Hosts Senator Edward Markey and Congresswoman Diana DeGette for their support and work around this event. We would also like to thank our winning authors, discussants, everyone who submitted papers, and our event attendees for their thought-provoking work and support. Learn more about the event on the FPF website and watch a recording of the event on the FPF YouTube channel.

Images from the 2021 Privacy Papers for Policymakers Event

FPF Law & Policy Fellow Marcus Dessalgne, the FPF lead on the PPPM project and emcee for the event.
Amy B. Cyphert presents her paper, Tinker-ing with Machine Learning: The Legality and Consequences of Online Surveillance of Students.
Clarisse Girot presents her paper, Transferring Personal Data in Asia: A Path to Legal Certainty and Regional Convergence.
Brittan Heller presents her paper, Reimagining Reality: Human Rights and Immersive Technology.
Tiffany C. Li presents her paper, Privacy in Pandemic: Law, Technology, and Public Health in the COVID-19 Crisis.
Kenneth Propp and Peter Swire present their paper, After Schrems II: A Proposal to Meet the Individual Redress Challenge.
Lauren Henry Scholz presents her paper, Fiduciary Boilerplate.

CPDP2021 Event Recap: Bridging the Standards Gap

On January 27, 2021, the Institute of Electrical and Electronics Engineers (IEEE) hosted a panel at the 14th Annual International Computers, Privacy, and Data Protection Conference (CPDP2021). The theme for this year’s online conference was “Enforcing Rights in a Changing World.” Rob van Eijk, FPF Managing Director for Europe, moderated the panel “Technical Standards Bringing Together Data Protection with Telecommunications Regulation, Digital Regulations, and Procurement.”

The recording of this panel is available here.

The panelists discussed the role of technical standards in ensuring the systematic application of data protection principles across policy areas. Two examples are Article 25 GDPR and Recital 78, which stipulate data protection by design and by default. Another example is Article 21(5) of the GDPR, which stipulates that in the context of the use of information society services, and notwithstanding Directive 2002/58/EC, the data subject may exercise his or her right to object by automated means using technical specifications.
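As a purely illustrative aside, one way such an automated objection can be expressed in practice is as a browser-sent request header, in the style of the Global Privacy Control proposal. The header name and the simple server-side check sketched below are assumptions for illustration only; they were not discussed by the panel.

```python
def honors_objection_signal(headers: dict) -> bool:
    """Return True when the request carries an automated objection/opt-out signal."""
    return headers.get("Sec-GPC") == "1"

request_headers = {"User-Agent": "ExampleBrowser/1.0", "Sec-GPC": "1"}
if honors_objection_signal(request_headers):
    # A real service would suppress the objected-to processing (e.g., profiling
    # or direct marketing) for this request rather than just print a message.
    print("Automated objection signal detected.")
```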

Technical standards seek to ensure that engineers can effectively apply privacy and data protection principles in the design and the (default) configuration of technology placed on the market. Therefore, a key question for the panel discussion was: How to bridge the gap between releasing technical standards and embedding them into products and services available on the market?

Facilitating Dialogue and Collaboration Between Policymakers and Technologists

Paul Nemitz, Principal Advisor in the Directorate-General for Justice and Consumers at the European Commission, started the discussion with a precise observation. He argued that building a closer, collaborative relationship between engineers and policymakers is a critical step to bridge the gap between data protection policies, technical standards, and practical application. He called on the technical intelligentsia to bring their knowledge to the policymaking process, stating that democracy needs the engagement of the engineers. Paul also expressed that policymakers should hold out their hands and open their doors to bring in those who know the technology. He identified convincing those with the technical knowledge to also engage in the formal lawmaking process as one of the challenges of our time.

He clarified that he was not claiming engineers know how to make policies better than policymakers. Instead, he defined specific roles for each group based on their areas of expertise and working languages. The technologists’ part is to inform and educate policymakers to be able to shape laws that are technology-neutral and do not have to be updated every two months. According to Paul, while engineers are adept at writing code, code is written for machines that cannot think for themselves. However, the law is written for people, not computers, and policymakers are best equipped for this language. Paul also sees the relationship as a two-way street where technologists can bring their technological knowledge to the rulemaking process then go back to their spaces with a better understanding of democracy.

Clara Neppel, Senior Director of IEEE European Business Operations, shared that all IEEE standardization bodies have a government engagement program where governments can inform the IEEE standardization organization about their policy priorities. That allows software engineers to take into account upcoming legislative measures.

Amelia Andersdotter, Dataskydd.net and Member of the European Parliament 2011-2014, stressed the importance of ensuring that standards – once set – are transparent and effectively communicated to everyone in the ecosystem, i.e., from procurement bodies to end-users. For instance, when the public sector seeks to build a public WiFi network or a company designs a website, those demanding the products should be aware of the frameworks in place that protect human rights and data, as well as the right questions to ask of those further up in the value chain (see also the FPF & Dataskydd.net Webinar – Privacy in High-Density Crowd Contexts). The IEEE P802E working group has been drafting Recommended Practices for Privacy Considerations for IEEE 802 Technologies (P802E), which contains recommendations and checklists for developers of IEEE 802 technologies.

Panelists left to right, top row: Rob van Eijk, Paul Nemitz, Clara Neppel; bottom row: Amelia Andersdotter, Mikuláš Peksa. We remark that Francesca Bria (President of the Italian National Innovation Fund) contributed to the preparation of the panel discussion. 

Privacy versus utility

According to Clara, key challenges faced when implementing privacy include the lack of common definitions, the tradeoff between privacy and functionality of the products or services, and reaching the right balance between social values embedded in data governance and technical measures. Clara pointed out that privacy can have different meanings for different parties. For example, a user may understand privacy as meaning no data is shared, only anonymous data is shared, or being able to define different access levels for different types of data. On the other hand, companies may interpret privacy as solely being compliant with relevant laws or as a true value proposition for their customers.

Amelia also identified the lack of specification of data protection principles or requirements in legislation as leading to confusion and inefficiencies for engineers and companies. She cited the case of the European Radio Equipment and Telecommunications Terminal Equipment Directive (R&TTE), adopted in 1999, which includes data protection as an essential requirement for radio technologies. However, what this requirement actually means has never been specified, resulting in companies being unable to assess whether their technologies meet the requirement. She expressed that Europe is good at establishing both the high-level values and low-level practical architectures, but could improve if regulators filled the gap and created a framework for assessing privacy.

Amelia also suggested that the diversity and flexibility of technical standards may encourage more experimentation and optimization as organizations choose which standards work best for their specific technologies and in their specific contexts.

Rob added that technologists need bounded concepts with definitions embedded in law, so they are clear and no longer open-ended. Then, the real work of figuring out how the technology should be designed to protect privacy can take place.

Rulemaking: Looking to the Future Rather Than in the Rear-View Mirror

Mikuláš Peksa, Member of the European Parliament, expressed skepticism with the European Parliament micromanaging technical standards. Instead, he leaned towards Parliament conveying basic values that protect human rights but not shaping the standards themselves.

Mikuláš contended that politicians may talk about past problems and, due to structural and procedural factors, politics always reacts with a certain delay. He stated that human society is in another Industrial Revolution where all human activities are changing. Therefore, society needs to adapt to be able to absorb and digest the changes. He referred to the example of online spaces for conversations being monopolized by a single company. He questioned whether this is the model society really wants and suggested exploring the idea of a decentralized architecture for social networks, like that of email services, to protect against centralized control.

Paul explained that data protection faces the classic syndrome of an ‘invisible risk’ due to the fact that people are unable to see themselves as victims. Where people fail to see the ‘invisible risk’, such as with atomic power, smoking, or data processing, politics arrives far too late. Looking forward, Mikuláš acknowledged that there is no single silver bullet to solve all of the problems. Therefore, Europe needs a political organization that incorporates the technical intelligentsia and finds a desirable general direction to drive towards.

Next steps

In a tour-de-table, the panelists presented their answers and next steps for bridging the gap between releasing technical standards and embedding them into products and services available on the market.

Clara remarked that we may need policymakers to engage in technical considerations like standard setting, as well as provide legal certainty by clarifying questions around data control and correction. We may also need new standards, certifications, and good mechanisms to address special needs in specific contexts. For instance, when it comes to synthetic data, which is estimated to account for 40 percent of all training data within two years, we lack metrics for measuring its quality in terms of privacy and accuracy.

Amelia argued that in addition to the work done by consumer protection authorities, data protection authorities, and telecommunications regulators, one could also simply take the initiative within a company or a space. There are already industry efforts to ensure technologies are more privacy-sensitive and robust. Rome wasn’t built in a day, and she doesn’t think our infrastructures will be either, but we can take small steps in the right direction and be in a better place in two or three years than we are now.

Mikuláš stated that there is no one silver bullet that will resolve all of the problems. He argued that we need a political organization that will incorporate technical intelligentsia, an organization that will be able to steer the general direction of building this new information society. Without it, we will not be able to introduce the standards in a proper way.

In closing, Paul called upon people who deeply understand the intricacies of technology to engage with the democratic legislative process of law making. In this way, they may be able to not only bring technological knowledge to the rulemaking process but also incorporate a better understanding of the realities of democracy into code.

FPF contributions to the privacy debate at CPDP2021


This year, FPF contributed to seven panels. The FPF team participated in the following panels (in alphabetical order):

Dr. Carrie Klein (panelist), Ready for a crisis: accelerated digitalization in education (organized by VUB Data Protection on the Ground);
Dr. Gabriela Zanfir-Fortuna (moderator), US privacy law: the beginning of a new era (organized by FPF);
Dr. Gabriela Zanfir-Fortuna (panelist), Augmented compliance: the case of Algorithmic Impact Assessment (organized by EPIC);
Dr. Gabriela Zanfir-Fortuna (panelist), International Data Transfers: What shall we do to avoid Schrems III? (organized by NOYB);
Jasmine Park (moderator) & Amelia Vance (panelist), Panel on Global Youth Privacy: Amplifying youth needs and voices (organized by Privacy Salon);
Jules Polonetsky (moderator), EU Digital Strategy, a holistic vision for a digital Europe (organized by CPDP);
Dr. Rob van Eijk (moderator), Technical standards bringing together data protection with telecommunications regulation, digital regulations and procurement (organized by IEEE).

To learn more about FPF in Europe, please visit fpf.org/about/eu.