South Korea: The First Case Where the Personal Information Protection Act was Applied to an AI System

As AI regulation is being considered in the European Union, privacy commissioners and data protection authorities around the world are starting to apply existing comprehensive data protection laws to AI systems and the way they process personal information. On April 28th, the South Korean Personal Information Protection Commission (PIPC) imposed sanctions and a fine of KRW 103.3 million (USD 92,900) on ScatterLab, Inc., developer of the chatbot “Iruda,” for eight violations of the Personal Information Protection Act (PIPA). This is the first time PIPC has sanctioned an AI technology company for indiscriminate personal information processing.

“Iruda” caused considerable controversy in South Korea in early January after complaints that the chatbot used vulgar and discriminatory language, including racist, homophobic, and ableist remarks, in conversations with users. The chatbot, which assumed the persona of a 20-year-old college student named “Iruda” (Lee Luda), attracted more than 750,000 users on Facebook Messenger within less than a month of its release. The media reports prompted PIPC to launch an official investigation on January 12th, soliciting input from industry, law, academia, and civil society groups on personal information processing and on legal and technical perspectives on AI development and services.

PIPC’s investigation found that ScatterLab used messages from KakaoTalk, a popular South Korean messaging app, collected through its apps “Text At” and “Science of Love” between February 2020 and January 2021, to develop and operate its AI chatbot “Iruda.” Around 9.4 billion KakaoTalk messages from 600,000 users were used to train the “Iruda” AI model, without any effort by ScatterLab to delete or encrypt users’ personal information, including names, mobile phone numbers, and addresses. Additionally, 100 million KakaoTalk messages from women in their twenties were added to the response database, with “Iruda” programmed to select and respond with one of these messages.
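
PIPC’s findings turn on the absence of basic de-identification before the messages were reused. As a rough, hypothetical illustration of that kind of pre-processing step (not ScatterLab’s actual pipeline), the sketch below masks Korean-style mobile phone numbers in message text before it is used for training; names and addresses would require far more than pattern matching, such as named-entity recognition and manual review.

```python
import re

# Hypothetical illustration only, not ScatterLab's pipeline: mask strings that
# look like Korean mobile numbers (e.g. 010-1234-5678) before messages are
# reused as training data. Regex masking alone is not adequate de-identification;
# names, addresses, and contextual identifiers need dedicated techniques.
PHONE_PATTERN = re.compile(r"\b01[016789][-\s]?\d{3,4}[-\s]?\d{4}\b")

def mask_phone_numbers(message: str) -> str:
    """Replace anything that looks like a mobile phone number with a placeholder."""
    return PHONE_PATTERN.sub("[PHONE]", message)

print(mask_phone_numbers("Call me at 010-1234-5678"))  # -> "Call me at [PHONE]"
```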

With regard to ScatterLab’s use of users’ KakaoTalk messages to develop and operate “Iruda,” PIPC found that including a “New Service Development” clause in the terms of service for logging into the apps “Text At” and “Science of Love” did not amount to users’ “explicit consent.” The description of “New Service Development” was determined to be insufficient for users to anticipate that their KakaoTalk messages would be used to develop and operate “Iruda.” PIPC therefore determined that ScatterLab processed users’ personal information beyond the purpose of collection.

In addition, ScatterLab posted its AI models on the code sharing and collaboration platform GitHub from October 2019 to January 2021; these included 1,431 KakaoTalk messages revealing 22 names (excluding last names), 34 locations (excluding districts and neighborhoods), gender, and relationships (friends or romantic partners) of users. This was found to violate PIPA Article 28-2(2), which states, “A personal information controller shall not include information that may be used to identify a certain individual when providing pseudonymized information to a third party.”

ScatterLab also faced accusations of collecting the personal information of more than 200,000 children under the age of 14 without parental consent in the development and operation of its app services “Text At,” “Science of Love,” and “Iruda,” as its services did not require age verification at sign-up.

PIPC Chairman Jong-in Yoon highlighted the complexity of the case at hand and the reasons why extensive public consultation took place as part of the proceedings: “Even the experts did not agree so there was more intense debate than ever before and the ‘Iruda’ case was decided after very careful review.” He explained, “This case is meaningful in that it has made clear that companies are prohibited from indiscriminately using personal information collected for specific services without clearly informing and obtaining explicit consent from data subjects.” Chairman Yoon added, “We hope that the results of this case will guide AI technology companies in setting the right direction for the processing of personal information and provide an opportunity for companies to strengthen their self-management and supervision.”

PIPC plans to be active in supporting compliant AI systems

PIPC also stated that it seeks to help AI technology companies strengthen their privacy capabilities by presenting a “Self-Checklist for Personal Information Protection of AI Services” for AI developers and operators and by supporting on-site consulting. PIPC plans to actively support AI technology companies so that AI- and data-based industries can develop while people’s personal information is protected.

ScatterLab responded to the decision: “We feel a heavy sense of social responsibility as an AI tech company regarding the necessity to engage in proper personal information processing in the course of developing related technologies and services.” The company added, “Upon the PIPC’s decision, we will not only actively implement the corrective actions put forth by the PIPC but also work to comply with the law and industry guidelines related to personal information processing.”

Talking to Kids About Privacy: Advice from a Panel of International Experts

Now more than ever, as kids spend much of their lives online to learn, explore, play, and connect, it is essential to ensure that their knowledge and understanding of online safety and privacy keeps pace. On May 13th, the Future of Privacy Forum and Common Sense assembled a panel of youth privacy experts from around the world for a webinar, “Talking to Kids about Privacy,” exploring both the importance of and approaches to talking to kids about privacy. Watch a recording of the webinar here.

The virtual discussion, moderated by FPF’s Amelia Vance and Jasmine Park, aimed to provide parents and educators with tools and resources to facilitate productive conversations with kids of all ages. The panelists were Rob Girling, Co-Founder of strategy and design firm Artefact, Sonia Livingstone, Professor of Social Psychology at the London School of Economics and Political Science (LSE), Kelly Mendoza, Vice President of Education Programs at Common Sense, Anna Morgan, Head of Legal and Deputy Commissioner of the Irish Data Protection Commission (DPC), and Daniel Solove, Professor of Law at George Washington University Law School and Founder of TeachPrivacy.

The first thing that parents and educators need to know? “Contrary to popular opinion, kids really care about their privacy in their personal lives, and especially now, in their digital lives,” shared panelist Sonia Livingstone. “When they understand how their data is being kept, shared, monetized and so forth, they are outraged.” To help inform youth, Livingstone curated an online toolkit with young people to answer frequently asked privacy questions that emerged from her research. 

And a close second: their views about privacy are closely shaped by their environment. “How children understand privacy is in some ways colored by the part of the world they come from and the culture and ideas about family and ideas about institutions that they can trust, and especially how far the digital world has already become something they rely upon,” Livingstone added.

Kelly Mendoza encouraged audience members to start having conversations about privacy with kids at a young age and to move beyond the common but overly simple advice not to share personal information online. Common Sense’s Digital Citizenship Curriculum provides free lesson plans, organized by grade and topic, that address timely topics and prepare students to take ownership of their digital lives.

In her remarks, she also emphasized the important role that schools play in educating parents about privacy. “It’s important that schools and educators and parents work together because really we’re finding that schools can play a really powerful role in educating parents,” Mendoza said. “Schools need to do a better job of communicating – what tools are they using? How are they rated and reviewed? What are the privacy risks? And why are they using this technology?” A useful starting point for schools and parents is Common Sense’s Family Engagement Resources Toolkit, which includes tips, activities, and other resources.

Several panelists emphasized the critical role schools play in educating students about privacy. To do so effectively, schools must engage and educate teachers so that they are informed and equipped to have meaningful conversations about privacy with their students.

Anna Morgan described a model for engaging children in informing data protection policies through classroom-based lesson plans. Recognizing that the General Data Protection Regulation (GDPR) and data protection law are complex, the DPC gave teachers a quick-start guide with background knowledge, enabling them to engage in discussions with children about their data protection rights and entitlements.

Privacy can be a difficult concept to explain, and there’s nothing quite like a creative demonstration to bring privacy concerns to life. One example: the DPC created a fictitious app to solicit children’s reactions to the use of their personal data. Morgan shared that, in the DPC’s consultation, 60 percent of the children surveyed believed that their personal data should not be used to serve them with targeted advertising, finding it scary and creepy to have ads following them around. A full report from the consultation can be found here.

Daniel Solove also highlighted the need for educational systems to teach privacy. “Children today are growing up in a world where massive quantities of personal information are being gathered from them. They’re growing up in a world where they’re more under surveillance than any other generation. There’s more information about them online than any other generation. And the ability for them to put information online and get it out to the world is also unprecedented,” Solove noted. “So I think it’s very important that they learn about these things, and as a first step, they need to appreciate and understand the value of privacy and why it matters.”

One way for kids to learn about privacy is through storytelling. Solove recently authored a children’s book about privacy titled The Eyemonger and shared his motivations for writing it with the audience. “There really wasn’t anything out there that explained to children what privacy was, why we should care about it, or really any of the issues that are involved in this space, so that prompted me to try to do something about it.” He also compiled a list of resources to accompany the book and help educators and parents teach privacy to their children.

Building on the thread of creating outside-the-box interactive experiences to help kids understand privacy, Rob Girling shared with the audience a game called The Most Likely Machine, developed by Artefact Group to help preteens understand algorithms. Girling saw a need to teach algorithmic literacy given the impact algorithms have on children’s lives, from college and job applications to search engine results. For Girling, “It’s just starting to introduce the idea that underneath algorithms are human biases and data that is often biased. That’s the key learning we want kids to take away.”

Each of the panelists shared a number of terrific resources and recommendations for parents and educators, which we have listed and linked to below, along with a few of our own.

Watch the webinar in full here, and we hope you will use and share some of the excellent resources referenced below.

Rob’s Recommended Resources

Sonia’s Recommended Resources

Kelly’s Recommended Resources

Anna’s Recommended Resources

Dan’s Recommended Resources

Additional Future of Privacy Forum Resources of note:

CPDP2021 Event Recap: Bridging the Standards Gap

On January 27, 2021, the Institute of Electrical and Electronics Engineers (IEEE) hosted a panel at the 14th Annual International Computers, Privacy, and Data Protection Conference (CPDP2021). The theme for this year’s online conference was “Enforcing Rights in a Changing World.” Rob van Eijk, FPF Managing Director for Europe, moderated the panel “Technical Standards Bringing Together Data Protection with Telecommunications Regulation, Digital Regulations, and Procurement.”

The recording of this panel is available here.

The panelists discussed the role of technical standards in ensuring the systematic application of data protection principles across policy areas. Two examples are Article 25 GDPR and Recital 78, which stipulate data protection by design and by default. Another example is Article 21(5) of the GDPR, which stipulates that in the context of the use of information society services, and notwithstanding Directive 2002/58/EC, the data subject may exercise his or her right to object by automated means using technical specifications.

Technical standards seek to ensure that engineers can effectively apply privacy and data protection principles in the design and the (default) configuration of technology placed on the market. A key question for the panel discussion was therefore: how can the gap be bridged between releasing technical standards and embedding them in products and services available on the market?

Facilitating Dialogue and Collaboration Between Policymakers and Technologists

Paul Nemitz, Principal Advisor in the Directorate-General for Justice and Consumers at the European Commission, opened the discussion with a pointed observation: building a closer, collaborative relationship between engineers and policymakers is a critical step toward bridging the gap between data protection policies, technical standards, and practical application. He called on the technical intelligentsia to bring their knowledge to the policymaking process, stating that democracy needs the engagement of engineers. Paul also said that policymakers should hold out their hands and open their doors to bring in those who know the technology, and he identified convincing those with technical knowledge to engage in the formal lawmaking process as one of the challenges of our time.

He clarified that he was not claiming engineers know how to make policies better than policymakers. Instead, he defined specific roles for each group based on their areas of expertise and working languages. The technologists’ part is to inform and educate policymakers so that they can shape laws that are technology-neutral and do not have to be updated every two months. According to Paul, while engineers are adept at writing code, code is written for machines, which cannot think for themselves; the law, by contrast, is written for people, and policymakers are best equipped for that language. Paul also sees the relationship as a two-way street: technologists can bring their technical knowledge to the rulemaking process and then return to their own spaces with a better understanding of democracy.

Clara Neppel, Senior Director of IEEE European Business Operations, shared that all IEEE standardization bodies have a government engagement program where governments can inform the IEEE standardization organization about their policy priorities. That allows software engineers to take into account upcoming legislative measures.

Amelia Andersdotter, Dataskydd.net and Member of the European Parliament 2011-2014, stressed the importance of ensuring that standards, once set, are transparent and effectively communicated to everyone in the ecosystem, from procurement bodies to end users. For instance, when the public sector seeks to build a public WiFi network or a company designs a website, those procuring the products should be aware of the frameworks in place that protect human rights and data, as well as the right questions to ask of those further up the value chain (see also the FPF & Dataskydd.net Webinar – Privacy in High-Density Crowd Contexts). The IEEE P802E working group has been drafting Recommended Practices for Privacy Considerations for IEEE 802 Technologies (P802E), which contains recommendations and checklists for developers of IEEE 802 technologies.

Panelists left to right, top row: Rob van Eijk, Paul Nemitz, Clara Neppel; bottom row: Amelia Andersdotter, Mikuláš Peksa. Francesca Bria (President of the Italian National Innovation Fund) contributed to the preparation of the panel discussion.

Privacy versus utility

According to Clara, key challenges in implementing privacy include the lack of common definitions, the tradeoff between privacy and the functionality of products or services, and striking the right balance between the social values embedded in data governance and technical measures. Clara pointed out that privacy can mean different things to different parties. A user, for example, may understand privacy as meaning that no data is shared, that only anonymous data is shared, or that they can define different access levels for different types of data. Companies, on the other hand, may interpret privacy as mere compliance with relevant laws or as a true value proposition for their customers.

Amelia also identified the lack of specification of data protection principles or requirements in legislation as a source of confusion and inefficiency for engineers and companies. She cited the case of the European Radio Equipment and Telecommunications Terminal Equipment Directive (R&TTE), adopted in 1999, which includes data protection as an essential requirement for radio technologies. However, what this requirement actually means has never been specified, leaving companies unable to assess whether their technologies meet it. She argued that Europe is good at establishing both high-level values and low-level practical architectures, but could improve if regulators filled the gap and created a framework for assessing privacy.

Amelia also suggested that the diversity and flexibility of technical standards may encourage more experimentation and optimization as organizations choose which standards work best for their specific technologies and in their specific contexts.

Rob added that technologists need bounded concepts with definitions embedded in law, so they are clear and no longer open-ended. Then, the real work of figuring out how the technology should be designed to protect privacy can take place.

Rulemaking: Looking to the Future Rather Than in the Rearview Mirror

Mikuláš Peksa, Member of the European Parliament, expressed skepticism about the European Parliament micromanaging technical standards. Instead, he leaned towards Parliament conveying the basic values that protect human rights rather than shaping the standards themselves.

Mikuláš contended that politicians tend to talk about past problems and that, due to structural and procedural factors, politics always reacts with a certain delay. He stated that human society is in another Industrial Revolution in which all human activities are changing; society therefore needs to adapt to be able to absorb and digest the changes. He referred to the example of online spaces for conversation being monopolized by a single company, questioned whether this is the model society really wants, and suggested exploring a decentralized architecture for social networks, similar to that of email services, to protect against centralized control.

Paul explained that data protection suffers from the classic syndrome of an ‘invisible risk’: people are unable to see themselves as victims. Where people fail to see the ‘invisible risk’, as with atomic power, smoking, or data processing, politics arrives far too late. Looking forward, Mikuláš acknowledged that there is no single silver bullet to solve all of the problems; Europe therefore needs a political organization that incorporates the technical intelligentsia and finds a desirable general direction to drive towards.

Next steps

In a tour de table, the panelists presented their answers and next steps on how to bridge the gap between releasing technical standards and embedding them into products and services available on the market.

Clara remarked that policymakers may need to engage in technical considerations such as standard setting, as well as provide legal certainty by clarifying questions around data control and correction. New standards, certifications, and sound mechanisms may also be needed to address special needs in specific contexts. For instance, when it comes to synthetic data, which is estimated to make up 40 percent of all training data within two years, we lack metrics for measuring its quality in terms of privacy and accuracy.

Amelia argued that, in addition to the work done by consumer protection authorities, data protection authorities, and telecommunications regulators, one could also simply take the initiative within a company or a particular space; there are already industry efforts to make technologies more privacy-sensitive and robust. Rome wasn’t built in a day, and she does not think our infrastructures will be either, but small steps in the right direction will leave us in a better place in two or three years than we are now.

Mikuláš stated that there is no single silver bullet that will resolve all of the problems. He argued that we need a political organization that incorporates the technical intelligentsia and can steer the general direction of building this new information society; without it, we will not be able to introduce standards in a proper way.

In closing, Paul called upon people who deeply understand the intricacies of technology to engage with the democratic legislative process of law making. In this way, they may be able to not only bring technological knowledge to the rulemaking process but also incorporate a better understanding of the realities of democracy into code.

FPF contributions to the privacy debate at CPDP2021


This year, FPF contributed to seven panels. The FPF team participated in the following panels (in alphabetical order):

Dr. Carrie Klein (panelist), Ready for a crisis: accelerated digitalization in education (organized by VUB Data Protection on the Ground);
Dr. Gabriela Zanfir-Fortuna (moderator), US privacy law: the beginning of a new era (organized by FPF);
Dr. Gabriela Zanfir-Fortuna (panelist), Augmented compliance: the case of Algorithmic Impact Assessment (organized by EPIC);
Dr. Gabriela Zanfir-Fortuna (panelist), International Data Transfers: What shall we do to avoid Schrems III? (organized by NOYB);
Jasmine Park (moderator) & Amelia Vance (panelist), Panel on Global Youth Privacy: Amplifying youth needs and voices (organized by Privacy Salon);
Jules Polonetsky (moderator), EU Digital Strategy, a holistic vision for a digital Europe (organized by CPDP);
Dr. Rob van Eijk (moderator), Technical standards bringing together data protection with telecommunications regulation, digital regulations and procurement (organized by IEEE).

To learn more about FPF in Europe, please visit fpf.org/about/eu.

The European Commission Considers Amending the General Data Protection Regulation to Make the Digital Age of Consent Consistent

On June 24, 2020, the European Commission published a Communication on its mandated two-year evaluation of the General Data Protection Regulation (GDPR), in which it discusses as a future policy development “the possible harmonisation of the age of children consent in relation to information society services.” Notably, harmonizing the age of consent for children across the European Union is one of only two areas of the GDPR that the Commission is considering amending after further review of practice and case law. Currently, the GDPR allows individual Member States some flexibility in setting the national age of digital consent for children anywhere between 13 and 16. Upon the two-year review, however, the Commission expressed concern that the variation in ages across the EU creates uncertainty for information society services (any economic activities taking place online) and may hamper “cross-border business, innovation, in particular as regards new technological developments and cybersecurity solutions.”

“For the effective functioning of the internal market and to avoid unnecessary burden on companies, it is also essential that national legislation does not go beyond the margins set by the GDPR or introduces additional requirements when there is no margin,” stated the Commission in its report. Some believe stringent child privacy requirements can push companies to abandon the development of online services for children in order to avoid legal risks and technical burdens, leaving a void that may be filled by companies from countries with lax child privacy protections. In addition to the GDPR’s varying ages of digital consent, there are also differing interpretations of the obligations information society services owe to children. For example, the United Kingdom’s proposed Age Appropriate Design Code defines a child as a person under the age of 18 and lays out additional requirements for information society services to build in privacy by design to better protect children online.

Prior to the GDPR, European data protection law did not include special protections for children, instead providing the same privacy protections across all age groups. The GDPR recognized that children are particularly vulnerable to harm and exploitation online and included provisions extending a higher level of protection to them. However, there is no universal consensus on the age of a child, and the flexibility provided by the GDPR creates a fragmented landscape of parental consent ages across the EU. Complying with different ages of consent is relatively straightforward in the physical world, where activities are generally confined within national boundaries, but because online services operate across borders, the lack of consistency is a significant barrier for companies. Information society service providers must verify a user’s age and nationality and confirm the applicable age of digital consent in that Member State before allowing access to their services. This burden may put companies operating in the EU at a competitive disadvantage or lead to measures that deprive children and teens of the benefits of these services, as companies choose either to invest significant resources in age verification and parental consent mechanisms or to abandon the market for children and age gate their services instead.
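
To make the compliance burden concrete, below is a minimal, hypothetical sketch (not drawn from any implementation cited here) of the kind of per-Member-State check a provider must run at sign-up. The country codes, example ages, and function names are illustrative; the thresholds shown fall within the 13-to-16 range permitted by Article 8 GDPR and would need to be verified against current national law.

```python
# Illustrative sketch, not legal advice: gate sign-up on the Member State's
# age of digital consent under Article 8 GDPR.

DEFAULT_AGE_OF_CONSENT = 16  # GDPR default where a Member State sets no lower age

# Example thresholds within the 13-16 range the GDPR permits; verify against
# current national law before relying on them.
AGE_OF_DIGITAL_CONSENT = {
    "BE": 13,  # Belgium
    "FR": 15,  # France
    "DE": 16,  # Germany
    "IE": 16,  # Ireland
}

def requires_parental_consent(user_age: int, member_state: str) -> bool:
    """Return True if verifiable parental consent is needed before processing
    the child's personal data for this information society service."""
    threshold = AGE_OF_DIGITAL_CONSENT.get(member_state, DEFAULT_AGE_OF_CONSENT)
    return user_age < threshold

# The same 14-year-old is treated differently across borders: parental consent
# is required in France (threshold 15) but not in Belgium (threshold 13).
assert requires_parental_consent(14, "FR") is True
assert requires_parental_consent(14, "BE") is False
```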

The Commission also initiated a pilot project to create an infrastructure for implementing child rights and protection mechanisms online, which was scheduled to commence on January 1, 2021. The project aims to map existing age verification and parental consent mechanisms both in the EU and abroad and to assess the comprehensive mapping results in order to create “an interoperable infrastructure for child online protection including in particular age-verification and obtaining parental consent of users of video-sharing platforms or other online services.”

Currently, Member States require or recommend varying age verification and parental consent mechanisms. In addition to the UK’s Age Appropriate Design Code, the German youth protection law requires businesses to use scheduling restrictions so that content harmful to children is not available during the day when children are online; to use technical methods to keep children from accessing inappropriate content, such as sending adults a PIN after age verification; or to use age labels that youth protection software, downloaded by parents onto their children’s devices, can read. However, the efficacy of these methods is unclear and unproven. As such, a sweeping review of existing methods may reveal best practices to be widely adopted within the EU and serve as a model for other countries, including the United States.