This Year's Must-Read Privacy Papers: The Future of Privacy Forum Announces Recipients of Annual Privacy Papers for Policymakers Award

FOR IMMEDIATE RELEASE

December 17, 2018

Contact: Jeremy Greenberg, Georgetown Policy Fellow, [email protected]

Nat Wood, [email protected], 410-507-7898

This Year’s Must-Read Privacy Papers: The Future of Privacy Forum Announces Recipients of Annual Privacy Papers for Policymakers Award

Washington, DC – Today, the Future of Privacy Forum announced the winners of the 9th Annual Privacy Papers for Policymakers Award. The PPPM Award recognizes leading privacy scholarship that is relevant to policymakers in the U.S. Congress, at U.S. federal agencies, and at data protection authorities abroad. The winners of the 2018 PPPM Award are:

From many nominated privacy-related papers published in the last year, these five were selected by a diverse team of academics, advocates, and industry privacy professionals from FPF’s Advisory Board. These papers demonstrate a thoughtful analysis of emerging issues and propose new means of analysis that can lead to real-world policy impact, making them “must-read” privacy scholarship for policymakers.

Two papers were selected for Honorable Mention: Regulating Bot Speech, by Madeline Lamo, United States Court of Federal Claims, and Ryan Calo, University of Washington School of Law; and The Intuitive Appeal of Explainable Machines, by Andrew D. Selbst, Yale Information Society Project, and Solon Barocas, Cornell University.

For the third year in a row, FPF also granted a Student Paper Award. For this award, student work must meet guidelines similar to those set for the general Call for Nominations. The Student Paper Award is presented to Diffusion of User Tracking Data in the Online Advertising Ecosystem, by Muhammad Ahmad Bashir and Christo Wilson, Northeastern University.

“Academic scholarship can serve as a valuable resource for policymakers considering potential privacy legislation,” said Jules Polonetsky, FPF’s CEO. “Now more than ever, topics such as artificial intelligence, algorithmic discrimination, connected cars, and transatlantic data flows are at the forefront of the privacy debate. These papers are ‘must-reads’ for any thoughtful legislator or government executive who wants to make an impact in this rapidly evolving space.”

The winning authors have been invited to join FPF and Honorary Co-Hosts Senator Edward J. Markey and Congresswoman Diana DeGette to present their work at the U.S. Senate with policymakers, academics, and industry privacy professionals. This annual event will be held on February 6, 2019. FPF will subsequently publish a printed digest of summaries of the winning papers for distribution to policymakers, privacy professionals, and the public.

The PPPM event is free, open to the general public, and widely attended. To RSVP, please visit privacypapersforpolicymakers.eventbrite.com. This event is supported by a National Science Foundation grant. Any opinions, findings and conclusions or recommendations expressed in these papers are those of the authors and do not necessarily reflect the views of the National Science Foundation.

###

The Future of Privacy Forum (FPF) is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.

Jules Polonetsky interviewed on C-SPAN's Washington Journal

FPF CEO Jules Polonetsky was interviewed on C-SPAN’s Washington Journal on Friday. He discussed the need for federal privacy legislation, internet companies’ data collection practices, and the Federal Trade Commission’s authority to stop deceptive practices, among other topics. Watch the appearance here.

The HIPAA Privacy Rule 15 Years Later: What’s Next?

On December 4th, FPF, Intel, and Duke in DC hosted “The HIPAA Privacy Rule 15 Years Later: What’s Next?” The event brought together stakeholders from across the health data ecosystem to explore current challenges related to the Health Insurance Portability and Accountability Act (HIPAA) Privacy Rule. Specifically, the discussion focused on solutions that would mitigate restrictions on data sharing in clinical care and research caused by administrative burdens, while at the same time maintaining the privacy of protected health information (PHI).

The event followed the release of a Department of Health and Human Services (HHS) request for public comment regarding potential revisions to the HIPAA Privacy Rule. HHS seeks the public’s views on how the rules could be updated to encourage coordinated care and case management among hospitals, physicians, payors, and patients. The agency is also asking stakeholders to identify regulatory burdens that may impede value-based health care without providing commensurate privacy or security protections for PHI. HHS has the authority to modify HIPAA privacy standards, and experts expect the agency’s request for comment to be the first step in a comprehensive reassessment and revision of health privacy rules. Comments are due February 11, 2019.

Health privacy experts highlighted several issues during the panels, including the current administrative burdens that the notice of privacy practices and accounting of disclosures requirements place on covered entities; the benefits of HIPAA privacy boards; and the opportunity to align the Common Rule with the HIPAA Privacy Rule.

Below, we describe the panel discussion topics in further detail:

Panel Discussion 1: Reducing Burdens and Enhancing Care

The HIPAA Privacy Rule was developed to safeguard the privacy of personal health information while improving the quality of patient healthcare. The rule came into effect in 2003 and was last substantially amended in 2013 by the Omnibus Rule. Some believe HIPAA imposes burdens that hamper the coordination and delivery of care and the transition to value-based care. Technologies like the internet of things, electronic health records, and cloud services are transforming how care is delivered. Health technologies that fall outside the scope of HIPAA – such as mHealth apps and wearables – are increasingly used by patients. These developments put pressure on the balance struck by the U.S. health privacy regime. Panelists discussed several challenges related to HIPAA and clinical care, arguing that:

Panel Discussion 2: Enabling Research and Maintaining Privacy

Over the course of a lifetime, the average person generates more than 1 million gigabytes of health-related data. New data types – such as real-world evidence (RWE) and big data – are expanding beyond the traditional healthcare setting and beyond HIPAA, and are being used for healthcare purposes. Researchers are also developing novel techniques – including AI, machine learning, and big data analytics – that were not anticipated when the HIPAA Privacy Rule was written. These advancements are prompting stakeholders to reconsider whether the status quo under the HIPAA Privacy Rule and the Common Rule is sufficient to protect privacy, address the evolving health data ecosystem, and harness the benefits of health data for patients and society. Panelists discussed challenges at the intersection of HIPAA and medical research, observing that:

Panelists also discussed how HIPAA might be addressed by any comprehensive federal privacy legislation, and whether exemption from such a law would be the right path forward.

Full recordings from the event are below.


Videos


Welcome and Opening Remarks

Panel Discussion: Reducing Burdens and Enhancing Care

Panel Discussion: Enabling Research and Maintaining Privacy

Closing Statements

Full house at IAPP Brussels interested in Deciphering Legitimate Interests. Download our LI Report here!

The session that the Future of Privacy Forum organized for the IAPP Europe Congress in Brussels on November 28, Deciphering “legitimate interests”: actual enforcement cases and tested solutions, generated great interest among privacy professionals. We had a full house – more than 500 participants, according to the IAPP. The panel was based on a Report published earlier this year by FPF and Nymity.

The discussion was moderated by Eduardo Ustaran (co-director of the global Privacy and Cybersecurity practice of Hogan Lovells), while the panelists were Joelle Jouret (Legal Officer, European Data Protection Board) together with the co-authors of the FPF-Nymity Report on Legitimate Interests, Gabriela Zanfir-Fortuna (Policy Counsel, FPF) and Teresa Troester-Falk (Chief Global Strategy Director, Nymity).

Because relying on legitimate interests as a lawful ground for processing under the GDPR always requires a case-by-case analysis, participants said they particularly appreciated the discussion of concrete cases decided by the Court of Justice of the EU; the summary of several Member State cases involving decisions of both Data Protection Authorities and national courts; and the specific advice on how to operationalize the Legitimate Interest Assessment, based on prior analysis of multiple cases decided by DPAs and courts.

You can download the Report, which contains summaries of approximately 40 cases where processing on the basis of legitimate interests was at issue, by following this LINK.

Make sure you SUBSCRIBE to our Newsletter to be the first to access our future reports and information about our public events!

FPF Partner in algoaware Project Releases State of the Art Report

State of the Art Report:

After reviewing the literature and consulting a variety of experts, algoaware has released the first public version of the State of the Art Report, open for peer review. The report includes a comprehensive explanation of the key concepts of algorithmic decision-making, a summary of the academic debate and its most pressing issues, and an overview of the most recent and relevant initiatives and policy actions of civil society and of national and international governing bodies. The peer review is conducted via four different engagement channels:

Background:

The algoaware study was procured by the European Commission to support its analysis of the opportunities and challenges emerging where algorithmic decisions have a significant bearing on citizens, particularly where they produce societal or economic effects which need public attention.

The study is carried out by Optimity Advisors and follows a call from the European Parliament for a pilot project supporting algorithmic awareness building. FPF is a partner in this project.

The objectives of the study include:

The study follows a policy design methodology resting on the analysis of scientific evidence as well as robust stakeholder engagement. It aims to engage a range of stakeholders across diverse sectors to map the areas of interest where algorithmic operations bear significant policy implications. To keep up to date with the debate, and for project updates, sign up for the algoaware newsletter here.

New Guide Compares Privacy Laws in EU and California

FOR IMMEDIATE RELEASE

December 7, 2018

Contact: Nat Wood, [email protected], 410-507-7898

New Guide Compares Privacy Laws in EU and California

Guide and December 13 Webinar from FPF and DataGuidance Explore GDPR & CCPA, Potential Federal Privacy Law

Washington, DC – The Future of Privacy Forum and DataGuidance have released a new report, Comparing privacy laws: GDPR v. CCPA, which analyzes and contrasts the European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act of 2018 (CCPA). The GDPR, which became effective in the EU on May 25, 2018, and the CCPA, scheduled to go into effect January 1, 2020, both aim to protect individuals’ personal data and apply to businesses that collect, use, or share that data, online or off.

“There is a growing consensus about the need for comprehensive federal privacy legislation in the U.S.,” said Jules Polonetsky, CEO of the Future of Privacy Forum. “Policymakers will appreciate the insights and comparisons in this report even though the scope and approach for a federal privacy law will rightly differ from those of the GDPR or the CCPA.”

The report details how the two laws differ in significant ways, including their scope of applicability, the extent of collection limitations, and rules concerning accountability. However, they are similar in certain definitions, the establishment of additional protections for people under age 16, and the inclusion of rights to access personal information, among other provisions.

“Given the size and influence of the EU and California, their privacy rules will each have a global effect,” said David Longford, CEO of DataGuidance. “Organizations around the world will find the guide helpful in understanding and complying with the GDPR and the CCPA.”

The guide compares the two pieces of legislation based on their scope, key definitions, legal basis, the rights they provide, and their approach to enforcement. Each topic includes relevant articles and sections from the two laws, a summary of the comparison, and a detailed analysis of the similarities and differences between the GDPR and the CCPA.

On December 13, 2018, at 10:00 a.m. EST, FPF and DataGuidance will host a GDPR v. CCPA webinar to compare the two laws and discuss future developments in California and at the federal level. Speakers will include FPF Policy Counsels Stacey Gray and Gabriela Zanfir-Fortuna, DataGuidance CEO David Longford, and DataGuidance Global Privacy Director Alexis Kateifides. Those interested in participating in the webinar may register at https://register.gotowebinar.com/register/6490805661630991885.

FPF has long supported a comprehensive federal consumer privacy law, believing that both businesses and consumers will gain from one clear standard that provides necessary protections for consumers and certainty and guidance for industry. FPF recommends that such a law address issues of interoperability with existing federal sectoral laws and global privacy frameworks while avoiding conflicts with existing requirements in order to promote beneficial cross-border data flows.

###

The Future of Privacy Forum is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more at www.fpf.org.

[Webinar] GDPR vs. CCPA: An In-Depth Comparative Analysis (Thurs., Dec. 13, at 10:00 AM ET)

The Future of Privacy Forum (FPF) and DataGuidance have released a new Comparison Guide, GDPR vs. CCPA, which provides an in-depth analysis of the EU General Data Protection Regulation (GDPR) and the California Consumer Privacy Act of 2018 (CCPA). The Guide highlights the degree of similarity between the GDPR and the CCPA across five key provisions and offers a detailed analysis of their similarities and differences.

Please join the webinar on Thursday, December 13, 2018, at 10:00 AM (EST) to meet the experts from FPF and DataGuidance, who will provide an overview of the Guide’s key findings on the similarities and variances between the two laws.

Speakers:

REGISTER HERE

Calls for Regulation on Facial Recognition Technology

Today, Microsoft’s Brad Smith released a call for action regarding the design, implementation, and use of facial recognition systems. Addressing both commercial and government contexts, he set forth a clear path toward ethical uses of facial recognition by ensuring that privacy and discrimination concerns are addressed upfront. Likewise, AI Now, at New York University, issued a report about the potential concerns and joined the growing demand for policies and regulatory action around this technology.

FPF published Privacy Principles for Facial Recognition Technology in Consumer Applications in September 2018. These Principles define a benchmark of privacy requirements for commercial situations in which technology collects, creates, and maintains a facial template that can be used to identify a specific person – enabling beneficial applications and services while providing the necessary protections for individuals.

We set out seven core privacy principles that address the concerns surrounding personally identifiable information (PII) collected by these systems. They include:

In particular, we call for a baseline of express consent upon enrollment in a facial recognition database for verification or identification purposes. We believe these Principles can be used by companies and regulators as a resource for the development, refinement, and implementation of facial recognition technology in commercial settings.

We also released an associated graphic, Understanding Facial Detection, Characterization, and Recognition Technologies, as an educational reference that summarizes the key distinctions between facial scanning technologies. Relating each technology to its common use cases, benefits, concerns, and risk of identifiability, we outline the minimum recommended notice and consent requirements and the operator’s responsibilities.

We look forward to working with Microsoft, others in industry, and policymakers to “create policies, processes, and tools” to make responsible use of Facial Recognition technology a reality.

Privacy War Games Participants Stayed a Step Ahead of the Competition


Privacy leaders from 60 companies gathered at Cisco headquarters in San Jose, CA on November 12th for the inaugural Privacy War Games, a new training and preparedness program launched by FPF and The Providence Group. The war games split participants into five teams to practice strategic decision-making in a fast-paced environment that presented challenges many companies encounter in their everyday practice. The experience will help participants better manage future privacy risk – an increasingly complex task made more difficult by the growing number of state and sectoral privacy laws, evolving regulatory and compliance requirements, and the regulatory and legal ambiguity of the European General Data Protection Regulation (GDPR).

In light of a rapidly changing legal and regulatory environment, privacy risk management has grown increasingly complex for even the most advanced companies. The war games exercise forced our participants to explore privacy scenarios from different perspectives by adopting roles on the game teams that did not necessarily comport with their current jobs. By role-playing as the Federal Trade Commission, European Union regulators, state legislators, and two fictional companies, participants gained a deeper (and sometimes counter-intuitive) understanding of privacy challenges and anticipated how each team’s moves would affect the scenario as a whole.

The Privacy War Games team encouraged a commitment to authenticity throughout the exercise. Players withheld information, made decisions with limited information, dealt with unreasonable partners, and managed stressful interactions with media and regulators. Referees were assigned to each team to answer questions about the rules and the options available to the teams at various points in the game.

The exercise unfolded in two rounds. During the lunch break, referees and a facilitator processed each team’s round-one decisions. After lunch, the teams learned the consequences of their decisions and proceeded to make their round-two decisions based on additional facts.

In a final debriefing, the control group facilitated a discussion, asking participants what they found surprising and what they learned. Answers ranged from insights gained about the scope of regulators’ authority to lessons learned about controlling the amount of information individuals in their own company should receive when there is a privacy incident. Several participants commented on how important it is to consider who needs to be at the table when a decision gets made. Companies need to have constructive conversations with a diverse team – even when departments have competing priorities.

Participants completed a postgame survey before leaving. Their feedback indicated that the event was well received, and it provided suggestions for making the next war games event even better. Participants especially appreciated adopting the perspective of unfamiliar actors. In the days since the event, FPF has already received inquiries about when the next Privacy War Games will take place.

According to a recent survey by PricewaterhouseCoopers, only one-third of business leaders worldwide feel confident that their organization is prepared to meet recent and emerging requirements for cybersecurity, data privacy, and data-use governance. The 60 companies that participated in our war games are now a step ahead of the competition thanks to the valuable experiences and best practices they acquired from this exercise.

We look forward to conducting more war games in the future. To learn about bringing the Privacy War Games to your company, contact [email protected].

Privacy Scholarship Research Reporter: Issue 4, December 2018 – GDPR in Focus

Notes from FPF

The General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) aims to guarantee strong protections for individuals regarding their personal data and applies to businesses that collect, use, or share consumer data, whether the information is obtained online or offline.

The GDPR went into effect on May 25, 2018, and represents the most comprehensive data protection reform in a generation. Its geographic scope extends far beyond the borders of Europe, and its material scope reaches across all industries – including online services, mobile, cloud, IoT, financial services, healthcare, and telecom.

The articles in this issue highlight key questions raised by the GDPR: How does the Article 20 right to data portability address (or not address) privacy concerns about onward transfer of personal information? How might privacy risks raised by the internet of things be mitigated by a GDPR-compliant transparency model? How might the GDPR right to explanation be implemented in a flexible and practical manner? How do the GDPR and ePrivacy Directive intersect with the standard contractual terms used by many online services? How can children’s privacy rights be best supported by the GDPR? Is the right to “legibility” the appropriate way to interpret and apply the GDPR’s rights to explanation? The papers highlighted in this issue engage with these questions and more.

As always, we would love to hear your feedback on this issue. You can email us at [email protected].


Data Portability and Data Control: Lessons for an Emerging Concept in EU Law

I. GRAEF, M. HUSOVEC

This article observes that while Article 20 of the GDPR introduces the right to data portability, it is agnostic about how data can be used once transferred. The authors state that, unlike other initiatives, the right to data portability does not create ownership control over the ported data. The article discusses how the right will be limited where it clashes with the intellectual property rights of current data holders (e.g., copyright and trade secrets). The authors argue that as other regimes try to replicate the right to data portability, they should consider the resulting control, its breadth, and its impact on incentives to innovate.

Authors’ Abstract

The right to data portability (‘RtDP’) introduced by Article 20 of the General Data Protection Regulation (‘GDPR’) is a first regulatory attempt to establish a general-purpose control mechanism of horizontal application which mainly aims to facilitate reuse of personal data held by private companies. Article 20 GDPR is agnostic about the type of use that follows from the ported data and its further diffusion. This contrasts with forms of portability facilitated under competition law which can only occur for purpose-specific goals with the aim of addressing anticompetitive behaviour. Unlike some upcoming initiatives, the RtDP still cannot be said to create ownership-like control over ported data. Even more, this regulatory innovation will be limited in its aspirations where intellectual property rights of current data holders, such as copyright, trade secrets and sui generis database rights, cause the two regimes to clash. In such cases, a reconciliation of the interests might confine particularly the follow-on use of ported data again to a specific set of socially justifiable purposes, possibly with schemes of fair remuneration. We argue that to the extent that other regimes will try to replicate the RtDP, they should closely consider the nature of the resulting control, its breadth and its impact on incentives to innovate. In any case, the creation of data portability regimes should not become an end in itself. With an increasing number of instruments, orchestrating the consistency of legal regimes within the Digital Single Market and their mutual interplay should become an equally important concern.

“Data Portability and Data Control: Lessons for an Emerging Concept in EU Law” by I. Graef and M. Husovec, TILEC Discussion Paper No. 2017-041; Tilburg Law School Research Paper No. 2017/22.


GDPR and the Internet of Things: Guidelines to Protect Users’ Identity and Privacy

S. WACHTER

This paper presents a three-step transparency model based on known privacy risks of the IoT, the GDPR’s governing principles, and weaknesses in its relevant provisions. To help IoT developers and data controllers, the author proposes eleven ethical guidelines focused on how information about the functionality of the IoT should be shared with users above the GDPR’s legally binding requirements. Two case studies demonstrate how the guidelines apply in practice: IoT in public spaces and connected cities, and connected cars.

Authors’ Abstract

The Internet of Things (IoT) requires pervasive collection and linkage of user data to provide personalised experiences based on potentially invasive inferences. Consistent identification of users and devices is necessary for this functionality, which poses risks to user privacy. The forthcoming General Data Protection Regulation (GDPR) contains numerous provisions relevant to these risks, which may nonetheless be insufficient to ensure a fair balance between users’ and developers’ interests. A three-step transparency model is described based on known privacy risks of the IoT, the GDPR’s governing principles, and weaknesses in its relevant provisions. Eleven ethical guidelines are proposed for IoT developers and data controllers on how information about the functionality of the IoT should be shared with users above the GDPR’s legally binding requirements. Two use cases demonstrate how the guidelines apply in practice: IoT in public spaces and connected cities, and connected cars.

“GDPR and the Internet of Things: Guidelines to Protect Users’ Identity and Privacy” by S. Wachter (February 5, 2018).


Meaningful Information and the Right to Explanation

A. D. SELBST, J. POWLES

The authors believe the discourse about the right to explanation has, thus far, gone in an unproductive direction, with fierce disagreement over whether the GDPR’s provisions create a data subject’s ‘right to explanation’. This article attempts to reorient that debate by showing that the plain text of the GDPR supports such a right. The authors believe the right to explanation should be interpreted functionally and flexibly and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law. To make their point, they offer a critique of the two most prominent papers in the debate.

Authors’ Abstract

There is no single, neat statutory provision labelled the right to explanation in Europe’s new General Data Protection Regulation (GDPR). But nor is such a right illusory.

Responding to two prominent papers that, in turn, conjure and critique the right to explanation in the context of automated decision-making, we advocate a return to the text of the GDPR.

Articles 13–15 provide rights to meaningful information about the logic involved in automated decisions. This is a right to explanation, whether one uses the phrase or not.

The right to explanation should be interpreted functionally, flexibly, and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law.

“Meaningful Information and the Right to Explanation” by A. D. Selbst and J. Powles, International Data Privacy Law, Volume 7, Issue 4, 1 November 2017, Pages 233–242.


Pre-Formulated Declarations of Data Subject Consent – Citizen-Consumer Empowerment and the Alignment of Data, Consumer and Competition Law Protections

D. CLIFFORD, I. GRAEF, AND P. VALCKE

This article examines how the respective data protection and privacy, consumer protection, and competition law policy agendas are aligned, looking through the lens of pre-formulated declarations of consent whereby data subjects agree to the processing of their personal data by accepting standard terms. The authors delineate the role of each area with specific reference to the GDPR and ePrivacy Directive, the Unfair Terms Directive, the Consumer Rights Directive, and the proposed Digital Content Directive, in addition to market dominance. The paper discusses the complicated issue of the economic value of personal data and tries to interpret the effects of this cross-reference.

Authors’ Abstract

The purpose of this article is to examine the alignment of the respective data protection and privacy, consumer protection and competition law policy agendas through the lens of pre-formulated declarations of consent whereby data subjects agree to the processing of their personal data by accepting standard terms. The article aims to delineate the role of each area with specific reference to the GDPR and ePrivacy Directive, the Unfair Terms Directive, the Consumer Rights Directive and the proposed Digital Content Directive in addition to market dominance. Competition law analysis is explored vis-à-vis whether it could offer indicators of when ‘a clear imbalance’ in controller-data subject relations may occur in the context of the requirement for consent to be ‘freely given’ as per its definition in the GDPR. This complements the data protection and consumer protection analysis which focuses on the specific reference to the Unfair Terms Directive in Recital 42 GDPR stating that pre-formulated declarations of consent should not contain unfair terms. Attention is paid to various interpretative difficulties stemming from this alignment between the two instruments. In essence, this debate circles the thorny issue of the economic value of personal data and thus tries to navigate the interpretation minefield left behind by the cross-reference.

“Pre-Formulated Declarations of Data Subject Consent – Citizen-Consumer Empowerment and the Alignment of Data, Consumer and Competition Law Protections” by D. Clifford, I. Graef, and P. Valcke (February 20, 2018), CiTiP Working Paper 33/2017.


The Importance of Privacy by Design and Data Protection Impact Assessments in Strengthening Protection of Children’s Personal Data Under the GDPR

S. VAN DER HOF, E. LIEVENS

Authors’ Abstract

This paper explores to what extent the current illusion of autonomy and control by data subjects, including children and parents, based on consent can potentially be mitigated, or even reversed, by putting more emphasis on other tools of protection and empowerment in the GDPR and their opportunities for children. Suggestions are put forward as to how the adoption of such tools may enhance children’s rights and how they could be put into practice by DPAs and data controllers.

“The Importance of Privacy by Design and Data Protection Impact Assessments in Strengthening Protection of Children’s Personal Data Under the GDPR” by S. van der Hof and E. Lievens, Communications Law 2018, Vol. 23, No. 1.


Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation

G. MALGIERI, G. COMANDÉ

This paper analyzes the GDPR’s “right to explanation.” The authors draw a clear distinction between different levels of information and of consumers’ awareness, and they propose a new concept, algorithmic “legibility,” focused on combining transparency and comprehensibility.

The authors argue that a systemic interpretation of Articles 13–15 and 22 GDPR is necessary and recommend a “legibility test” that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in automated decision-making.

Authors’ Abstract

The aim of this contribution is to analyse the real borderlines of the ‘right to explanation’ in the GDPR and to discretely distinguish between different levels of information and of consumers’ awareness in the ‘black box’ society. In order to combine transparency and comprehensibility we propose the new concept of algorithm ‘legibility’.

We argue that a systemic interpretation is needed in this field, since it can be beneficial not only for individuals but also for businesses. This may be an opportunity for auditing algorithms and correcting unknown machine biases, thus similarly enhancing the quality of decision-making outputs.

Accordingly, we show how a systemic interpretation of Articles 13–15 and 22 GDPR is necessary, considering in particular that: the threshold of minimum human intervention required so that the decision-making is ‘solely’ automated (Article 22(1)) can also include nominal human intervention; the envisaged ‘significant effects’ on individuals (Article 22(1)) can encompass as well marketing manipulation, price discrimination, etc; ‘meaningful information’ that should be provided to data subjects about the logic, significance and consequences of decision-making (Article 15(1)(h)) should be read as ‘legibility’ of ‘architecture’ and ‘implementation’ of algorithmic processing; trade secret protection might limit the right of access of data subjects, but there is a general legal favour for data protection rights that should reduce the impact of trade secrets protection.

In addition, we recommend a ‘legibility test’ that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in an automated decision-making.

“Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation” by G. Malgieri and G. Comandé, International Data Privacy Law, Volume 7, Issue 4, 1 November 2017, Pages 243–265.