Privacy Scholarship Research Reporter: Issue 4, December 2018 – GDPR in Focus
Notes from FPF
The General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) aims to guarantee strong protections for individuals regarding their personal data and applies to businesses that collect, use, or share consumer data, whether the information was obtained online or offline.
The GDPR went into effect on May 25, 2018 and represents the most comprehensive data protection reform in a generation. Its geographic scope extends far beyond the borders of Europe, and its material scope reaches across all industries – including online services, mobile, cloud, IoT, financial services, healthcare, and telecom.
In this issue are articles that highlight key issues raised by the GDPR: how does the Article 20 right to data portability address (or not address) privacy concerns about onward transfer of personal information? How might privacy risks raised by the internet of things be mitigated by a GDPR-compliant transparency model? How might the GDPR right to explanation be implemented in a flexible and practical manner? How do the GDPR and ePrivacy Directive intersect with the standard contractual terms used by many online services? How can children’s privacy rights be best supported by the GDPR? Is the right to “legibility” the appropriate way to interpret and apply GDPR’s rights to explanation? The papers highlighted in this issue engage with these questions and more.
As always, we would love to hear your feedback on this issue. You can email us at [email protected].
Data Portability and Data Control: Lessons for an Emerging Concept in EU Law
I. GRAEF, M. HUSOVEC
This article observes that while Article 20 of the GDPR introduces the right to data portability, it is agnostic about how ported data may be used once transferred. The authors note that, unlike other initiatives, the right to data portability does not create ownership-like control over the ported data. They also discuss how the right will be limited where it clashes with the intellectual property rights of current data holders (e.g. copyright, trade secrets). The authors argue that as other regimes try to replicate the right to data portability, they should consider the resulting control, its breadth, and its impact on incentives to innovate.
Authors’ Abstract
The right to data portability (‘RtDP’) introduced by Article 20 of the General Data Protection Regulation (‘GDPR’) is a first regulatory attempt to establish a general-purpose control mechanism of horizontal application which mainly aims to facilitate reuse of personal data held by private companies. Article 20 GDPR is agnostic about the type of use that follows from the ported data and its further diffusion. This contrasts with forms of portability facilitated under competition law which can only occur for purpose-specific goals with the aim of addressing anticompetitive behaviour. Unlike some upcoming initiatives, the RtDP still cannot be said to create ownership-like control over ported data. Even more, this regulatory innovation will be limited in its aspirations where intellectual property rights of current data holders, such as copyright, trade secrets and sui generis database rights, cause the two regimes to clash. In such cases, a reconciliation of the interests might confine particularly the follow-on use of ported data again to a specific set of socially justifiable purposes, possibly with schemes of fair remuneration. We argue that to the extent that other regimes will try to replicate the RtDP, they should closely consider the nature of the resulting control, its breadth and its impact on incentives to innovate. In any case, the creation of data portability regimes should not become an end in itself. With an increasing number of instruments, orchestrating the consistency of legal regimes within the Digital Single Market and their mutual interplay should become an equally important concern.
“Data Portability and Data Control: Lessons for an Emerging Concept in EU Law” by I. Graef, M. Husovec, TILEC Discussion Paper No. 2017-041; Tilburg Law School Research Paper No. 2017/22.
GDPR and the Internet of Things: Guidelines to Protect Users’ Identity and Privacy
S. WACHTER
This paper presents a three-step transparency model based on known privacy risks of the IoT, the GDPR’s governing principles, and weaknesses in its relevant provisions. To help IoT developers and data controllers, the author proposes eleven ethical guidelines on how information about the functionality of the IoT should be shared with users above the GDPR’s legally binding requirements. Two case studies demonstrate how the guidelines apply in practice: IoT in public spaces and connected cities, and connected cars.
Author’s Abstract
The Internet of Things (IoT) requires pervasive collection and linkage of user data to provide personalised experiences based on potentially invasive inferences. Consistent identification of users and devices is necessary for this functionality, which poses risks to user privacy. The forthcoming General Data Protection Regulation (GDPR) contains numerous provisions relevant to these risks, which may nonetheless be insufficient to ensure a fair balance between users’ and developers’ interests. A three-step transparency model is described based on known privacy risks of the IoT, the GDPR’s governing principles, and weaknesses in its relevant provisions. Eleven ethical guidelines are proposed for IoT developers and data controllers on how information about the functionality of the IoT should be shared with users above the GDPR’s legally binding requirements. Two use cases demonstrate how the guidelines apply in practice: IoT in public spaces and connected cities, and connected cars.
“GDPR and the Internet of Things: Guidelines to Protect Users’ Identity and Privacy” by S. Wachter (February 5, 2018).
Meaningful Information and the Right to Explanation
A. D. SELBST, J. POWLES
The authors believe the discourse about the right to explanation has, thus far, gone in an unproductive direction, with fierce disagreement over whether the GDPR’s provisions on automated decision-making create a data subject’s ‘right to explanation’. This article attempts to reorient that debate by showing that the plain text of the GDPR supports such a right. The authors argue that the right to explanation should be interpreted functionally and flexibly, and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law. To make their point, they offer a critique of the two most prominent papers in the debate.
Authors’ Abstract
There is no single, neat statutory provision labelled the ‘right to explanation’ in Europe’s new General Data Protection Regulation (GDPR). But nor is such a right illusory.
Responding to two prominent papers that, in turn, conjure and critique the right to explanation in the context of automated decision-making, we advocate a return to the text of the GDPR.
Articles 13–15 provide rights to ‘meaningful information about the logic involved’ in automated decisions. This is a right to explanation, whether one uses the phrase or not.
The right to explanation should be interpreted functionally, flexibly, and should, at a minimum, enable a data subject to exercise his or her rights under the GDPR and human rights law.
“Meaningful Information and the Right to Explanation” by A. D. Selbst, J. Powles, International Data Privacy Law, Volume 7, Issue 4, 1 November 2017, Pages 233–242.
Pre-Formulated Declarations of Data Subject Consent – Citizen-Consumer Empowerment and the Alignment of Data, Consumer and Competition Law Protections
D. CLIFFORD, I. GRAEF, AND P. VALCKE
This article examines how the respective data protection and privacy, consumer protection, and competition law policy agendas align, looking through the lens of pre-formulated declarations of consent whereby data subjects agree to the processing of their personal data by accepting standard terms. The authors delineate the role of each area with reference to the GDPR and ePrivacy Directive, the Unfair Terms Directive, the Consumer Rights Directive, and the proposed Digital Content Directive, in addition to market dominance. The paper discusses the complicated issue of the economic value of personal data and tries to interpret the effects of the cross-reference in Recital 42 GDPR to the Unfair Terms Directive.
Authors’ Abstract
The purpose of this article is to examine the alignment of the respective data protection and privacy, consumer protection and competition law policy agendas through the lens of pre-formulated declarations of consent whereby data subjects agree to the processing of their personal data by accepting standard terms. The article aims to delineate the role of each area with specific reference to the GDPR and ePrivacy Directive, the Unfair Terms Directive, the Consumer Rights Directive and the proposed Digital Content Directive in addition to market dominance. Competition law analysis is explored vis-à-vis whether it could offer indicators of when ‘a clear imbalance’ in controller-data subject relations may occur in the context of the requirement for consent to be ‘freely given’ as per its definition in the GDPR. This complements the data protection and consumer protection analysis which focuses on the specific reference to the Unfair Terms Directive in Recital 42 GDPR stating that pre-formulated declarations of consent should not contain unfair terms. Attention is paid to various interpretative difficulties stemming from this alignment between the two instruments. In essence, this debate circles the thorny issue of the economic value of personal data and thus tries to navigate the interpretation minefield left behind by the cross-reference.
“Pre-Formulated Declarations of Data Subject Consent – Citizen-Consumer Empowerment and the Alignment of Data, Consumer and Competition Law Protections” by D. Clifford, I. Graef, and P. Valcke (February 20, 2018), CiTiP Working Paper 33/2017.
The Importance of Privacy by Design and Data Protection Impact Assessments in Strengthening Protection of Children’s Personal Data Under the GDPR
S. VAN DER HOF, E. LIEVENS
Authors’ Abstract
This paper explores to what extent the current illusion of autonomy and control by data subjects, including children and parents, based on consent can potentially be mitigated, or even reversed, by putting more emphasis on other tools of protection and empowerment in the GDPR and their opportunities for children. Suggestions are put forward as to how the adoption of such tools may enhance children’s rights and how they could be put into practice by DPAs and data controllers.
“The Importance of Privacy by Design and Data Protection Impact Assessments in Strengthening Protection of Children’s Personal Data Under the GDPR” by S. van der Hof, E. Lievens, Communications Law 2018, Vol. 23, No. 1.
Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation
G. MALGIERI, G. COMANDÉ
This paper analyzes the GDPR’s “right to explanation.” The authors make a clear distinction between different levels of information and of consumers’ awareness, and propose a new concept – algorithmic “legibility” – focused on combining transparency and comprehensibility.
The authors argue that a systemic interpretation of Articles 13–15 and 22 GDPR is necessary in this field, and recommend a “legibility test” that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in automated decision-making.
Authors’ Abstract
The aim of this contribution is to analyse the real borderlines of the ‘right to explanation’ in the GDPR and to discretely distinguish between different levels of information and of consumers’ awareness in the ‘black box’ society. In order to combine transparency and comprehensibility we propose the new concept of algorithm ‘legibility’.
We argue that a systemic interpretation is needed in this field, since it can be beneficial not only for individuals but also for businesses. This may be an opportunity for auditing algorithms and correcting unknown machine biases, thus similarly enhancing the quality of decision-making outputs.
Accordingly, we show how a systemic interpretation of Articles 13–15 and 22 GDPR is necessary, considering in particular that: the threshold of minimum human intervention required so that the decision-making is ‘solely’ automated (Article 22(1)) can also include nominal human intervention; the envisaged ‘significant effects’ on individuals (Article 22(1)) can encompass as well marketing manipulation, price discrimination, etc; ‘meaningful information’ that should be provided to data subjects about the logic, significance and consequences of decision-making (Article 15(1)(h)) should be read as ‘legibility’ of ‘architecture’ and ‘implementation’ of algorithmic processing; trade secret protection might limit the right of access of data subjects, but there is a general legal favour for data protection rights that should reduce the impact of trade secrets protection.
In addition, we recommend a ‘legibility test’ that data controllers should perform in order to comply with the duty to provide meaningful information about the logic involved in an automated decision-making.
“Why a Right to Legibility of Automated Decision-Making Exists in the General Data Protection Regulation” by G. Malgieri, G. Comandé, International Data Privacy Law, Volume 7, Issue 4, 1 November 2017, Pages 243–265.