For the tenth year, FPF’s annual Privacy Papers for Policymakers program is presenting award-winning research representing a diversity of perspectives to lawmakers and regulators. Among the papers to be honored at an event at the Hart Senate Office Building on February 6, 2020, are two papers broadly addressing the impact of algorithms on transparency and fairness: Antidiscriminatory Privacy by Ignacio N. Cofone and Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations by Margot E. Kaminski and Gianclaudio Malgieri. Cofone assesses how privacy rules can both facilitate and protect against discriminatory behavior, while Kaminski and Malgieri discuss how impact assessments serve to link the individual and systemic regulatory subsystems within the European General Data Protection Regulation (GDPR).
Law often blocks the flow of sensitive personal information to prevent discrimination. In Antidiscriminatory Privacy, Ignacio Cofone, assistant professor at McGill University Faculty of Law, presents a framework for reducing discrimination against minorities. To build this framework, Cofone explored two case studies that “illustrate when rules that regulate the flow of personal information (privacy rules) are compatible with antidiscrimination efforts and when they are not.” Through an analysis of anonymous orchestra auditions and the “Ban the Box” initiative, Cofone reveals how blocking an information flow can succeed at combating discrimination in some cases but not in others. Cofone states that “privacy can protect against discrimination as well as enable a discriminatory dynamic.” He notes that certain data points may serve as proxies for categories that the law aims to protect, arguing that information about such proxies must also be blocked when employing privacy rules to fight discrimination. In the case of the “Ban the Box” initiative, for example, when employers were prohibited from asking about an applicant’s criminal history, they were more likely to discriminate against black applicants they thought might have criminal histories. Cofone found that in the “Ban the Box” case, applicants’ race became a proxy for criminal history, fostering discriminatory behavior. Cofone’s analysis offers a framework for determining the effectiveness of antidiscrimination measures based on information restrictions, including questions to consider when identifying proxies for protected information.
In Algorithmic Impact Assessments under the GDPR: Producing Multi-layered Explanations, authors Margot Kaminski of Colorado Law School and Gianclaudio Malgieri of Vrije Universiteit Brussel propose that impact assessments should link the GDPR’s dual methods of regulating algorithmic decision-making by providing systemic governance and also safeguarding individual privacy rights. The authors state that, in the context of decision-making algorithms, the GDPR’s existing Data Protection Impact Assessment (DPIA) should serve as an Algorithmic Impact Assessment that addresses problems of algorithmic discrimination, bias, and unfairness. Beyond serving as a tool in the GDPR’s systemic governance regime, the authors state that the DPIA should serve as an element of the GDPR’s protection of individual rights, connecting the two regulatory subsystems that underlie the GDPR. The way that the DPIA links these two subsystems within the GDPR, the authors note, mandates the creation of “multi-layered explanations” for algorithmic decision-making that are targeted to audiences ranging from oversight bodies and auditors to individuals. Privacy professionals will benefit from the authors’ suggestions for improving Algorithmic Impact Assessments in the GDPR context, which call for expanding the right to explanation to include a “whole web of explanations…of differing degrees of breadth, depth, and technological complexity.”
The Privacy Papers for Policymakers project’s goal is to put diverse academic perspectives in front of policymakers to inform the development of privacy legislation. You can view all of this year’s award-winning papers on the FPF website.