The winners of the 2018 PPPM Award are:
by Danielle Keats Citron, Morton & Sophia Macht Professor of Law, University of Maryland Carey School of Law
Those who wish to control, expose, and damage the identities of individuals routinely do so by invading their privacy. People are secretly recorded in bedrooms and public bathrooms, and “up their skirts.” They are coerced into sharing nude photographs and filming sex acts under the threat of public disclosure of their nude images. People’s nude images are posted online without permission. Machine-learning technology is used to create digitally manipulated “deep fake” sex videos that swap people’s faces into pornography.
At the heart of these abuses is an invasion of sexual privacy—the behaviors and expectations that manage access to, and information about, the human body; intimate activities; and personal choices about the body and intimate information. More often, women, nonwhites, sexual minorities, and minors shoulder the abuse.
Sexual privacy is a distinct privacy interest that warrants recognition and protection. It serves as a cornerstone for sexual autonomy and consent. It is foundational to intimacy. Its denial results in the subordination of marginalized communities. Traditional privacy law’s efficacy, however, is eroding just as digital technologies magnify the scale and scope of the harm. This Article suggests an approach to sexual privacy that focuses on law and markets. Law should provide federal and state penalties for privacy invaders, remove the statutory immunity from liability for certain content platforms, and work in tandem with hate crime laws. Market efforts should be pursued if they enhance the overall privacy interests of all involved.
by Lilian Edwards, Professor of Law, Innovation and Society, Newcastle Law School; and Michael Veale, Researcher, Department of Science, Technology, Engineering & Public Policy at University College London
by Ira Rubinstein, Senior Fellow, Information Law Institute of the New York University School of Law
by Ari Ezra Waldman, Professor of Law and Founding Director, Innovation Center for Law and Technology at New York Law School
In Privacy on the Ground, the law and information scholars Kenneth Bamberger and Deirdre Mulligan showed that empowered chief privacy officers (CPOs) are pushing their companies to take consumer privacy seriously by integrating privacy into the designs of new technologies. Their work was just the beginning of a larger research agenda. CPOs may set policies at the top, but they alone cannot embed robust privacy norms into the corporate ethos, practice, and routine. As such, if we want the mobile apps, websites, robots, and smart devices we use to respect our privacy, we need to institutionalize privacy throughout the corporations that make them. In particular, privacy must be a priority among those actually doing the work of design on the ground—namely, engineers, computer programmers, and other technologists.
This Article presents the initial findings from an ethnographic study of how, if at all, those designing technology products think about privacy, integrate privacy into their work, and consider user needs in the design process. It also looks at how attorneys at private firms draft privacy notices for their clients and interact with designers. Based on these findings, this Article suggests that Bamberger and Mulligan's narrative is not yet fully realized. The account that emerges among some engineers and lawyers, in which privacy is narrow, limited, and barely factors into design, may help explain why so many products seem to ignore our privacy expectations. The Article then proposes a framework for understanding how factors both exogenous (theory and law) and endogenous (corporate structure and individual cognitive frames and experience) to the corporation prevent the CPOs' robust privacy norms from diffusing throughout technology companies and the industry as a whole. This framework also helps suggest how specific reforms at every level—theory, law, organization, and individual experience—can incentivize companies to take privacy seriously, enhance organizational learning, and eliminate the cognitive biases that lead to discrimination in design.
The 2018 PPPM Honorable Mentions are:
- Regulating Bot Speech, by Madeline Lamo, Law Clerk, United States Court of Federal Claims; and Ryan Calo, Lane Powell and D. Wayne Gittinger Associate Professor of Law, University of Washington School of Law
We live in a world of artificial speakers with real impact. Bots foment political strife, skew online discourse, and manipulate the marketplace. In response to concerns about the unique threats bots pose, legislators have begun to pass laws that require online bots to clearly indicate that they are not human. This work is the first to consider how such efforts to regulate bots might raise concerns about free speech and privacy.
While requiring a bot to self-disclose does not censor speech as such, it may nonetheless infringe upon the right to speak – including the right to speak anonymously – in the digital sphere. Specifically, complexities in the enforcement process threaten to unmask anonymous speakers, and requiring self-disclosure creates a scaffolding for censorship by private actors and other governments.
Ultimately, bots represent a diverse and emerging medium of speech. Their use for mischief should not overshadow their novel capacity to inform, entertain, and critique. We conclude by providing policymakers with a series of principles to bear in mind when regulating bots, so as not to inadvertently curtail an emerging form of expression or compromise anonymous speech.
- The Intuitive Appeal of Explainable Machines, by Andrew D. Selbst, Postdoctoral Scholar, Data & Society Research Institute; and Solon Barocas, Assistant Professor, Department of Information Science at Cornell University
Algorithmic decision-making has become synonymous with inexplicable decision-making, but what makes algorithms so difficult to explain? This Article examines what sets machine learning apart from other ways of developing rules for decision-making and the problem these properties pose for explanation. We show that machine learning models can be both inscrutable and nonintuitive and that these are related, but distinct, properties.
Calls for explanation have treated these problems as one and the same, but disentangling the two reveals that they demand very different responses. Dealing with inscrutability requires providing a sensible description of the rules; addressing nonintuitiveness requires providing a satisfying explanation for why the rules are what they are. Existing laws like the Fair Credit Reporting Act (FCRA), the Equal Credit Opportunity Act (ECOA), and the General Data Protection Regulation (GDPR), as well as techniques within machine learning, are focused almost entirely on the problem of inscrutability. While such techniques could allow a machine learning system to comply with existing law, doing so may not help if the goal is to assess whether the basis for decision-making is normatively defensible.
In most cases, intuition serves as the unacknowledged bridge between a descriptive account and a normative evaluation. But because machine learning is often valued for its ability to uncover statistical relationships that defy intuition, relying on intuition is not a satisfying approach. This Article thus argues for other mechanisms for normative evaluation. To know why the rules are what they are, one must seek explanations of the process behind a model’s development, not just explanations of the model itself.
The 2018 PPPM Student Paper Winner Is:
- Diffusion of User Tracking Data in the Online Advertising Ecosystem, by Muhammad Ahmad Bashir, PhD candidate, College of Computer and Information Science at Northeastern University; and Christo Wilson, Associate Professor, College of Computer and Information Science at Northeastern University
Two trends are currently reshaping the online display advertising industry. First, the amount and precision of the data that Advertising and Analytics (A&A) companies collect about users as they browse the web is increasing. Second, a transition is underway from “ad networks” to “ad exchanges,” where advertisers bid on “impressions” (empty advertising slots on websites) being sold in Real Time Bidding (RTB) auctions. The rise of RTB has forced A&A companies to collaborate with one another in order to exchange data about users and facilitate bidding on impressions.
These trends have fundamental implications for users’ online privacy. It is no longer sufficient to view each A&A company, and the data it collects, in isolation. Instead, when a given user is observed by a single A&A company, that observation may be shared, in real time, with hundreds of other A&A companies within RTB auctions.
To understand the impact of RTB on users’ privacy, we propose a new model of the online advertising ecosystem called an Interaction Graph. This graph captures the business relationships between A&A companies, and allows us to model how tracking data is shared between companies. Using our Interaction Graph model, we simulate browsing behavior to understand how much of a typical web user’s browsing history can be tracked by A&A companies. We find that 52 A&A companies are each able to observe 91% of an average user’s browsing history, under modest assumptions about data sharing in RTB auctions. 636 A&A companies are able to observe at least 50% of an average user’s browsing history. Even under very strict simulation assumptions, the top 10 A&A companies still observe 89-99% of an average user’s browsing history.
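The graph-plus-simulation idea can be illustrated with a minimal sketch. This is not the authors' code or data: the company names, sharing edges, and three-page browsing history below are all hypothetical, and real RTB sharing is far more conditional than the unconditional "share with every partner" rule assumed here.

```python
# Toy "interaction graph": directed edges from a company to the
# partners it re-shares impression data with (e.g. via RTB bid requests).
# All names and edges are hypothetical.
INTERACTION_GRAPH = {
    "exchange1": {"dsp1", "dsp2", "tracker1"},
    "dsp1": {"tracker2"},
    "dsp2": set(),
    "tracker1": set(),
    "tracker2": set(),
}

def observers_of_visit(embedded_trackers, graph):
    """Return every company that transitively learns of one page visit.

    A visit is first seen by the trackers embedded on the page; each
    observer then re-shares it with all of its partners in the graph.
    """
    seen = set()
    frontier = list(embedded_trackers)
    while frontier:
        company = frontier.pop()
        if company in seen:
            continue
        seen.add(company)
        frontier.extend(graph.get(company, ()))
    return seen

def coverage(browsing_history, graph):
    """Fraction of the browsing history each company ends up observing."""
    counts = {}
    for page_trackers in browsing_history:
        for company in observers_of_visit(page_trackers, graph):
            counts[company] = counts.get(company, 0) + 1
    return {c: k / len(browsing_history) for c, k in counts.items()}

# A three-page history; each entry lists the trackers embedded on that page.
history = [{"exchange1"}, {"dsp2"}, {"exchange1", "tracker2"}]
print(coverage(history, INTERACTION_GRAPH)["dsp2"])  # 1.0
```

Even in this tiny example, dsp2 observes the full history despite being embedded on only one page, because exchange1's broadcasts reach it on the other two visits; the paper's 91%-coverage figures arise from the same transitive effect at scale.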
Additionally, we investigate the effectiveness of several tracker-blocking strategies, including those implemented by popular privacy-enhancing browser extensions. We find that AdBlock Plus (the world’s most popular ad-blocking browser extension) is ineffective at protecting users’ privacy because major ad exchanges are whitelisted under the Acceptable Ads program. In contrast, Disconnect blocks the most information flows to A&A companies of the extensions we evaluated. However, even with strong blocking, major A&A companies still observe 40-80% of an average user’s browsing history.
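In graph terms, a blocklist can be modeled as deleting the blocked companies' nodes before running the diffusion simulation. The self-contained sketch below makes that idea concrete; again, the graph and names are hypothetical and the sharing rule is deliberately simplistic.

```python
# Hypothetical interaction graph: seller -> partners it shares data with.
GRAPH = {
    "exchange1": {"dsp1", "dsp2"},
    "dsp1": set(),
    "dsp2": {"tracker1"},
    "tracker1": set(),
}

def apply_blocklist(graph, blocked):
    """Model a tracker blocker: drop blocked nodes and edges to them."""
    return {
        company: {p for p in partners if p not in blocked}
        for company, partners in graph.items()
        if company not in blocked
    }

def observers(embedded, graph):
    """Companies that transitively learn of a visit (blocked ones never do)."""
    seen = set()
    frontier = [t for t in embedded if t in graph]
    while frontier:
        company = frontier.pop()
        if company not in seen:
            seen.add(company)
            frontier.extend(graph[company])
    return seen

blocked_graph = apply_blocklist(GRAPH, blocked={"exchange1"})
# With exchange1 blocked, dsp2 still sees pages it is embedded on
# directly, but no longer receives exchange1's RTB broadcasts.
print(sorted(observers({"dsp2"}, blocked_graph)))  # ['dsp2', 'tracker1']
```

This also shows why blocking only some nodes leaves substantial coverage: any unblocked company a user still contacts directly, plus everyone downstream of it, keeps observing visits.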
Read more about the winners in the Future of Privacy Forum’s December 17, 2018 Press Release on the Annual Privacy Papers for Policymakers Award.