FPF Applauds Department of Commerce For Safe Harbor Website Revision

The Department of Commerce has long listed companies’ participation in the US-EU Safe Harbor program in the Safe Harbor List. Within that list, a significant number of companies are marked with the designation “not current.” As FPF wrote in its paper discussing the Safe Harbor, a company can be listed as “not current” for a number of reasons: it may have failed to fill out required yearly paperwork, chosen to use other approved data transfer mechanisms, merged with another company, ceased data transfers with the EU, or shut down altogether. However, critics of the Safe Harbor say many companies claim to be members while in fact not adhering to the Safe Harbor agreement.

FPF noted that a company’s obligations under the Safe Harbor do not end even if it is listed as not current: the company remains responsible for adhering to the Safe Harbor Principles with respect to all the data it transferred while enjoying the benefits of Safe Harbor membership. When the European Commission recommended that “[t]he Department of Commerce should clearly indicate on its website all companies which are not current members,” FPF agreed and suggested that the Department of Commerce should also include on its website an explanation of why a company may be listed as “not current” in order to clear up any potential confusion.

FPF is pleased that the Department of Commerce’s Safe Harbor website was updated in late 2013 with a new notice that makes clear that companies may be listed as non-current for a number of reasons, but are nonetheless subject to FTC enforcement for claiming to be members without adhering to the Safe Harbor Principles. The new notice reads:

“Notice: An organization may be designated as “Not Current” for a variety of reasons. The most common reason is that the organization has failed to reaffirm its adherence to the Safe Harbor Privacy Principles on an annual basis as required by the Safe Harbor Frameworks. Another possible reason is that the organization has failed to comply with one or more of the Safe Harbor Privacy Principles. Organizations designated as “Not Current” are no longer assured of the benefits of the Safe Harbor (i.e., the presumption of “adequacy”). These organizations nevertheless must continue to apply the Safe Harbor Privacy Principles to the personal data received during the period in which they were assured of the benefits of the Safe Harbor for as long as they store, use or disclose those data. Any misrepresentation by an organization designated as “Not Current” concerning its adherence to the Safe Harbor Privacy Principles may be actionable by the Federal Trade Commission or other relevant government body.”

FPF applauds the Department of Commerce for these revisions. We will continue to monitor developments relating to the US-EU Safe Harbor Agreement as they arise.

FPF In the News: a Big Week for Panels and Privacy

The first week of March brought with it a number of great privacy-related events, some run by the IAPP and some hosted by others (including FPF itself!). Below are links to the many events FPF participated in.

Privacy Papers for Policy Makers Launch Event

In conjunction with Congresswoman Sheila Jackson Lee, FPF presented our fourth annual “Privacy Papers for Policy Makers” this Wednesday. Featured at the event was Professor Kenneth Bamberger from Berkeley, who discussed his paper with Deirdre Mulligan, Privacy in Europe: Initial Data on Governance Choices and Corporate Practice. Professor Neil Richards discussed his paper on why data privacy law is (mostly) constitutional, while Adam Thierer presented A Framework for Benefit-Cost Analysis in Digital Privacy Debates.


@Microsoft Conversation on Privacy: “Privacy Models: The Next Evolution”

FPF Executive Director Jules Polonetsky moderated a panel for a lunch conversation between leading experts, who discussed future privacy principles and frameworks that focus on data use and associated risks. The panelists discussed the ways society can protect the privacy of individuals while providing for responsible, beneficial data use.

IAPP: FTC Privacy and Data Security Jurisprudence

FPF Co-Chair Chris Wolf participated on a panel moderated by FPF Senior Fellow Omer Tene on the FTC’s developing “common law of privacy,” which serves as an invaluable reference and guidance tool for corporate data managers not only in the U.S. but also globally. The IAPP Westin Research Center has embarked on a project to collate, index, annotate and make available to policymakers and practitioners a “Comprehensive Casebook of FTC Privacy and Information Security Law.” In this session, Chris, Omer and others discussed the findings of the project and initial conclusions with senior FTC staff.

IAPP: Ed Tech, Data and Student Privacy

FPF Executive Director Jules Polonetsky, FPF Board Members Larry Magid and Andy Bloom, and Kathleen Styles, Chief Privacy Officer of the U.S. Department of Education, participated in a panel about the impact of new education technologies on student privacy. New technologies and data are being used for a variety of services in schools, from administrative uses such as managing class schedules, buses and registration, to educational tools such as remote learning and personalized curricula. Data is increasingly being shared with third parties, and apps and tablets are increasingly essential to the learning environment. The confluence of enhanced data collection with highly sensitive information about children and teens creates privacy risks: FPF has recently organized a working group of companies and other experts to work on crafting solutions to this hot issue. Contact FPF to get involved.


IAPP: Governmental Access to Private-sector Data: The Realities and Impacts in the U.S. and EU

FPF Co-Chair Chris Wolf moderated a panel discussing government access to private-sector data: what actually is being exposed in the U.S. and in the EU; what checks and oversight exists in the various jurisdictions; what those holding the data and those whose data is held can do to address privacy and free expression concerns; and what impact the publicity over national security access is having on public policy and international relations.

IAPP: From 0–60: Privacy and the New Generation of Connected Cars

FPF Policy Director Josh Harris moderated a panel on the many new developments in the world of connected cars. The panel explored the new technologies and their risks for the privacy of individuals, and demonstrated best practices and solutions for ensuring compliance and transparency within the connected automobile environment. Access Josh’s presentation here.

IAPP: Judge, Jury and Executioner: Are Federal Courts Giving Privacy Class Actions a Fair Chance?

FPF Co-Chair Chris Wolf moderated a panel describing the struggles facing class action plaintiffs in the privacy field. The panel, which brought together some of the leading plaintiff and defendant attorneys in the country, discussed legal theories of harm and standing, proof of causation, commonality and the likelihood that a plaintiff’s injury will be redressed by a favorable decision.

IAPP: Eraser Buttons, the Right to Delete and the Rise of Tech Solutions for Ephemeral Data

FPF Executive Director Jules Polonetsky moderated a panel on California’s new “eraser button” law, which requires certain websites to allow minors to remove embarrassing postings. The panel covered similar legislative efforts in the U.S. and E.U., as well as the growing trend in consumer technology for “ephemeral” messaging services such as Frankly, SnapChat, and Whisper.

White House/MIT Big Data Privacy Workshop Recap

Speaking for everyone snowed in in DC, White House Counselor John Podesta remarked that “big snow trumped big data” as he opened, by phone, the first of the Obama Administration’s three big data and privacy workshops. This first workshop focused on advancing the “state of the art” in technology and practice. While these workshops are ultimately a product of Edward Snowden’s NSA leaks last year, Mr. Podesta explained that his big data review group was conducting a broad review on a “somewhat separate track” from an ongoing review of the intelligence community. His remarks focused on several specific examples of the social value of data, but he cautioned that “we need to be conscious of the implications for individuals. How should we think about individuals’ sense of their identity when data reveals things they didn’t even know about themselves?”

To that end, he noted that “we can’t wait to get privacy perfect to get going.” Because the workshop was designed to focus on the technology around data, he hoped it would help inform the Administration about the current state of data privacy.

Cynthia Dwork, from Microsoft Research, followed Mr. Podesta with a deep dive into differential privacy. In plain English, as she put it, differential privacy works to ensure that the outcome of any analysis is equally likely whether or not an individual joins a database. The goal is to limit the range of potential harms to any individual from participating in data analysis. The challenge posed by big data is that multiple uses of data create a cumulative harm to privacy, which is difficult to measure. Overly accurate estimates of too much information are “blatantly non-private,” Dwork argued.
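Dwork’s guarantee can be made concrete with the standard Laplace mechanism for a counting query. The sketch below is our own illustration (the function name and parameters are ours, not code from the workshop): noise is added so that the answer’s distribution barely changes whether or not any one person is in the data.

```python
import random

def dp_count(values, predicate, epsilon):
    """Count items matching predicate, plus Laplace(1/epsilon) noise.

    Adding or removing any one person changes the true count by at most 1
    (sensitivity 1), so noise with scale 1/epsilon yields epsilon-differential
    privacy: the output is almost equally likely whether or not any single
    individual joins the dataset.
    """
    true_count = sum(1 for v in values if predicate(v))
    # A Laplace sample with scale 1/epsilon is the difference of two
    # exponential samples with rate epsilon.
    noise = random.expovariate(epsilon) - random.expovariate(epsilon)
    return true_count + noise
```

Smaller values of epsilon mean more noise and stronger privacy; the cumulative harm Dwork described corresponds to the epsilons of repeated queries adding up.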

While Dwork focused on new technologies to advance privacy, a slate of MIT professors presented brief examples of how big data is providing big social benefits in health care, transportation, and education.

When the floor was opened for questions, a skeptic in the crowd noted that one of the biggest drivers of data collection is not social benefit, but making money. Mr. Agarwal suggested that this was the very reason edX was a non-profit: in order for its use of sensitive data “to be judged by different criteria than maximizing return on investment.”

Secretary of Commerce Penny Pritzker suggested that harnessing the potential of data would  hinge upon user trust. Highlighting Commerce’s efforts to advance multistakeholder codes of conduct and ensure the efficacy of the U.S.-EU Safe Harbor, Ms. Pritzker suggested government needed to continually evaluate and work with companies to uncover the technologies and practices that promote trust. She expressed hope that efforts like the day’s workshop could help to show that confidence placed in American companies should remain “rock solid.”

The program’s afternoon shifted to a broad discussion of privacy enhancing technologies (PETs), specifically developments in differential privacy, encryption, and accountability systems. There was a recognition that with any computer system, compromises in security and privacy are inevitable: complex software will have bugs, many different people will need access to infrastructure, and interesting data processing will require the use of confidential information or PII.

Danny Weitzner called for a better definition of privacy for computer designers and engineers to build toward. Alan Westin’s original definition, that privacy is the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent their information can be communicated to others, has “led us astray,” Prof. Weitzner argued. Multiple substantive definitions of privacy had come up in discussion throughout the day, he noted, and we “need a way to know what’s going on” in order to “allow data for some purposes, but won’t be misused for others.”

Quoting Judge Reggie B. Walton about the challenges facing the FISA Court, Weitzner noted that “we don’t currently have the tools in everyday systems to assess how information is used.” Weitzner discussed his work on information accountability.

Weitzner then led a large hypothetical discussion in which MIT in the near future “embrace[s] the potential of data-powered, analytics-driven systems in all aspects of campus life, from education to health care to community sustainability.” Weitzner asked a slate of panelists what they would do as the future chief privacy officer of MIT, and Intel’s David Hoffman suggested that we all need to understand “that a lot of the data about us that’s now out there is not coming from us.” As a result, meaningful transparency must mean more than notice to individuals. Panelists then covered a wide gamut of issues, from the ethical challenges around predictive analysis to the need to get serious about addressing questions about use, teeing up the Administration’s next workshop on the ethics of big data.

Privacy Papers on Capitol Hill — March 5

In conjunction with Congresswoman Sheila Jackson Lee, FPF will be presenting our fourth annual “Privacy Papers for Policy Makers” next Wednesday, March 5th. The event will be held in Rayburn House Office Building Room 2103 from 8:30 to 9:45 AM; coffee and breakfast will be provided. The event is sold out.

Featured at the event will be Professors Kenneth Bamberger and Deirdre Mulligan from Berkeley, who will be discussing their paper Privacy in Europe: Initial Data on Governance Choices and Corporate Practice.  Professor Neil Richards will discuss his paper on why data privacy law is (mostly) constitutional, while Adam Thierer will present A Framework for Benefit-Cost Analysis in Digital Privacy Debates.

FPF is also pleased to have Jacob Kohnstamm, Chairman, Dutch Data Protection Authority, join us to provide reaction.  Additionally, special guests Giovanni Buttarelli, Asst. European Data Protection Supervisor; Christopher Graham, UK Information Commissioner; Isabelle Falque-Pierrotin, CNIL (France); and María Elena Pérez-Jaén Zermeño, IFAI (Mexico), will be attending.

This event is intended to comply with applicable Congressional and Executive branch gift rules. Contact us with questions.

The full “Privacy Papers” digest is available to download, as well.

Aislelabs named Privacy by Design Ambassador

We were excited to learn that Aislelabs, a member of FPF’s Mobile Location Analytics privacy working group, has been named a Privacy by Design Ambassador by the Information and Privacy Commissioner of Ontario. Like fellow PbD Ambassador Euclid Analytics, Aislelabs has signed on to our Mobile Location Analytics (MLA) Code of Conduct, which ensures that consumers are provided with transparency and choice as to whether MLA companies may collect their information. As the launch date of our central opt-out site fast approaches, we’re glad to see member companies being recognized for their commitment to consumer privacy in this space.

(via PRWeb)

Jules, Omer and Chris Discuss the Challenges of Big Data and Consumer Review Boards

FPF’s Co-Chair and Executive Director Jules Polonetsky, Senior Fellow Omer Tene, and Co-Chair Christopher Wolf discussed the challenges facing President Obama with respect to big data in a new post for the IAPP.  The post argues that balancing the benefits of data analytics against attendant risks to civil liberties presents the biggest public policy challenge of our time.

FPF is currently developing a toolkit designed to help privacy professionals perform a comprehensive, rigorous cost-benefit analysis to determine how best to pursue their big data goals. Companies should have a clear framework for evaluating how a data-driven project will affect their consumers. Recent articles, ranging from “people analytics” used to guide hiring practices to a school’s tracking of its students, provide further examples making it clear that a “practical application of fair information principles that accounts for modern day realities of collection and use” has become increasingly necessary.

One potential path forward involves the creation of “Consumer Subject Review Boards,” an idea discussed by Ryan Calo.* Such review boards would assess and evaluate big data projects’ rewards and associated risks. They would play an instrumental role in revitalizing consumer trust and mitigating some of the risks associated with innovative uses of consumer data. However, there are still questions that must be answered before such review boards could be deployed in practice.

First, what type of issues would a review board address? Would it be focused on addressing only privacy dilemmas, or would it seek to anticipate other ethical issues? As Evan Selinger and Patrick Lin write: “A technology ethics board . . . can be an invaluable canary in the coalmine—scouting for explosive issues in advance of emerging technology and before the law eventually turns its attention to these new problems and the company itself.” Clearly, a review board would need to have a clear understanding of its proper subject-matter scope.

Another question is whether the review boards should be in-house or independent. An in-house review board would benefit from close familiarity with its company’s data practices, but would lack the credibility of an independent entity. Similarly, should the opinions of a review board be confidential or publicly available? Confidential opinions do not instill as much consumer trust; however, public opinions risk being watered down to mitigate future litigation risks, and could potentially chill valuable innovation.  Some companies have privacy advisory boards already – is a Consumer Review Board a more formal example of a privacy board?  What methodology will the members use?

We look forward to continuing to explore this promising idea.

 



* See also Malcolm Crompton’s suggestion of ethics boards for privacy issues.

Peter Swire: Why Tech Companies and the NSA Diverge on Snowden

FPF Senior Fellow Peter Swire has an op-ed in today’s Washington Post that discusses how tech companies and the intelligence community are grappling with the traitor-or-whistleblower debate when it comes to Edward Snowden.  His conclusion suggests the debate provokes a much broader set of issues:

Fundamentally, the traitor-or-whistleblower debate comes down to different views of what values should be paramount in governing the Internet we all use. The Internet is where surveillance happens to keep our nation safe. It is also where we engage in e-commerce and express ourselves in infinite ways. The goal is to create one communications structure that safeguards diverse, important values.

Essays on Big Data and Privacy

Solutions to many pressing economic and societal challenges lie in better understanding data. New tools for analyzing disparate information sets, collectively called Big Data, have revolutionized our ability to find signals amid the noise. Big Data techniques hold promise for breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Yet privacy advocates are concerned that the same advances will upend the power relationships between government, business, and individuals, and lead to prosecutorial abuse, racial or other profiling, discrimination, redlining, overcriminalization, and other restrictions on freedom.

On Tuesday, September 10th, 2013, the Future of Privacy Forum joined with the Center for Internet and Society at Stanford Law School to present a full-day workshop on questions surrounding Big Data and privacy.  The event was preceded by a call for papers discussing the legal, technological, social, and policy implications of Big Data. A selection of papers was published in a special issue of the Stanford Law Review Online and others were presented at the workshop. This volume collects these papers and others in a single collection.

These essays address the following questions: Does Big Data present new challenges or is it simply the latest incarnation of the data regulation debate? Does Big Data create fundamentally novel opportunities that civil liberties concerns need to accommodate? Can de-identification sufficiently minimize privacy risks? What roles should fundamental data privacy concepts such as consent, context, and data minimization play in a Big Data world? What lessons can be applied from other fields?

We hope the following papers will foster more discussion about the benefits and challenges presented by Big Data—and help bring together the value of data and privacy, as well.

Download the complete PDF.

PBS NewsHour: Jules Polonetsky Talks Big Data and Privacy

Last night, Jules Polonetsky was featured on a segment on PBS NewsHour discussing, “What’s the future of privacy in a big data world?” He was joined by Adam Thierer, senior research fellow at the Mercatus Center at George Mason University. The video and the transcript can be found here.

Comments to the FCC About "Anonymization" and "Deidentification"

Yesterday, the Federal Communications Commission posted FPF’s comments about “anonymization” and “deidentification.”  The comments come in response to a request from Public Knowledge that the FCC clarify whether “anonymized” or “deidentified” but non-aggregate call records constitute individually identifiable “customer proprietary network information” under Section 222 of the Communications Act.

FPF submitted comments to address the argument that all anonymized records must be considered “personally identifiable” because there have been instances in which some publicly available, anonymized records have been reidentified. Public Knowledge argues that because researchers have been able to reidentify some publicly disclosed data sets that were purged of personally identifying information, all datasets purged of personally identifying information must necessarily be considered individually identifiable. FPF responded:

Logically, this argument is flawed. It is analogous to the argument that because some locks have been broken, there is no such thing as a reasonably secure door.

Although reidentification may be possible in some specific circumstances, when proper anonymization practices are used, anonymization is a valuable and effective way to advance the goal of protecting individual privacy while allowing for beneficial uses of data.  The full set of comments is available to read.
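To make concrete what proper anonymization practices can look like, here is a toy sketch of one common technique, generalizing quasi-identifiers in a call record before release. This is our own illustration with hypothetical field names, not code or practices drawn from FPF’s comments:

```python
def generalize_call_record(record):
    """Toy anonymization step: coarsen quasi-identifiers in a call record.

    Field names are hypothetical. Truncating the phone number to its area
    code, bucketing the timestamp into a broad time of day, and rounding the
    duration make any one released record consistent with many individuals,
    which raises the cost of reidentification.
    """
    return {
        "area_code": record["number"][:3],
        "time_of_day": "day" if 6 <= record["hour"] < 18 else "night",
        "duration_minutes": round(record["duration_minutes"] / 5) * 5,  # 5-minute buckets
    }
```

Generalization alone is not a guarantee against reidentification, which is why it is typically layered with access controls and contractual limits on data use.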