European Commission's Safe Harbor Report Released

This morning, the European Commission released its long-awaited report evaluating the US-EU Safe Harbor. The Commission proposed a series of recommendations “to restore trust in data flows between the EU and the U.S.”

The Future of Privacy Forum is currently preparing an in-depth report on the Safe Harbor that will address the concerns presented by the European Commission.  However, FPF believes the European Commission’s criticism is largely misplaced.  Christopher Wolf, FPF Co-Chair, suggests that the Commission’s analysis “does not reveal any significant deficiencies . . . It is clear that the main area of concern is national security access to data, which is not what the Safe Harbor was intended to or can address.”


FPF Responds to European Commission Report on US-EU Safe Harbor and Finds Criticisms Misplaced

For immediate release, November 27, 2013

Washington, D.C. November 27, 2013 – The Future of Privacy Forum (FPF), a think tank focused on advancing personal privacy, responded to the European Commission’s statements on the US-EU Safe Harbor program, including a threat that the EU will terminate the agreement in mid-2014 unless certain conditions are met.  FPF cautioned against the precipitous termination of a program that has demonstrable benefits for the protection of EU personal data transferred from the EU to the US.  FPF is preparing an in-depth report on the Safe Harbor that will address the concerns presented by the European Commission.

Christopher Wolf, Founder and Co-Chair of FPF said:  “Regrettably, officials in the EU have conflated the issues raised by the recent NSA revelations with the issue of whether the Safe Harbor provides effective protection for the personal data of EU citizens.  The issue of national security access to personal data needs to be addressed separately, for example in government talks focused on that issue, not through threats to terminate a demonstrably effective framework for protecting privacy in commercial cross-border flows of personal data.  On first reading, the European Commission’s analysis of the operation of the Safe Harbor does not reveal any significant deficiencies, certainly not enough to call for termination of the cross-border data transfer arrangement.  It is clear that the main area of concern is national security access to data, which is not what the Safe Harbor was intended to or can address.”

Jules Polonetsky, Director and Co-Chair of FPF added:  “The Safe Harbor doesn’t protect against law enforcement or intelligence access, but it provides EU citizens with significant protections against consumer privacy abuses which the Commission risks undermining.  A forthcoming report by the Future of Privacy Forum will examine the efficacy of the Safe Harbor, including what it would mean for the privacy of European citizen data if the Safe Harbor were to be terminated.”

FPF recommends that rather than suspending the Safe Harbor, European and American policymakers should work together to strengthen the program and continue to address questions of national security in a separate context.

To schedule an interview with Christopher Wolf or Jules Polonetsky, email: [email protected]

A New Privacy Paradigm for the “Internet of Things”

Today the FTC is hosting a workshop on the Internet of Things, which will feature many great panelists including FPF’s Co-Chairman Christopher Wolf.  Chris and FPF Executive Director Jules Polonetsky have also released a whitepaper today arguing for a new privacy paradigm in this highly connected world.

The whitepaper argues that current implementations of Fair Information Practice Principles (FIPPs) are becoming outdated in the world of the Internet of Things, where nearly every device or appliance will be connected to the internet and collecting data about consumers.  Attempting to provide meaningful “notice” in a world of billions of connected devices is not feasible when many devices lack meaningful user interfaces or screens, and relying on consumers to read thousands of privacy policies will lead to many simply “giving up” on their privacy. Similarly, FIPPs’ strict usage limitations may thwart technological progress, because many socially valuable uses of data are not discovered until the data is already collected.  The challenge then is to allow practices that will support progress, while providing appropriate controls over those practices that should be forestalled or constrained by appropriate consent.

To that end, the paper proposes the following principles:

Use anonymized data when practical.  Anonymizing personal information decreases the risks that personally identifiable information will be used for unauthorized, malicious, or otherwise harmful purposes.  Although there is always some risk of re-identification, when data sets are anonymized and stored properly, re-identification is no easy task.

Respect the context in which personally identifiable information is collected.  Managing consumer expectations is a good first step; however, respect for context should not focus solely on what individuals “reasonably” expect.  There may be unexpected new uses that turn out to be valuable societal advances or important new ways to use a product or service.  Rigidly and narrowly specifying context could trap knowledge that is available and critical to progress. Finding a balance may require more sophisticated privacy impact assessments that can analyze the impact of risks or harms and assess the potential benefits for individuals and society.

Be transparent about data use.  Organizations making decisions that affect individuals should, whenever feasible, disclose the high-level criteria used when making those decisions.  This will help ensure that factors – such as a user’s ethnicity, sexual orientation, and political preferences – are not factored into a company’s determinations when they would be irrelevant or unduly discriminatory.

Automate accountability mechanisms.  Automated accountability mechanisms could monitor data usage and determine whether the uses comply with machine-readable policies.

Develop Codes of Conduct.  Self-regulatory codes of conduct will be the most effective means to honor these preferences and others in the rapidly evolving landscape of the Internet of Things.  Codes of conduct could establish frameworks that enable individuals to associate usage preferences with their connected devices.

Provide individuals with reasonable access to personally identifiable information.  This will likely enhance consumer engagement with and support of the Internet of Things.
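The “automate accountability” principle above can be illustrated with a minimal sketch: a machine-readable policy maps data categories to permitted purposes, and each logged use is audited against it. The policy schema, category names, and purposes here are illustrative assumptions, not part of any actual framework.

```python
# Hypothetical machine-readable policy: data category -> permitted purposes.
POLICY = {
    "location": {"navigation", "traffic_analysis"},
    "energy_usage": {"billing", "efficiency_tips"},
}

def audit(events):
    """Return the logged uses that the policy does not permit."""
    violations = []
    for event in events:
        allowed = POLICY.get(event["category"], set())
        if event["purpose"] not in allowed:
            violations.append(event)
    return violations

usage_log = [
    {"category": "location", "purpose": "navigation"},
    {"category": "location", "purpose": "ad_targeting"},
]
flagged = audit(usage_log)  # only the ad_targeting use is flagged
```

A real system would attach such policies to the data itself and run checks continuously, but even this simple shape shows how compliance can be verified by machines rather than manual review.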

Latanya Sweeney and Andrea Matwyshyn to Join FTC

Today, the FTC announced the appointment of Latanya Sweeney to serve as the agency’s Chief Technologist and Andrea Matwyshyn as a Senior Policy Advisor on privacy and data security issues.

Dr. Sweeney is well-known for her hardline criticism of “anonymity” in publicly released datasets. In 1997, Sweeney used seemingly anonymous medical data to demonstrate that she could identify sensitive health information about William Weld, then governor of Massachusetts.

Her studies have been widely celebrated in the privacy community, and her efforts have cast scrutiny on the state of de-identification best practices. As the founder and director of Harvard’s Data Privacy Lab, Dr. Sweeney continues to work to develop better, more complex algorithmic solutions to protect individual privacy. More recently, Sweeney has suggested that Google search results may demonstrate “racial bias in society.”  In that study, she found that searches of names typically associated with minorities are more likely to generate advertising related to criminal activity.
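The linkage technique behind Sweeney’s famous demonstration can be sketched simply: a dataset stripped of names can still be joined against a public roster (such as voter registration records) on shared quasi-identifiers like ZIP code, birth date, and sex. All records below are fabricated for illustration.

```python
# "Anonymized" released records: names removed, quasi-identifiers retained.
released = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "M", "diagnosis": "example"},
    {"zip": "02139", "dob": "1980-01-02", "sex": "F", "diagnosis": "example"},
]

# Public roster that still carries names alongside the same attributes.
voter_roll = [
    {"name": "J. Doe", "zip": "02138", "dob": "1945-07-31", "sex": "M"},
]

QUASI_IDENTIFIERS = ("zip", "dob", "sex")

def reidentify(released, voter_roll):
    """Join the two datasets on quasi-identifiers to recover identities."""
    matches = []
    for record in released:
        for voter in voter_roll:
            if all(record[k] == voter[k] for k in QUASI_IDENTIFIERS):
                matches.append((voter["name"], record["diagnosis"]))
    return matches
```

Because combinations of ZIP code, birth date, and sex are nearly unique for much of the population, a single match often suffices, which is why modern de-identification practice generalizes or suppresses quasi-identifiers rather than merely removing names.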

Dr. Matwyshyn is an assistant professor in Legal Studies and Business Ethics at the Wharton School, where she focuses on technology innovation and its legal implications for corporate information security and consumer privacy.  She’s written widely about “hackers” and their relationship to the law.  Many of her works also focus on machine-human convergence and what sorts of educational efforts and legal rules are needed to encourage technological entrepreneurship.  She will join the FTC’s Office of Policy Planning in December to advise on privacy and data security policy.

Reuters Talks Tracking in Brick-and-Mortar Retail

Reuters today published an article discussing the ways in which brick-and-mortar retailers are using increasingly sophisticated technology to “catch up” to online retail. Along with the benefits – more efficient stores, targeted discounts – the article raises privacy concerns about the tracking of customer behavior in the offline world. We think, when it comes to addressing these concerns, that our Mobile Location Analytics Code of Conduct is a step in the right direction. Notice, transparency, and customer consent have been, and will continue to be, important principles as we continue to collaborate with companies to ensure responsible data practices in the retail space.

Link: Big Retailer is watching you: stores seek to match online savvy (via Reuters)

Protecting Privacy and People Using Airbnb to Go on Vacation

Last month, Attorney General Schneiderman made waves when he subpoenaed data on 15,000 New York City-based users of Airbnb, the service best known for allowing people to rent out their spare bedrooms or their homes while on vacation. The Attorney General is seeking to identify local landlords that are using Airbnb’s service to regularly rent vacant apartments as illegal hotels without paying the appropriate taxes. Certainly, Mr. Schneiderman has an obligation to uphold the law, and he is within his rights to crack down on illegal hotels and tax evaders. He also has every right to require the assistance of Airbnb in that pursuit, but the Attorney General should be mindful of the potential harms his overbroad subpoena may inflict on both Internet commerce and individual liberty.

At the Future of Privacy Forum, we were surprised by the breadth of Mr. Schneiderman’s request for information. The demand isn’t just for the small number of users who might be abusing the system, but for information revealing the vacation habits of thousands of New Yorkers. The subpoena demands residents’ names and contact information, dates of guest stays, rates charged, and any communications between users and Airbnb about tax issues. That’s a lot of very personal information to be placing into government hands, particularly where, as here, there is no clear evidence of any user wrongdoing.

Airbnb has challenged the subpoena, arguing that it is overbroad and that the Attorney General is basically trying to begin an investigative “fishing expedition.” On Friday, the company received additional support from a number of technology organizations. The Internet Association filed an amicus brief in support of quashing the state’s subpoena. Its brief argues that innovation and technological disruption can place regulators on the defensive, collecting as much information as possible without coming to grips with the state of the law itself. CDT and EFF also filed a brief that highlights both the overreaching nature of the subpoena and the need for courts to carefully review government information requests about large numbers of Internet users.

We agree.  We don’t think that companies get a pass to avoid the law just because they do business via the internet. But regulators need to understand that investigations targeting companies that hold user data must be appropriately narrow and targeted.  Wide grabs of consumer data by well-meaning regulators can have a serious impact on consumer privacy. Internet companies, ranging from Airbnb to giants like Google, rely on user trust to deliver their services.  Every day, we hand over copious amounts of personal data in exchange for services because we trust companies to protect our information. Part of that unspoken agreement is that user information will be reasonably protected from unwarranted government intrusions. In the past, when the federal government subpoenaed Amazon.com for information on the reading habits of its customers, the judge criticized the request as “Orwellian.” He wrote, “If word were to spread over the Net – and it would –  that the FBI and the IRS had demanded and received Amazon’s list of customers and their personal purchases, the chilling effect on expressive e-commerce would frost keyboards across America.”

In addition to undermining internet commerce, these types of data requests can also have a chilling effect on individual behavior. Already, people have removed their apartments from Airbnb’s listings for fear of having the Attorney General knock on their door. This hurts not only Airbnb, but also individuals looking for a temporary bed in New York City, a notoriously expensive destination. Of course, protective orders can be put in place to limit Mr. Schneiderman’s use of Airbnb’s data, but breaches and leaks happen and privacy risks remain.

If the Attorney General’s Office truly wants to capture the likely abusers of Airbnb’s services, there are any number of categories of information it could seek from Airbnb that would more narrowly target New Yorkers running what are de facto illegal hotels. For example, individuals that use Airbnb as a part of their rental business can make tens of thousands of dollars per year. Mr. Schneiderman could seek information about users who have made more than a certain amount a year from renting their apartments. Or he could look at high-frequency users, subpoenaing information about users who have rented out their apartments an unusually large number of times in the last year. These categories of information would help the Attorney General zero in on problematic users without invading the privacy of thousands of New Yorkers and chilling an innovative business model that benefits consumers.

The FTC’s Upcoming “Internet Of Things” Workshop: FPF Projects And Resources

Next Tuesday, the Federal Trade Commission will host a workshop on the “Internet of Things” (IoT), the name commonly used to describe the next generation of connected (or “smart”) devices.  As we enter the age of the Internet of Things, soon our homes will know about our energy consumption habits, our cars will know how we drive, and even our personal fitness will be linked to connected devices.  In this new age, privacy and security will remain a primary concern.  FPF’s Christopher Wolf has written a blog for the IAPP about the event, and will also be participating in one of the day’s panels.  Chris opens with two well-known quotes by Lawrence Lessig and Judge Easterbrook on “The Law of the Horse” to illustrate his point that “with the emergence of new technologies comes the perennial contest over whether the current legal framework provides sufficient protections.”

The FTC’s workshop agenda is now available and includes many great events addressing issues such as those listed above.  Since FPF has worked on a number of projects related to the Internet of Things, we’d like to compile them here in anticipation of the event.

Smart Grid Privacy Seal Program: FPF developed a first-of-its-kind privacy seal program for companies providing services to consumers that rely on energy data.  The program provides flexible rules for homes that access energy use data.  Read more about the project here.

Connected cars: FPF’s Connected Cars Project has convened leading auto manufacturers and telematics companies to discuss best practices in privacy and data security while recognizing the benefits of new connected car technologies.  The FTC IoT workshop will feature a panel with FPF’s Founder and Co-Chair Christopher Wolf speaking on the privacy and security issues raised by connected cars.

Mobile Location Analytics (“MLA”) in Retail Stores: FPF worked with a group of leading Mobile Location Analytics companies and Senator Chuck Schumer to create a Mobile Location Analytics Code of Conduct to provide an enforceable, self-regulatory framework for retail tracking. Learn more about the Smart Store Privacy project here. 

In general, FPF has argued that the key privacy issues presented by the Internet of Things can be adequately addressed by traditional Fair Information Practice Principles.  However, some of the key elements of FIPPs must be tweaked in new ways to accommodate “smart” technologies.  For instance, it may be more difficult to provide adequate notice to users when certain smart devices lack screens or consumer-facing interfaces.  In addition to the projects listed above, FPF filed public comments to the FTC in preparation for the IoT workshop.  Our comments emphasize the usefulness of flexible self-regulatory codes of conduct, seals, and other public-facing and enforceable commitments as tools to safeguard user privacy and security.  We look forward to the FTC’s workshop and hope to see many other privacy-conscious people there.

FPFcast: A Frank Discussion with Frankly CEO Steve Chung

[audio player]

Should a simple text last forever?  Former FPF Legal & Policy Fellow Heather Federman asked as much in a post in August, where she discussed how the new text-messaging app, Frankly, works to create impermanent text messages in a world where data is forever.  Frankly has since become a new FPF member company, and we reached out to Steve Chung, Frankly’s CEO, to discuss his app and the future of ephemeral communication.

In this podcast, FPF’s Joseph Jerome talks to Steve Chung about Frankly, the next-gen texting application that promotes safe, anonymous, and private chat rooms.

Click on the media player above to listen, or download the complete podcast here.

Joe Newman

Joe Newman is a legal and policy fellow at the Future of Privacy Forum. He works on a variety of issues involving Do Not Track standards, the US-EU Safe Harbor Agreement and the relationship between privacy and intellectual property. Prior to joining FPF, Joe worked at Public Knowledge, the Library of Congress, The Electronic Frontier Foundation and the Court of Appeals for the Federal Circuit. In addition, Joe currently also works as an Associate for TeachPrivacy, an online training and compliance course run by Professor Daniel J. Solove. He received his J.D. with Honors from the George Washington University Law School, where he received the 2012 Finnegan writing prize, the 2013 Rosenberg Award for intellectual property studies, and the 2013 AIPLEF Jan Jancin award for excellence in intellectual property. He has a B.A. in English and Music from Wesleyan University.