Device Encryption: Too Much Privacy for Consumers?


On December 3rd, FPF and IAPP will host a conversation on device encryption in the wake of Apple and Google’s recent application of “whole device encryption” to their newest devices. What does this mean for consumers? What new protections are added? What impact does this have on hackers or others who may seek to access the data on a cell phone? What does it mean for law enforcement?

After an introduction by Jules Polonetsky, Christopher Wolf will moderate a discussion with:

(Additional panelists to be announced.)

Issues to be discussed include: Whose role is it to decide whether and how to protect consumer devices, and conversely, how much access to provide law enforcement through device manufacturers and service providers? What privacy risks exist with unencrypted devices? If law enforcement can demand passwords or device access from owners consistent with the Fourth and Fifth Amendments, is there a real handicap in investigatory powers?

This event is free and open to the public.  Cocktails and food will be provided.  More information and RSVP are available here.

The Connected Car and Privacy: Navigating New Data Issues

The Connected Car and Privacy: Navigating New Data Issues is available to read here.

* * * * * *

Each model year brings cars that are getting smarter and more connected, offering new safety features and consumer conveniences. By the end of the decade, one in five vehicles on the road will be connected to the Internet. But for consumers to welcome these advances, they need to be sure their personal data will be handled in a trustworthy manner, as early research shows that considerable numbers of new car buyers are concerned about data privacy when it comes to car connectivity. To address those concerns, the Alliance of Automobile Manufacturers and the Association of Global Automakers have come together to put forward a set of privacy principles for vehicle technologies and services. These privacy principles set a responsible course for new uses of connected car data and should help avoid any privacy bumps in the road.

The principles cover a wide variety of vehicular data, and they directly address some of the chief privacy concerns raised by new in-car technologies. For example, they cover location information, driver biometrics, and other driver behavioral data, such as seatbelt use or frequency of hard braking, that can be gathered by a vehicle, and require opt-in consent by consumers before any of this sensitive information can be used for marketing purposes or otherwise shared with independent third parties. The principles also include a warrant requirement for geolocation information to be shared with law enforcement, absent exigent circumstances or certain statutory authorities. These are important protections, and essential to ensure consumer data is being handled in a trustworthy manner inside the connected car.

The Future of Privacy Forum’s new paper, The Connected Car and Privacy: Navigating New Data Issues, seeks to provide an overview of the various technologies currently available in cars and identifies the types of data collected and the purposes for which they are collected. While connectivity is the buzzword of the day, many of the recent privacy-related headlines about in-car technologies are, in fact, about data collection that is not novel. On-board diagnostic data have been generated by cars for decades, and recording accident-related information on Event Data Recorders (EDRs) has been going on for years.

Yet connectivity does promise new types of in-car data collection. New sensors and technologies do increase the ability of vehicles to harness location information and in the future, will allow vehicles to collect more information about the car’s immediate surroundings and its driver’s behavior. Today, connected cars frequently provide consumers with more opportunities to take advantage of location-based services in their cars and real-time traffic-based navigation. Similarly, onboard sensors can already be used by vehicles to detect lane markings and immediate obstacles.

In the future, in-car technologies will increasingly gather information about driver behavior and biometric data. For example, vehicles will be able to quickly identify their drivers, changing car settings to accommodate the driving profile of a teenage or elderly driver. Sensors in the steering wheel or driver’s seat will monitor stress levels and health conditions. Much of this information is used to drive vehicle safety improvements. Attention-assist features evaluate a driver’s steering corrections, along with other factors like crosswinds or road surface quality, to predict driver fatigue. As they are developed, vehicle-to-vehicle and vehicle-to-infrastructure communications will also augment these features and will depend on responsible privacy standards.

We hope The Connected Car and Privacy provides an introduction to the key technologies used in connected cars and sets out a useful overview of the relevant data flows. We look forward to working with the Alliance of Automobile Manufacturers and the Association of Global Automakers, as well as other stakeholders who deal with these issues, to continue this important conversation.

Amend the U.S. Privacy Act to Provide Further Privacy Protections to European and Other Non-US Persons

I had the pleasure of participating recently at a Georgetown Law Center conference called “Privacy Act @40.”  My panel was on “Looking Ahead,” and my comments focused on new ways that the United States is (and can) extend appropriate privacy rights to citizens of other countries.

Today, just a couple of weeks later, Google has announced that it now favors new legislation in the U.S. to extend the Privacy Act to non-U.S. persons.  This announcement is a valuable step.  Amending the Privacy Act in this way is the right thing to do and would address a longstanding bone of contention in U.S.-E.U. privacy discussions.

The Privacy Act of 1974

As background, the Privacy Act of 1974 is the principal statute that governs how federal agencies handle personal information.  It applies its protections to “U.S. persons” – U.S. citizens and permanent residents in the U.S.  The Privacy Act provides a range of rights to individuals, such as to access and amend their records, along with a range of other “fair information practices,” such as requirements for transparency, data minimization, accuracy, and data security.

Under the E.U. Data Protection Directive, privacy rights apply in the same way to each individual, regardless of nationality.  I have long worked with European privacy officials, such as in researching a 1998 book on U.S.-E.U. privacy issues and in helping negotiate the Safe Harbor that now governs data exports from the E.U. to our country.  Over and over again, Europeans have said something along these lines: “We provide full privacy rights to U.S. citizens, whenever your data is collected or processed in Europe.  Why won’t the U.S. government treat our citizens the same as yours?”

This European concern about lack of protection for their citizens has become more acute after the Snowden leaks and in the course of serious consideration in the E.U. for updating their comprehensive privacy laws.  Senior E.U. officials this fall have discussed suspending the Safe Harbor agreement, which could cause major interruptions in cross-Atlantic data flows.

The importance of amending the Privacy Act came up during my work on President Obama’s Review Group on Intelligence and Communications Technology.  Our recommendation 14 said: “We recommend that, in the absence of a specific and compelling showing, the US Government should follow the model of the Department of Homeland Security (DHS), and apply the Privacy Act of 1974 in the same way to both US persons and non-US persons.”  The recommendation mentioned how DHS in 2009 issued a Privacy Policy Guidance Memorandum that applies to “mixed systems” of records – systems that collect or use information in an identifiable form and that contain information about both US and non-US persons.  It states: “As a matter of DHS policy, any personally identifiable information (PII) that is collected, used, maintained, and/or disseminated in connection with a mixed system by DHS shall be treated as a System of Records subject to the Privacy Act regardless of whether the information pertains to a US citizen, legal permanent resident, visitor, or alien.”

The Obama administration has not made any official announcement about Recommendation 14, although I believe it remains under consideration.  That Recommendation shows tangible steps that federal agencies can take under current law, following the practice at DHS.  Notably, however, agencies do not have the power to create a private right of action under the Privacy Act.  For that, Congress would need to amend the statute.  Attorney General Holder spoke in favor of such an amendment earlier this year, and Google has now supported that as well.  Based on my experience on this issue with European privacy leaders, including such a private right of action would be important to putting this issue to rest.

Presidential Policy Directive 28

Meanwhile, President Obama in January announced what is quite possibly the largest extension in history of privacy protections to non-US persons.  It is worth considering Presidential Policy Directive 28 in some detail, because of the precedent it sets for treating US and non-US persons similarly.

This Directive to federal agencies states: “Privacy and civil liberties shall be integral considerations in the planning of U.S. signals intelligence activities. The United States shall not collect signals intelligence for the purpose of suppressing or burdening criticism or dissent, or for disadvantaging persons based on their ethnicity, race, gender, sexual orientation, or religion. Signals intelligence shall be collected exclusively where there is a foreign intelligence or counterintelligence purpose to support national and departmental missions and not for any other purposes.”

Notably for equal treatment of non-US persons, PPD-28 states: “Departments and agencies shall apply the term ‘personal information’ in a manner that is consistent for U.S. persons and non-U.S. persons.” PPD-28 goes on to provide that dissemination, retention, and minimization rules should be consistent for US and non-US persons.  There is the possibility of exceptions for national security purposes, but a fair reading of PPD-28 is that it creates a major change in signals intelligence practices.  The rigor of its requirements was reinforced in the Interim Progress Report on Implementing PPD-28, released in October 2014.  Implementation is due quickly, under a January 2015 deadline.

In conclusion, PPD-28 and the DHS Privacy Policy Guidance Memorandum show important progress toward addressing concerns that the United States does not apply privacy protections to non-US persons.  The Privacy Act has its weaknesses, as Bob Gellman has recently explained in detail.  But that is no reason to exclude Europeans and other non-US persons from the protections that it does supply.

 

Peter Swire is the Huang Professor of Law and Ethics at the Georgia Tech Scheller College of Business. While at the U.S. Office of Management and Budget in 1999-2001, Swire was responsible for coordinating agency compliance with the Privacy Act of 1974.

Public Perceptions on Privacy

Today’s new report by the Pew Research Center gives the lie to the notion that privacy is unimportant to the average American. Instead, the big takeaway is that individuals feel they lack any control over their personal information. These feelings are directed at the public and private sector alike, and suggest a profound trust gap is emerging in the age of big data.

While Pew has framed its report as a survey of Americans’ attitudes post-Snowden, the report presents a number of alarming statistics of which businesses ought to take note. Advertisers take the brunt of criticism, and the entire report broadly suggests that public concerns about data brokers and the opacity of data collection are only growing. Seventy-one percent of respondents say that advertisers can be trusted only some of the time, and 16% say they never can. These numbers track every demographic group, and indeed, get worse among lower income households. Eighty percent of social network users are concerned about their information being shared with unknown third parties. Even as Americans are concerned about government access to personal information, they increasingly support more regulation of advertisers. This support is strong across an array of demographic groups.

Further, even as consumers remain willing to trade personal information in return for access to free Internet services, two-thirds of consumers disapprove of the suggestion that online services work due to increased access to personal information. More problematic, however, is that 91% of Americans now believe that “consumers have lost control over how personal information is collected and used by companies.” Though this Pew study does not show that privacy values are trumping digital services — and every indication suggests that they are not — it is a likely topic for Pew to return to in the future. It would be interesting to see whether this anxiety translates into action.

However, in the meantime, anxiety about privacy suggests an opportunity for companies to win with consumers simply by providing them with more control. Fully 61% “would like to do more” to protect their online privacy. We have repeatedly called for efforts to “featurize” data and have supported efforts to help consumers engage with their personal information. Many companies already provide meaningful controls on the collection and use of personal information, but the challenge is both making consumers aware of these options and ensuring that taking advantage of these dashboards and toggles is as fun as using a simple app.

So we need more tools to make privacy fun. And industry may also need to do a better job of staying attuned to consumer preferences. Pew reiterates how context-dependent privacy is, and that the value of privacy and consumer interest in protecting their privacy can vary widely from person to person, in different contexts and transactions, and perhaps most pointedly, in response to current events. “[U]sers bounce back and forth between different levels of disclosure depending on the context,” the report argues.

The challenge is ensuring that context is understood similarly by all parties. Part of this is understanding where and when personal information is sensitive. This is a debate that was highlighted at the FTC’s recent big data workshop, and is a theme that increasingly arises in conversations about big data and civil rights. Aside from Social Security numbers, which 95% of respondents considered to be sensitive information, data ranging from health information and phone and email message content to location information and birth date could be viewed as sensitive depending upon the context.

Depending upon context, everything is sensitive or nothing is. Obviously, this can be a tricky balancing act for consumers to manage. Information management requires users to juggle different online personas, platforms, and audiences. Thus, the door is open for companies to either take certain information off the table or make a better case for why some sensitive information is invaluable for certain services.

While Pew has not shown whether these privacy anxieties trump other pressing economic or social concerns, the report also suggests that Americans’ perceptions of privacy are heavily intertwined with their understanding of security. Privacy may be amorphous, but security is less so, and being proactive on the one can often be a boon to the other. Positive and proactive public actions on privacy are essential if we are to reverse Americans’ doubts that they can trust sharing their personal information.

-Joseph Jerome, Policy Counsel

Debating the FBI on Phone Encryption

FBI Director James Comey has heated up the encryption debate with his recent appearances on 60 Minutes and at the Brookings Institution.  Comey has sharply criticized Apple and Google for the companies’ announcements that they would enable strong encryption on their phones.  In contrast to prior practice, the companies would no longer keep a key to gain access to the encrypted content.  I applaud the companies’ announcements, which among other virtues will strengthen cybersecurity.

I have been invited to debate this issue at the New America Foundation on November 17th, from 4:00 to 5:30 p.m., with a webcast planned.  Nancy Libin, formerly of both the Center for Democracy and Technology and the U.S. Department of Justice, will be the moderator.  The opposing perspective will be offered by Andrew Weissmann, who until 2013 was General Counsel at the FBI.  I believe this will be the highest-profile live debate on the issue since Comey began his statements.

To remind us of the issues at stake, this post highlights four items I have previously worked on about encryption and global communications policy, the first three of which were supported by the Future of Privacy Forum’s project on government access to data in 2011-2013.

First, and perhaps most readably, is “Going Dark vs. the Golden Age of Surveillance.” (2011). This piece challenges the FBI’s claims that it is “going dark” due to encryption and other changes in communications technology.  Instead, Kenesa Ahmad and I argue that a better image would be that we are in “a golden age of surveillance.”  Compared with earlier periods, surveillance capabilities have greatly expanded.  Government agencies have unprecedented access to location information now that we all carry cellphones.  Information about contacts, confederates, and conspirators has massively expanded, as all of our texts, emails, and social network postings are saved by communications carriers.  In addition, there are myriad new databases that create digital dossiers about our lives.  In short, if government agencies were offered the choice of current capabilities or pre-Internet capabilities, they would overwhelmingly prefer their surveillance abilities today.  This piece was written before the Snowden leaks, so the idea of law enforcement and intelligence agencies “going dark” is even less plausible today.

Second, this going dark discussion was part of a larger research project on “Encryption and Globalization” (2012).  This lengthy article provides background on a variety of encryption issues, including developments in India and China.  One claim of the article is that strong encryption is even more vital in today’s globalized world than during the crypto wars of the 1990s.  A second claim concerns what we call “the least trusted country problem.”  If there are backdoors or limits on effective encryption, then the security of global communications is only as strong as the security in the least trusted country.  Other countries will demand the same backdoors available to the U.S. government.  When the FBI or other agencies argue for weak security, we should consider the effects of surveillance by these other countries, many of which lack the legal safeguards of the United States.

Third is my 2012 article “From Real-Time Intercepts to Stored Records: Why Encryption Drives the Government to Seek Access to the Cloud.”  This paper, as the title suggests, explains how changing technology is pushing government agencies to go to cloud providers for law enforcement and intelligence purposes.  Relevant to the Comey debate, and as explained in detail by Chris Soghoian, cloud providers hold an enormous wealth of potential evidence.  Even if police have difficulty getting into a smartphone, the relevant evidence very often is available from the cloud provider.

Fourth is the discussion of encryption in the report of President Obama’s Review Group on Intelligence and Communications Technology, for which I was one of five members.  The report in general, and our Recommendation 29 in particular, emphasizes why U.S. government policy should strongly encourage the use of effective encryption.

Andrew Weissmann, in addition to his role at the FBI, is an experienced litigator and former chief of the Enron Task Force.  He is a Senior Fellow at the NYU School of Law and its Center for Law and Security.  I look forward to a vigorous debate.

Peter Swire is Senior Fellow at the Future of Privacy Forum and the Huang Professor of Law and Ethics at Georgia Tech Scheller College of Business.

Student Privacy Pledge – Learn More!

SIIA/FPF Webinar: Responding to Student Privacy Concerns


Does your company collect or maintain personal student data?  Are you looking for a set of guidelines you can adopt to demonstrate your adherence to data privacy best practices?

On October 7, more than a dozen leading K-12 school service providers announced their signing of a pledge to advance student data privacy protection. Since then, two dozen additional organizations have signed the pledge introduced by the Software & Information Industry Association (SIIA) and the Future of Privacy Forum (FPF).

Why is it essential for your organization to also take the pledge?  Register today for SIIA’s November 17 webinar, directed to school service providers, to learn background and contextual information, understand specific pledge commitments, garner perspectives of signatory companies and education leaders, and get your questions answered about the pledge and the signature process.

Presenters include:

  • Inna Barmash, Associate General Counsel, Amplify
  • Jules Polonetsky, Executive Director and Co-chair, Future of Privacy Forum
  • Mark Schneiderman, Senior Director of Education Policy, Software & Information Industry Association (SIIA)
  • Donna Williamson, CTO, Mountainbrook Schools, AL

 

Date: November 17, 2014

Time: 2pm – 3pm EST


Event Price: Free

Register at the link below:

http://siia.net/index.php?option=com_content&view=article&id=1885:responding-to-privacy-concerns&catid=27:education-overview&Itemid=1930

Cameron Kerry Queries Whether Law Enforcement Is Really "Going Dark"

Writing in Forbes today, Cam Kerry, formerly of the Department of Commerce and a member of the FPF Advisory Board, discusses some of the challenges facing law enforcement as technology continues to race past the law. In recent weeks, FBI Director James Comey has criticized tech companies like Apple and Google for embracing stronger levels of encryption, encryption that increasingly hampers the ability of law enforcement to get access to information.

Kerry notes that not only did last year’s Snowden revelations make “it hard to argue the U.S. government lacks visibility into communications,” but he also recognizes a fundamental tension between the needs of law enforcement and technological innovation. “[Law enforcement’s] main mission to catch the bad guys constrains the airing of civil liberties and privacy issues that matter to Internet users, providers and others,” he writes. “[F]rom a technical standpoint, the FBI’s front door is a hacker’s or spy’s back door.”

There’s the rub: any lawful intercept solution will be exploited by third parties. Without a basic recognition of this fact, the honest debate the FBI Director is seeking may be difficult to properly frame.

Jules Polonetsky Statement Following Home Depot Announcement

Today, The Home Depot released new findings from its investigation of the company’s recent payment data breach. Jules Polonetsky, Executive Director of the Future of Privacy Forum, had the following statement:

More important than legal compliance after a breach is a company’s efforts to make sure that consumer concerns are addressed. It’s great to see The Home Depot take this extra step of notifying individuals whose email addresses were located in files apparently taken during a previously-reported payment breach. Since passwords or other protected account information wasn’t affected, there is no legal obligation for the company to disclose that email addresses have been taken, but clearly consumers affected will benefit from The Home Depot’s consumer outreach and can be on guard against suspicious emails.

"Big Data: Putting Heat on the Hate" by Chris Wolf and Jules Polonetsky



Today, Re/code ran an essay by Chris Wolf and Jules Polonetsky, marking the fifth anniversary of the signing of the Hate Crimes Prevention Act. The two discussed big data’s ability to put the “heat on hate,” concluding that while “[t]hese are still the early days of development for big data and civil rights . . . it is becoming ever more clear that big data’s value for empowering groups and fighting discrimination is incumbent upon its continued application to novel endeavors.”



Android 5.0, Lollipop: Major New Privacy Features

 

Earlier this month, Google announced the final release of Android 5.0 Lollipop, also known as Android L. Lollipop includes a number of valuable new privacy features worth special applause.

Default Encryption

New phones and tablets with Lollipop come with encryption automatically turned on, helping protect the data on lost or stolen devices from anyone who lacks password access.  Android devices running the Jelly Bean and KitKat operating systems have had the capability to encrypt data, but that feature had to be expressly activated by users.  Now, all users will automatically benefit from the added protection.
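Conceptually, this protection rests on deriving the encryption key from the user’s passcode, so the key never has to be stored anywhere in recoverable form. The following is a simplified, hypothetical Python sketch of that idea using PBKDF2 key stretching; it is illustrative only, not Android’s actual implementation (which uses AES disk encryption and hardware-backed key storage):

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a short passcode into a 256-bit key with PBKDF2-HMAC-SHA256."""
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, iterations)

salt = os.urandom(16)            # stored in the clear on the device; not secret
key = derive_key("1234", salt)   # the key that would actually encrypt storage

# The same passcode always reproduces the key; any other passcode does not,
# so data encrypted under `key` is unreadable without the user's passcode.
assert derive_key("1234", salt) == key
assert derive_key("1235", salt) != key
```

Because the derivation is one-way and deliberately slow, someone who copies the storage off a lost device cannot reconstruct the key without the passcode.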

Guest Mode

Lollipop allows others to use your device in a restricted guest mode where they don’t have access to your personal data. In the guest user mode, first time guest users will see a newly-installed system with only the stock apps. Guest users can then sync their contacts, email, and photos from their Google account or install apps from the Play Store.

Screen Pinning

Have you ever shown someone a photo, and then watched in a panic as they kept swiping to see other photos on your phone? Screen pinning comes in handy when you have a friend or family member who you want to show something on your phone without letting them see other sensitive information. Screen pinning locks the screen to the content you wish to display – a photo or a video or a specific text screen – until you put in your password.

Android Smart Lock

If you’re tired of entering your PIN or swipe pattern every time you want to use your phone, Smart Lock may be a feature you want to use. You can designate any Bluetooth device as a trusted device. When your phone is within range of a trusted device, it will unlock with a simple swipe. If the phone moves out of range of the trusted device, it locks again, and you will need to enter your PIN or swipe pattern. The cool thing is that trusted devices do not need to be stationary; users can pair their phones with a wearable or even a car.  For example, users can have their phone stay unlocked when near their Fitbit or laptop and can be sure that if they walk away from it, the phone will automatically lock.
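The underlying logic is simple enough to sketch: the phone stays swipe-unlockable only while a paired trusted device is in Bluetooth range. A minimal, hypothetical Python sketch follows; the names and structure are illustrative, not Android’s API:

```python
TRUSTED_DEVICES = {"AA:BB:CC:DD:EE:FF"}   # e.g. a paired wearable's address

def lock_state(devices_in_range):
    """Return the lock behavior given the Bluetooth devices currently in range."""
    if TRUSTED_DEVICES & devices_in_range:
        return "swipe-to-unlock"   # a trusted device is nearby: no PIN needed
    return "locked"                # trusted device out of range: require PIN

print(lock_state({"AA:BB:CC:DD:EE:FF", "11:22:33:44:55:66"}))  # swipe-to-unlock
print(lock_state({"11:22:33:44:55:66"}))                       # locked
```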

New Notification Settings

With Lollipop, you will now be able to control the visibility of notifications when the device is locked. You can set notifications from particular friends as ‘sensitive’ or ‘blocked’ to control their visibility. You can also turn on a Priority mode, which acts like a Do Not Disturb sign and allows the phone to only show the most important notifications.

BLE MAC Address Rotation

Bluetooth MAC addresses broadcast by mobile devices are one of the identifiers often tracked by retailers and other venues in order to create location analytics reports.  See FPF’s Smart-Places privacy code for how this works and for opt-out options.  In Lollipop, Bluetooth MAC addresses will now rotate when the device is scanning (central mode) and advertising (peripheral mode), making it harder for your phone to be tracked via its Bluetooth MAC address.
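A rotated address is just a fresh random 48-bit MAC with the “locally administered” bit set and the multicast bit clear. A hypothetical Python sketch of what per-scan rotation could look like (illustrative only, not Android’s implementation):

```python
import os

def random_ble_mac() -> str:
    """Generate a random 48-bit MAC address: locally administered, unicast."""
    octets = bytearray(os.urandom(6))
    octets[0] = (octets[0] | 0x02) & 0xFE   # set "local" bit, clear multicast bit
    return ":".join(f"{b:02X}" for b in octets)

# A tracker keying on MAC addresses sees a different identity on each scan.
print(random_ble_mac())
print(random_ble_mac())
```

Because each scan presents a new, unlinkable address, retail analytics systems can no longer use the Bluetooth MAC as a stable identifier for one device.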

WiFi SSID Suppression

Many phones broadcast the recent SSIDs of routers to which they have connected in order to make reconnecting easier.  If the last router your phone connected to was your home router, perhaps named “Smith Family home”, this information can be detected by the next WiFi network you join. The new WiFi SSID suppression in Lollipop eliminates the broadcasting of recently connected routers (except for hidden SSIDs, which need to be broadcast for automatic connection).
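In effect, the phone switches from directed probe requests, which name each remembered network, to a wildcard (broadcast) probe, naming only hidden networks that cannot answer a wildcard. A hypothetical Python sketch of the two behaviors (the function and data are illustrative, not any real API):

```python
def probe_requests(saved_networks, suppress_ssids=True):
    """Return the SSIDs a phone names in probe requests while scanning.

    Each saved network is an (ssid, is_hidden) pair. With suppression on,
    only hidden networks are probed by name; "" stands for the wildcard
    (broadcast) probe that visible networks answer anyway.
    """
    if not suppress_ssids:                       # pre-Lollipop behavior
        return [ssid for ssid, _ in saved_networks]
    hidden = [ssid for ssid, is_hidden in saved_networks if is_hidden]
    return hidden + [""]                         # wildcard probe

saved = [("Smith Family home", False), ("CoffeeShop", False), ("HiddenNet", True)]
print(probe_requests(saved, suppress_ssids=False))  # leaks every saved SSID
print(probe_requests(saved))                        # only HiddenNet + wildcard
```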

Encryption by Default is a Very Big Deal

Turning on encryption by default is a major step forward for users and will do a great deal to ensure that the sensitive information on users’ phones can’t be accessed without their permission.  Many of the other features add user-friendly options that should make day-to-day use of Android devices more convenient, more private, and more secure.