Travis LeBlanc on the FCC's New Privacy Role

At today’s FCBA brown bag lunch, FCC Enforcement Bureau Chief Travis LeBlanc discussed the Commission’s recent entrance into privacy enforcement and fielded questions as to what companies might do to avoid running afoul of the Enforcement Bureau. LeBlanc emphasized that innovation continues to outpace regulators, noting that much of the Commission’s investigative and enforcement work is a five-to-seven-year process. “We’re at the point where we’d be having the Supreme Court judge [problems] with first-generation smartphones,” he mused. He highlighted the Commission’s recent decision to join the Global Privacy Enforcement Network as an effort to help keep pace with changes in technology.

Kelley Drye’s John Heitmann pressed LeBlanc on the FCC’s notices of apparent liability (NALs) against TerraCom and YourTel, which he suggested interpreted Sections 222(a) and 201(b) of the Communications Act in novel ways to protect consumer privacy. Section 222(a) states that “[e]very telecommunications carrier has a duty to protect the confidentiality of proprietary information of, and relating to …customers.” This has long been the basis for the FCC’s security rules around CPNI, but LeBlanc argued that Section 222 does not limit carriers’ duty to protecting only CPNI. He admitted that for “folks in the industry, in the media, and in the privacy community, there was an ‘uh huh, interesting’ moment” regarding the Commission’s interpretation, but he suggested this interpretation has been used to support other privacy work within the FCC, “if not squarely in the enforcement context.” He argued that Section 222’s protection of proprietary information was designed “to encompass the protection of information customers intended to keep private, which includes PII” and extends beyond CPNI as defined by the FCC. “Going forward, fair to say, that’s the concept we’ll be using in our work,” LeBlanc stated.

LeBlanc also explained that Section 201(b), which prohibits carriers from engaging in unjust or unreasonable business practices, must be viewed as co-extensive with Section 5 of the FTC Act. “It’s a basic consumer protection tool that we use to ensure carriers can’t engage in unjust practices,” he said, citing a recent settlement with AT&T for “cramming” extra charges onto consumer bills as an example of how to apply Section 201. He explained that applying this interpretation to the policing of privacy practices is “an iteration of that view and not a transformation.” Echoing the FTC’s actions on privacy policies, LeBlanc emphasized that the FCC hoped “to marry [companies’] language with their practices.” He added that the cramming settlement shows the FCC is focused on conduct that directly harms consumers. The Enforcement Bureau, he suggested, was not interested in technical rules violations where no one was harmed or impacted. He also suggested it was important to differentiate between breaches of personal information, such as credit card numbers, that can be remedied and those, such as Social Security number breaches, that cannot. “In that circumstance, [a person’s] identity may be stolen or it may not, but no one’s going to re-issue you a Social Security number.”

LeBlanc spoke at length about the differences between the FCC and the Federal Trade Commission, the nation’s primary privacy cop. “We’re a regulatory agency with rule-making authority, in contrast to the FTC, which is primarily a policing agency,” he explained. “The benefit of having a law enforcement unit in the same agency as the one making the rules [is that] we can go talk to them before we do an enforcement action. If we’re going to do anything, we need to pick up the phone first. . . . It is impossible for anyone writing laws or rules to anticipate every circumstance out there you intend to bar, so you leave some part of it ambiguous. That’s an advantage over doing enforcement independently. There are risks that an enforcer could exploit a small error in the language of a statute.” He suggested housing both rule-making and enforcement in one entity improves effectiveness and efficiency.

The ramifications of the Commission’s recent $7.4 million settlement with Verizon over its past failure to notify consumers of their opportunity to opt out of marketing uses of CPNI were also a key topic of discussion. LeBlanc suggested the more interesting parts of the settlement were its non-financial terms. He applauded Verizon’s decision to include a notice of consumer opt-out rights in every monthly bill going forward, suggesting that notices like this give consumers the ability to evaluate (and rethink) their decisions to share information. He also suggested that the CPNI rules should move away from unclear “reasonable” standards and place stronger protections on customers’ proprietary information.

LeBlanc also reiterated his desire to see companies admit to wrongdoing in settlement actions. He suggested that negotiations with Verizon were already ongoing at the time the Enforcement Bureau announced a practice of seeking admissions of liability or facts in settlements. Explaining that FCC settlements were designed to provide guidance to others engaging in similar conduct, he said, “the only way to effectively do that is to provide some detail into what a company did that was wrong.” He was also dismissive of notions that admissions of wrongdoing would impede the ability of companies either to retain business or gain government contracts. “I don’t think that’s true,” he said, suggesting settlements could be worded narrowly enough to protect companies from that sort of sanction.

Turning to emerging privacy issues, LeBlanc emphasized that he hoped to prevent industry mistakes rather than to respond after the fact. “Where I can provide guidance to the industry to operate in compliance with the law, I’d like to do that,” he said. His chief recommendation was for companies to do better with their privacy policies. He admitted that the lack of baseline federal privacy law forced him, as well as other agencies, to “work on the representations industry makes,” pointing to existing FTC practice. He suggested that the SEC will be interested in this moving forward, as well.

“We understand that sometimes companies are victims,” he said. “They are targets — no pun intended.”  He pointed to some of the “mitigating practices” companies could pursue in the event of breaches, including (1) notifications when information was compromised, (2) credit monitoring services, and (3) providing hotlines or websites to consumers. He also highlighted the importance of chief privacy officers, training, and the adoption of industry best practices and security audits. That said, he also appeared skeptical of some common “excuses” for breaches such as (1) errant employees, (2) technological glitches, and (3) contractor practices. “The company that collects personal information from the consumer, that has that relationship with the consumer, is responsible for protecting it [downstream],” he said. “That duty cannot be out-sourced.”

Finally, Heitmann could not avoid asking LeBlanc whether all of his comments might apply to broadband services in the event the FCC reclassifies broadband under Title II. “Wouldn’t you like to know?” LeBlanc laughed. “I cannot speculate on what the Commission is going to do in this context . . . We will stand ready and prepared to meet the Commission’s goals.”

Understanding Beacons: A Guide to Bluetooth Technologies

The Local Search Association and Future of Privacy Forum release a simple and concise primer that explains how Bluetooth beacon devices work and how privacy-friendly controls put users in charge.

As competition for fickle and frugal holiday shoppers kicks into high gear, traditional retailers are seeking new ways to bring consumers into stores and provide them with improved shopping experiences. Leveraging near ubiquitous smartphone adoption, Bluetooth beacons have emerged as one of the more popular tools in this quest.

While beacons have many non-commercial uses, the US retail industry is where much of the early beacon adoption has occurred. And though they’re just one of several indoor location technologies, beacons have emerged as the leader because of their low cost and relatively simple deployment.

“Indoor location and beacons have a very broad array of potential applications,” said Greg Sterling, VP of Strategy and Insights for the Local Search Association (LSA). “Through mobile apps, they can help deliver content, promotions or enhanced information in real-world contexts such as stores, airports and hotels.”

The novelty and excitement surrounding beacon technology has generated considerable media attention. Yet beacons are generally not well understood. The LSA and Future of Privacy Forum (FPF) created “Understanding Beacons: A Guide to Bluetooth Technologies” to address some of this confusion and the many misperceptions about how beacons operate.

“Beacons are a privacy-friendly technology because apps that interact with beacons are controlled by users,” explained FPF Executive Director Jules Polonetsky. “The settings on leading mobile operating systems ensure that users opt in before beacons can be used and before users can be contacted.”

The six-page guide straightforwardly explains how beacons work and provides examples of current use cases in the market. It also clarifies and dispels common misunderstandings about beacons and consumer privacy.

Understanding Beacons explores the following questions:

For those unfamiliar with beacons, their capabilities and technical limitations, Understanding Beacons will provide a very useful overview and introduction. The document is free and available here.

Download: “Understanding Beacons: A Guide to Bluetooth Technologies”

Discussing the Merits of Device Encryption

In the wake of Apple and Google’s recent decision to implement “whole device encryption” on their latest mobile operating systems, the FBI has warned that the tech giants’ actions will force law enforcement to “go dark” when it comes to keeping tabs on criminals. FPF has previously explored the question of encryption and law enforcement access, and encourages efforts by tech companies to make their devices and services more secure.

In the wake of Snowden’s revelations about government surveillance last year, there has been a renewed conversation about whether communications technology is sufficiently secure. At minimum, encryption helps to protect users against unauthorized access to their personal information. The question now facing policymakers is whether improvements in technical security must be sacrificed to enable lawful government access.
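To make the protection concrete, here is a toy sketch (not part of the panel discussion, and emphatically not production-grade cryptography) of the idea behind encryption at rest: a strong key is derived from the user’s passcode, and without that passcode the stored bytes are unreadable. The key-derivation step (PBKDF2) is a real, standard technique; the XOR “cipher” here is purely illustrative, where a real device would use a vetted algorithm such as AES.

```python
import hashlib
import hmac

def derive_key(passcode: str, salt: bytes) -> bytes:
    # PBKDF2 with many iterations makes brute-forcing short passcodes expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 100_000)

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream built from HMAC-SHA256 blocks (illustrative only).
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]

def toy_encrypt(passcode: str, salt: bytes, data: bytes) -> bytes:
    # XOR with the keystream; running it twice with the same key decrypts.
    key = derive_key(passcode, salt)
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

salt = b"device-unique-salt"
ciphertext = toy_encrypt("1234", salt, b"contacts, photos, messages")

# The right passcode recovers the data; a wrong one yields garbage.
assert toy_encrypt("1234", salt, ciphertext) == b"contacts, photos, messages"
assert toy_encrypt("0000", salt, ciphertext) != b"contacts, photos, messages"
```

The point of the sketch is the policy-relevant property: there is no master key sitting with the manufacturer, so access turns entirely on knowledge of the passcode.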

Kicking off a conversation on the merits of device encryption, Chris Wolf wondered whether today’s debate was simply a repeat of the crypto wars of the 1990s, or whether a new security balance ought to be struck. Wolf discussed that and more with Georgetown Law’s Carrie Cordero, Amie Stepanovich from Access Now, and Cato’s Julian Sanchez, who stepped away from planning a full-day symposium on the larger issue of government surveillance.

A Renewed Conversation about “Going Dark”

Cordero noted that the concept of “going dark” is nothing new, but stressed that there were significant differences between how the debate was waged in the 1990s and how it is waged today. Whereas previously the FBI was concerned about its ability to engage in real-time surveillance, it now has very real concerns about its ability to lawfully obtain stored information. This has changed since Snowden, with tech companies aggressively implementing encryption and other protective technologies.

“Why are we talking about encryption now?” Stepanovich mused. “Computers have had default encryption on hard drives for many years without anyone raising an eyebrow, but now because it’s on a phone it’s different?” She argued that the current debate is inexorably tied to concerns about surveillance in the wake of the Snowden revelations. “The conversation we’re having isn’t because governments were going after bad actors, but because they were going after everybody. [We now know] how robust the efforts are to get access to your data when access can be gotten. If there is any vulnerable point . . . somebody is probably going to break in and get the data,” she stated. “[Encryption] comes from an abuse of gathering information.”

Wolf pushed back, asking whether such a decision ought to be made as a matter of public policy and not by device manufacturers. Stepanovich countered by suggesting one take a larger view: “These devices are sold around the world. If we start looking at the risk to the user worldwide, it becomes unacceptable . . . not to offer the most security they can offer.” Encryption should be viewed not as an unnecessary obstruction, but rather as an additional protection from unauthorized access to personal information.

However, Cordero cautioned against abandoning efforts to work on technical solutions to protect users against bad actors and allow compliance with law enforcement. She stressed that there remained a societal interest in preserving the capacity of law enforcement to serve lawful process to investigate crimes and national security threats. “What the government is talking about now is the ability to serve a court order,” she said.

What’s the Honest Impact?

Sanchez was skeptical of the government’s ability to calculate how encryption actually impacts law enforcement. “We’ve been ‘going dark’ for a long time according to the government,” he stated. He highlighted the many ways law enforcement can gain access to information without physically accessing a mobile device, and suggested that it was quite possible for an individual to be held in contempt of court and jailed for refusing to unlock an encrypted phone. While all conceded that Fifth Amendment protections against self-incrimination are murky at best when it comes to being compelled to unlock an encrypted device, Cordero cautioned that holding individuals in contempt was not a useful mechanism when time is of the essence. “Contempt proceedings aren’t going to be particularly satisfying for law enforcement,” she explained.

“We basically need magic,” Sanchez responded, critiquing the government’s position. He cautioned against treating tech companies like “magicians” and highlighted The Washington Post editorial board’s recent call for “golden keys” that would only work for law enforcement. Technical experts and security researchers largely agree that implementing any sort of hidden access feature also introduces exploitable vulnerabilities, he explained.

He also made the point that Apple’s “soup-to-nuts” business model, with its walled gardens and closed systems, is largely unique. “A general premise in computing is that someone will sell you a computer that comes pre-installed with things like Windows, and you could install other software like Linux,” he explained. “That’s an important value that’s given rise to a tremendous amount of innovation.” Comparing Apple’s mobile device business model to Android’s, which is largely open-source, Sanchez explained that the government’s position effectively wages a war on open computing. “It’s not possible to force people to keep a backdoor they don’t want, or any attempt would be extraordinarily destructive,” he explained.

Looking Forward on Device Encryption

Wolf asked each panelist to preview where the conversation would be a year from now. Sanchez flippantly suggested public discourse would continue to be filled with “hypotheticals cribbed from The Blacklist.” Stepanovich noted that this debate has been ongoing in some form for decades, and we will likely be in the exact same place a year from now. She argued the only positive change could come from revisiting the logic behind the Communications Assistance for Law Enforcement Act (CALEA). She suggested that privacy advocates were largely playing defense rather than offense. “We need to put a law on the books [that states] government cannot force companies to put in a backdoor that makes users less secure,” she stated.

Cordero offered a different perspective. “If law enforcement is serious about pursuing this issue, they’re going to have to make the case.” Noting that many of the FBI’s most recent anecdotal examples of “going dark” have been debunked, she suggested that law enforcement needs to develop a more comprehensive factual record. “In the 1990s, the FBI presented a range of statistics and data that demonstrated factually that there was a situation requiring legislation, as well as GAO reports and independent studies. We need additional facts.”

At its core, she continued, this debate is the same argument that was had over CALEA in 1994. “We made a judgment then [that forcing companies to comply with law enforcement] was a valid purpose,” she explained. If companies are no longer required to preserve that capability, it will become costly for government to adapt as technology rapidly evolves.

Sanchez disagreed with comparisons to CALEA. He explained that CALEA applied to a small number of telecoms with centralized hubs, and there is a huge difference between what CALEA accomplished and what is being proposed now. “What we’re talking about now is forcing an architecture used by hundreds of millions of consumers that would preclude devices from running arbitrary code,” he argued.

Stepanovich returned to Cordero’s point that device encryption could prove costly to law enforcement. She noted that “tech has trended the other way.” Instead, technology has largely decreased the cost of government surveillance (which FPF Senior Fellow Peter Swire has also explained as leading to a “golden age of surveillance”). “Things like encryption counter that dip in price by forcing law enforcement to invest in more targeted surveillance,” Stepanovich said, which should be encouraged.

A Big Policy Choice: To Kill Encryption or Not?

Encryption, Stepanovich concluded, “gives users the ability to control their own data and gives them an option.” Highlighting what has been called “the least trusted country problem,” she argued that the costs of encryption must also be weighed against the effects of surveillance in other countries, which lack the legal safeguards of the United States.

Tech companies are responding to market pressures to do more to secure information, and additional encryption options are the result. The panel largely agreed that law enforcement still has alternative ways of accessing most of the information being encrypted on a device. “Nobody wants perfect encryption,” Sanchez concluded. “We forget our complicated pass phrases, and then everything is irretrievably lost.”

More discussion on the matter is clearly needed. As Cordero explained, “Law enforcement and national security may continue to stress this issue.” However, she also acknowledged that the issue may well be “politically impossible” to address.

Barclays Launches Beacons to Help Disabled Customers

Barclays just launched a beacon technology system in a UK branch to help disabled customers with their accessibility needs. The service, which requires customers to download an app and opt in, notifies Barclays staff when a customer with disabilities enters the branch. This way, staff can provide quicker and more tailored service to customers with accessibility needs, and the customer does not have to explain his or her individual needs on every visit to the branch.

Customers with disabilities can choose to opt in to the service by downloading an app and registering their information. These customers can enter information about their accessibility needs and even upload a photo of themselves. Once the app senses a beacon, it sends a notification to a staff member in the branch, alerting Barclays staff that a customer with accessibility needs is entering the building. Barclays’ Director of Accessibility and Inclusion noted that beacons are an innovative way to address issues that people with disabilities face when entering bank branches.
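The opt-in flow described above can be sketched roughly as follows. All names, identifiers, and data fields here are invented for illustration; Barclays’ actual implementation has not been published.

```python
from typing import Optional

# Hypothetical mapping from beacon identifiers to branches.
BRANCH_BEACONS = {"uuid-sheffield-01": "Sheffield"}

# Profiles exist only for customers who explicitly opted in via the app.
REGISTERED_CUSTOMERS = {
    "customer-42": {"name": "A. Smith", "needs": "step-free access"},
}

def on_beacon_detected(beacon_id: str, customer_id: str) -> Optional[str]:
    """Return a staff notification, or None when no opt-in profile exists."""
    branch = BRANCH_BEACONS.get(beacon_id)
    profile = REGISTERED_CUSTOMERS.get(customer_id)
    if branch is None or profile is None:
        return None  # unknown beacon, or customer never opted in: stay silent
    return f"{branch}: {profile['name']} arriving, needs: {profile['needs']}"

# A registered customer triggers a notification; anyone else triggers nothing.
assert on_beacon_detected("uuid-sheffield-01", "customer-42") is not None
assert on_beacon_detected("uuid-sheffield-01", "customer-99") is None
```

The privacy-relevant design point is that the beacon itself broadcasts only an identifier; any lookup of personal information happens solely for customers who registered, and silence is the default.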

Right now, the system is being tested with Apple iOS in the Barclays Sheffield branch, but could expand to other branches and operating systems if successful.

This great opt-in use of beacon technology to provide quick, tailored accessibility services to customers with disabilities is just one example of the many ways beacons are being used to provide value to mobile device users in a privacy-friendly manner.

By Stephany Fan

Getting Privacy Policies Right the First Time

We have all seen too many well-meaning companies have to face up to what their privacy policy really said, rather than what they intended or even did.  Here are some tips to help prevent that, for new companies, or those in the process of updating their policies – before getting their 15 minutes of unwanted fame.  More tips available at FERPA|SHERPA (studentprivacycompass.org).

https://www.edsurge.com/n/2014-12-01-getting-privacy-policies-right-the-first-time

Device Encryption: Too Much Privacy for Consumers?


On December 3rd, FPF and IAPP will be hosting a conversation on device encryption in the wake of Apple and Google’s recent application of “whole device encryption” to their newest devices. What does this mean for consumers? What new protections are added? What impact does this have on hackers or others who may seek to access the data on a cell phone? What does it mean for law enforcement?

After an introduction by Jules Polonetsky, Christopher Wolf will moderate a discussion with:

(Additional panelists to be announced.)

Among the issues to be discussed: Whose role is it to decide whether and how to protect consumer devices, and conversely how much access to provide law enforcement through device manufacturers/service providers? What privacy risks exist with unencrypted devices? If law enforcement can demand passwords or device access from owners consistent with the Fourth and Fifth Amendments, is there a real handicap in investigatory powers?

This event is free and open to the public. Cocktails and food will be provided.  More information and RSVP is available here.

The Connected Car and Privacy: Navigating New Data Issues

The Connected Car and Privacy: Navigating New Data Issues is available to read here.

* * * * * *

Each model year brings cars that are getting smarter and more connected, offering new safety features and consumer conveniences. By the end of the decade, one in five vehicles on the road will be connected to the Internet. But for consumers to welcome these advances, they need to be sure their personal data will be handled in a trustworthy manner, as early research shows that considerable numbers of new car buyers are concerned about data privacy when it comes to car connectivity. To address those concerns, the Alliance of Automobile Manufacturers and the Association of Global Automakers have come together to put forward a set of privacy principles for vehicle technologies and services. These privacy principles set a responsible course for new uses of connected car data and should help avoid any privacy bumps in the road.

The principles cover a wide variety of vehicular data, and they directly address some of the chief privacy concerns raised by new in-car technologies. For example, they cover location information, driver biometrics, and other driver behavioral data, such as seatbelt use or frequency of hard braking, that can be gathered by a vehicle, and require opt-in consent from consumers before any of this sensitive information can be used for marketing purposes or otherwise shared with independent third parties. The principles also include a warrant requirement for geolocation information to be shared with law enforcement, absent exigent circumstances or certain statutory authorities. These are important protections, and essential to ensure consumer data is being handled in a trustworthy manner inside the connected car.

The Future of Privacy Forum’s new paper, The Connected Car and Privacy: Navigating New Data Issues, seeks to provide an overview of the various technologies currently available in cars and identifies the types of data collected and the purposes for which it is collected. While connectivity is the buzzword of the day, many of the recent privacy-related headlines about in-car technologies are, in fact, about data collection that is not novel. On-board diagnostic data have been generated by cars for decades, and recording accident-related information on Event Data Recorders (EDRs) has been going on for years.

Yet connectivity does promise new types of in-car data collection. New sensors and technologies do increase the ability of vehicles to harness location information and, in the future, will allow vehicles to collect more information about the car’s immediate surroundings and its driver’s behavior. Today, connected cars frequently provide consumers with more opportunities to take advantage of location-based services in their cars and real-time traffic-based navigation. Similarly, onboard sensors can already be used by vehicles to detect lane markings and immediate obstacles.

In the future, in-car technologies will increasingly gather information about drivers’ behavior and biometric data. For example, vehicles will be able to quickly identify their drivers, changing car settings to accommodate the driving profile of a teenage or elderly driver. Sensors in the steering wheel or driver’s seat will monitor stress levels and health conditions. Much of this information is used to drive vehicle safety improvements. Attention assist features evaluate a driver’s steering corrections along with other factors like crosswinds or road surface quality to predict driver fatigue. As they are developed, vehicle-to-vehicle and vehicle-to-infrastructure communications will also augment these features and will depend on responsible privacy standards.

We hope The Connected Car and Privacy provides an introduction to the key technologies used in connected cars and sets out a useful overview of the relevant data flows. We look forward to working with the Alliance of Automobile Manufacturers and the Association of Global Automakers, as well as other stakeholders who deal with these issues, to continue this important conversation.

Amend the U.S. Privacy Act to Provide Further Privacy Protections to European and Other Non-US Persons

I had the pleasure of participating recently at a Georgetown Law Center conference called “Privacy Act @40.”  My panel was on “Looking Ahead,” and my comments focused on new ways that the United States is (and can) extend appropriate privacy rights to citizens of other countries.

Today, just a couple of weeks later, Google has announced that it now favors new legislation in the U.S. to extend the Privacy Act to non-U.S. persons.  This announcement is a valuable step.  Amending the Privacy Act in this way is the right thing to do and would address a longstanding bone of contention in U.S.-E.U. privacy discussions.

The Privacy Act of 1974

As background, the Privacy Act of 1974 is the principal statute that governs how federal agencies handle personal information.  It applies its protections to “U.S. persons” – U.S. citizens and permanent residents in the U.S.  The Privacy Act provides a range of rights to individuals, such as the right to access and amend their records, along with a range of other “fair information practices,” such as requirements for transparency, data minimization, accuracy, and data security.

Under the E.U. Data Protection Directive, privacy rights apply in the same way to each individual, regardless of nationality.  I have long worked with European privacy officials, such as in researching a 1998 book on U.S.-E.U. privacy issues and in helping negotiate the Safe Harbor that now governs data exports from the E.U. to our country.  Over and over again, Europeans have said something along these lines: “We provide full privacy rights to U.S. citizens, whenever your data is collected or processed in Europe.  Why won’t the U.S. government treat our citizens the same as yours?”

This European concern about lack of protection for their citizens has become more acute after the Snowden leaks and in the course of serious consideration in the E.U. for updating their comprehensive privacy laws.  Senior E.U. officials this fall have discussed suspending the Safe Harbor agreement, which could cause major interruptions in cross-Atlantic data flows.

The importance of amending the Privacy Act came up during my work on President Obama’s Review Group on Intelligence and Communications Technology.  Our recommendation 14 said: “We recommend that, in the absence of a specific and compelling showing, the US Government should follow the model of the Department of Homeland Security (DHS), and apply the Privacy Act of 1974 in the same way to both US persons and non-US persons.”  The recommendation mentioned how DHS in 2009 issued a Privacy Policy Guidance Memorandum that applies to “mixed systems” of records – systems that collect or use information in an identifiable form and that contain information about both US and non-US persons.  It states: “As a matter of DHS policy, any personally identifiable information (PII) that is collected, used, maintained, and/or disseminated in connection with a mixed system by DHS shall be treated as a System of Records subject to the Privacy Act regardless of whether the information pertains to a US citizen, legal permanent resident, visitor, or alien.”

The Obama administration has not made any official announcement about Recommendation 14, although I believe it remains under consideration.  That Recommendation shows tangible steps that federal agencies can take under current law, following the practice at DHS.  Notably, however, agencies do not have the power to create a private right of action under the Privacy Act.  For that, Congress would need to amend the statute.  Attorney General Holder spoke in favor of such an amendment earlier this year, and Google has now supported that as well.  Based on my experience on this issue with European privacy leaders, including a private right of action would be important to putting this issue to rest.

Presidential Policy Directive 28

Meanwhile, President Obama in January announced what is quite possibly the largest extension in history of privacy protections to non-US persons.  It is worth considering Presidential Policy Directive 28 in some detail, because of the precedent it sets for treating US and non-US persons similarly.

This Directive to federal agencies states: “Privacy and civil liberties shall be integral considerations in the planning of U.S. signals intelligence activities. The United States shall not collect signals intelligence for the purpose of suppressing or burdening criticism or dissent, or for disadvantaging persons based on their ethnicity, race, gender, sexual orientation, or religion. Signals intelligence shall be collected exclusively where there is a foreign intelligence or counterintelligence purpose to support national and departmental missions and not for any other purposes.”

Notably for equal treatment of non-US persons, PPD-28 states: “Departments and agencies shall apply the term ‘personal information’ in a manner that is consistent for U.S. persons and non-U.S. persons.” PPD-28 goes on to provide that dissemination, retention, and minimization rules should be consistent for US and non-US persons.  There is the possibility of exceptions for national security purposes, but a fair reading of PPD-28 is that it creates a major change in signals intelligence practices.  The rigor of its requirements was reinforced in the Interim Progress Report on Implementing PPD-28, released in October 2014.  Implementation is due quickly, under a January 2015 deadline.

In conclusion, PPD-28 and the DHS Privacy Policy Guidance Memorandum show important progress toward addressing concerns that the United States does not apply privacy protections to non-US persons.  The Privacy Act has its weaknesses, as Bob Gellman has recently explained in detail.  But that is no reason to exclude Europeans and other non-US persons from the protections that it does supply.


Peter Swire is the Huang Professor of Law and Ethics at the Georgia Tech Scheller College of Business. While at the U.S. Office of Management and Budget in 1999-2001, Swire was responsible for coordinating agency compliance with the Privacy Act of 1974.

Public Perceptions on Privacy

Today’s new report by the Pew Research Center gives the lie to the notion that privacy is unimportant to the average American. Instead, the big takeaway is that individuals feel they lack any control over their personal information. These feelings are directed at the public and private sector alike, and suggest a profound trust gap is emerging in the age of big data.

While Pew has framed its report as a survey of Americans’ attitudes post-Snowden, the report presents a number of alarming statistics of which businesses ought to take note. Advertisers take the brunt of criticism, and the entire report broadly suggests that public concerns about data brokers and the opacity of data collection are only growing. Seventy-one percent of respondents say that advertisers can be trusted only some of the time, and 16% say they never can. These numbers hold across every demographic group and, indeed, worsen among lower-income households. Eighty percent of social network users are concerned about their information being shared with unknown third parties. Even as Americans are concerned about government access to personal information, they increasingly support more regulation of advertisers. This support is strong across an array of demographic groups.

Further, even as consumers remain willing to trade personal information in return for access to free Internet services, two-thirds of consumers disapprove of the suggestion that online services work due to increased access to personal information. More problematic, however, is that 91% of Americans now believe that “consumers have lost control over how personal information is collected and used by companies.” Though this Pew study does not show that privacy values are trumping digital services — and every indication suggests that they are not — it is a likely topic for Pew to return to in the future. It would be interesting to see whether this anxiety translates into action.

However, in the meantime, anxiety about privacy suggests an opportunity for companies to win with consumers simply by providing them with more control. Fully 61% “would like to do more” to protect their online privacy. We have repeatedly called for efforts to “featurize” data and have supported efforts to help consumers engage with their personal information. Many companies already provide meaningful controls on the collection and use of personal information, but the challenge is both making consumers aware of these options and ensuring that taking advantage of these dashboards and toggles is as fun as using a simple app.

So we need more tools to make privacy fun. And industry may also need to do a better job of staying attuned to consumer preferences. Pew reiterates how context-dependent privacy is: the value of privacy, and consumer interest in protecting it, can vary widely from person to person, across different contexts and transactions, and perhaps most pointedly, in response to current events. “[U]sers bounce back and forth between different levels of disclosure depending on the context,” the report argues.

The challenge is ensuring that context is understood similarly by all parties. Part of this is understanding where and when personal information is sensitive. This is a debate that was highlighted at the FTC’s recent big data workshop, and is a theme that increasingly arises in conversations about big data and civil rights. Aside from Social Security numbers, which 95% of respondents considered to be sensitive information, data ranging from health information and phone and email message content to location information and birth date could be viewed as sensitive depending upon the context.

Depending upon context, everything is sensitive or nothing is sensitive. Obviously, this can be a tricky balancing act for consumers to manage. Information management requires users to juggle different online personas, platforms, and audiences. Thus, the door is open for companies either to take certain information off the table or to make a better case why some sensitive information is invaluable for certain services.

While Pew has not shown whether these privacy anxieties trump other pressing economic or social concerns, the report also suggests that Americans’ perceptions of privacy are heavily intertwined with their understanding of security. Privacy may be amorphous, but security is less so, and being proactive on the one can often be a boon to the other. Positive and proactive public actions on privacy are essential if we are to reverse Americans’ doubts about whether they can trust sharing their personal information.

-Joseph Jerome, Policy Counsel

Debating the FBI on Phone Encryption

FBI Director James Comey has heated up the encryption debate with his recent appearances on 60 Minutes and at the Brookings Institution.  Comey has sharply criticized Apple and Google for the companies’ announcements that they would enable strong encryption on their phones.  In contrast to prior practice, the companies would no longer keep a key to gain access to the encrypted content.  I applaud the companies’ announcements, which among other virtues will strengthen cybersecurity.

On November 17th, I have been invited to debate this issue at the New America Foundation, from 4:00 to 5:30 p.m., with webcast planned.  Nancy Libin, formerly of both the Center for Democracy and Technology and the U.S. Department of Justice, will be the moderator.  The opposing perspective will be offered by Andrew Weissmann, who until 2013 was General Counsel at the FBI.  I believe this will be the highest-profile live debate on the issue since Comey began his statements.

To remind us of the issues at stake, this post highlights four items I have previously worked on about encryption and global communications policy, the first three of which were supported by the Future of Privacy Forum’s project on government access to data in 2011-2013.

First, and perhaps most readably, is “Going Dark vs. the Golden Age of Surveillance” (2011). This piece challenges the FBI’s claims that it is “going dark” due to encryption and other changes in communications technology.  Instead, Kenesa Ahmad and I argue that a better image would be that we are in “a golden age of surveillance.”  Compared with earlier periods, surveillance capabilities have greatly expanded.  Government agencies have unprecedented access to location information now that we all carry cellphones.  Information about contacts, confederates, and conspirators has massively expanded, as all of our texts, emails, and social network postings are saved by communications carriers.  In addition, there are myriad new databases that create digital dossiers about our lives.  In short, if government agencies were offered the choice of current capabilities or pre-Internet capabilities, they would overwhelmingly prefer their surveillance abilities today.  This piece was written before the Snowden leaks, so the idea of law enforcement and intelligence agencies “going dark” is even less plausible today.

Second, this going dark discussion was part of a larger research project on “Encryption and Globalization” (2012).  This lengthy article provides background on a variety of encryption issues, including developments in India and China.  One claim of the article is that strong encryption is even more vital in today’s globalized world than during the crypto wars of the 1990s.  A second claim concerns what we call “the least trusted country problem.”  If there are backdoors or limits on effective encryption, then the security of global communications is only as strong as the security in the least trusted country.  Other countries will demand the same backdoors available to the U.S. government.  When the FBI or other agencies argue for weak security, we should consider the effects of surveillance by these other countries, many of which lack the legal safeguards found in the United States.

Third is my 2012 article “From Real-Time Intercepts to Stored Records: Why Encryption Drives the Government to Seek Access to the Cloud.”  This paper, as the title suggests, explains how changing technology is pushing government agencies to go to cloud providers for law enforcement and intelligence purposes.  Relevant to the Comey debate, and as explained in detail by Chris Soghoian, cloud providers hold an enormous wealth of potential evidence.  Even if police have difficulty getting into a smartphone, the relevant evidence very often is available from the cloud provider.

Fourth is the discussion of encryption in the report of President Obama’s Review Group on Intelligence and Communications Technologies, of which I was one of five members.  The report in general, and our Recommendation 29 in particular, emphasizes why U.S. government policy should strongly encourage the use of effective encryption.

Andrew Weissmann, in addition to his role at the FBI, is an experienced litigator and former chief of the Enron Task Force.  He is a Senior Fellow at the NYU School of Law and its Center for Law and Security.  I look forward to a vigorous debate.

Peter Swire is Senior Fellow at the Future of Privacy Forum and the Huang Professor of Law and Ethics at Georgia Tech Scheller College of Business.