Domestic Drones Should Embrace Privacy by Design

On Wednesday, the FAA held an online forum to seek input from members of the public on the agency’s development of a privacy policy for unmanned aircraft systems, or civilian drones. For two hours, privacy advocates, engineers, and representatives of the unmanned aircraft industry went around in circles debating whether drones even present novel privacy questions, and whether the FAA is the appropriate government agency to conduct such a conversation. If the unmanned aircraft industry wishes to encourage the widespread societal embrace of this technology, suggesting that drones do not present privacy challenges, and moreover arguing that our current legal and policy framework can adequately address any concerns, is counterproductive.

Drones Are Different

As the Associated Press reported last week, public fear that unmanned aircraft technology will be misused threatens the health of the entire unmanned aircraft industry. Robert Fitzgerald, CEO of The BOSH Group, which provides drone support services, was quoted as saying that the industry’s “lack of success in educating the public about unmanned aircraft is coming back to bite us.”

While it may be true as a technical matter that unmanned aerial surveillance is no different from a manned overhead flight, the privacy implications are worlds apart. As a practical consideration, unmanned aircraft are orders of magnitude cheaper and more accessible than their manned counterparts. The ACLU’s Jay Stanley has suggested that unmanned aircraft erase the “natural limits” of aerial surveillance, and as drones become both smaller and more technically advanced, they will pose ever greater challenges to individual privacy.

But what truly makes unmanned aircraft unique is that they provide a physical manifestation of our generally abstract, mental conceptions of privacy. Professor Ryan Calo surmises that drone surveillance is “visible and highly salient” in a way that people experience quite differently from network surveillance or commercial data brokerage. “People would feel observed, regardless of how or whether the information was actually used,” he explains.

Privacy Approaches to Unmanned Aircraft Systems

The Association for Unmanned Vehicle Systems International (AUVSI) has put forward a broad privacy statement that endorses efforts to ensure unmanned aircraft are used in an accountable and transparent fashion. So far, so good. However, the statement also calls for technology-neutral policies. In other words, data collected from unmanned aircraft would be treated no differently from information gathered by manned aircraft, or by mobile phones. Additionally, while AUVSI has embraced limits on information collection, storage, use, and sharing, it recommends enforcement via “established law and policy.” This might not be such a problem if the United States had more comprehensive privacy protections in place, but as Professor Calo and others have pointed out, few privacy laws actually limit surveillance by either private or public parties.

Given this reality, it is problematic for organizations like AUVSI to suggest, as it did on Wednesday, that the solution is to trust the judicial system to sort out any privacy issues that may arise. Relying on the traditional privacy torts or on the Department of Justice to somehow police privacy intrusions by private companies is not only inefficient, but it does nothing to address the public’s broader concerns about unmanned aircraft. AUVSI claims to want a broad, society-wide discussion about privacy, but it fails to recognize that its own technology may well be the catalyst that forces us to revisit our privacy laws.

Alleviating these fears should be the industry’s top priority if it wishes to see the projected economic boom from unmanned aircraft come to fruition. It may make sense to redirect this conversation to an agency with more substantive privacy expertise, but that would only further delay a policy discussion that already lags behind the technology. As unmanned aircraft technology advances, it faces a patchwork of different laws and regulations across the country. A legislative fix by Congress is unlikely, and moreover, Congress has specifically mandated that the FAA work to safely integrate drones into our national airspace.

Given the slim likelihood of legislative action, stakeholders are more or less stuck with the FAA. Thus, it is essential that the FAA work to develop guidelines that encourage public trust and confidence. The industry’s current approach is unlikely to accomplish this, so how can we best ensure that unmanned aircraft technology develops in a way that protects privacy? One strategy is to couple aircraft safety with privacy protections: a number of mechanisms put forward by privacy advocates, such as metadata transmissions or “drone license plates,” would promote safety as well. Another strategy is to develop policies informed by the Fair Information Practice Principles (FIPPs), and for its part, this is the approach the FAA has suggested so far.
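To make the safety-plus-privacy coupling concrete, here is a minimal sketch of what a “drone license plate” broadcast might contain. Every field name and value below is a hypothetical assumption for illustration, not an actual or proposed FAA standard:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical sketch of a "drone license plate" metadata beacon.
# All field names are illustrative assumptions, not any actual or
# proposed standard.
@dataclass
class DroneBeacon:
    operator_id: str       # registered operator, akin to a tail number
    purpose: str           # declared mission
    sensors: list          # onboard collection capabilities
    retention_days: int    # declared data-retention limit

    def to_broadcast(self) -> str:
        """Serialize the beacon for periodic broadcast alongside safety telemetry."""
        return json.dumps(asdict(self), sort_keys=True)

beacon = DroneBeacon("OP-1234", "infrastructure-survey", ["camera-4k"], 30)
print(beacon.to_broadcast())
```

A beacon like this would let observers on the ground identify an operator and its declared data practices, much as a tail number identifies a manned aircraft, which is why advocates argue the same mechanism serves both safety and privacy.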

Incentives to Embrace Privacy by Design

Data minimization, security, transparency, and accountability are all important principles to respect, and one way of operationalizing these principles in the context of unmanned aircraft is to embrace the concept of Privacy by Design. Developed by Dr. Ann Cavoukian, the Information and Privacy Commissioner of Ontario, Canada, Privacy by Design encourages organizations to build privacy in early, robustly, and systematically, across products and business ecosystems.

According to the Federal Trade Commission, Privacy by Design requires entities to “promote consumer privacy throughout their organizations and at every stage in the development of their products and services.” Applying this notion to the field of robotics, researcher Aneta Podsiadła has suggested that privacy protections can be operationalized through a combination of technical solutions during product development and “embedding privacy” into an organization’s operation. Unmanned aircraft manufacturers and operators do not appear to be seriously thinking about privacy from either perspective, however.

Ironically, the vocal public concern about drones actually counteracts one of the biggest challenges to implementing Privacy by Design: often, the economic incentives to protect privacy are simply inadequate. Privacy scholar Ira Rubinstein explains that inadequate incentives, combined with inexact guidance from regulators on how to implement Privacy by Design, make investing in privacy safeguards costly for firms. In the case of unmanned aerial surveillance, however, public demand for privacy safeguards is salient, and indeed an economic opportunity. Already, firms are developing surveillance “countermeasures” for sale to the general public.

This provides an opening to make the FAA’s privacy proposals a model for future privacy policies and for operationalizing Privacy by Design. Both regulators and industry need to begin elaborating design principles, discussing best practices, and researching how privacy can be engineered into unmanned aerial systems. Absent an ongoing dialog, we are committing ourselves to privacy protections in the skies above that are more aspiration than reality. All parties have every incentive to consider these issues: the drone industry anticipates adding 70,000 high-tech jobs and $14 billion to the economy by mid-decade. If we hope to see those figures come to fruition, everyone should be working with the FAA to encourage innovation and experimentation with privacy-protecting technologies.

How Obscurity Could Help the Right To Fail

In a post on Policy@Intel, David Hoffman explains why Internet obscurity can help the “Right to Fail.”  Absent providing individuals with “a sphere of privacy where they know they can make mistakes,” society may make it impossible for individuals to pursue ideas that “challenge the status quo” and are needed “to break away from conformity and innovate.”

He also highlights Woodrow Hartzog and Evan Selinger’s suggestion that obscurity might actually be better than privacy when looking at conceptual tools to protect personal information.

 

Increasing Calls for a Big Data Dialog

Big Data promises to open new doors to curing diseases, cleaning the environment, and easing life’s burdens, but is it opening too many doors? Writing for The New York Times on Sunday, Steve Lohr suggested that the privacy challenges posed by Big Data are so large that they might trump any potential benefits. The surveillance possibilities permitted by data today, he noted, “could leave George Orwell in the dust.”

Whether privacy challenges should trump how we use data or vice versa, there is an obvious need for organizations and society at large to address what we hope to achieve with Big Data, and what we are willing to take off the table. In that spirit, FPF is joining with the Stanford Center for Internet and Society to host a day-long event this fall to tackle how best to bring together the value of data with the value of personal privacy. There should be some room for agreement. As Lohr notes, “corporate executives and privacy experts agree that the best way forward combines new rules and technology tools.”

Yet Big Data may require us to have a bigger discussion about how personal data is used.  In response to research posted on Monday that concluded that anonymized data in the context of day-to-day location tracking can be re-identified with relative ease, David Mayer declares on GigaOM today that we need a “new realpolitik” for data privacy:

We are not going to stop all this data collection, so we need to develop workable guidelines for protecting people. Those developing data-centric products also have to start thinking responsibly – and so do the privacy brigade. Neither camp will entirely get its way: there will be greater regulation of data privacy, one way or another, but the masses will also not be rising up against the data barons anytime soon.

As Mayer concedes, it is impossible to stop our ever-increasing data collection capabilities, and even if we could, doing so would likely be to our greater detriment. While our legal constructs tend to view privacy as a binary, all-or-nothing concept, our Big Data reality suggests that privacy be viewed as a spectrum in which benefits are weighed against the specter of the dictatorship of data. Who is doing what, for what purpose, and for what benefit are important considerations, and it is past time for policy makers to begin engaging with these questions in earnest.
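The ease of re-identification that prompted this discussion can be illustrated with a toy simulation. The parameters below (number of users, towers, and trace points) are invented for illustration and are not the cited study’s data or method; the point is only that a handful of coarse location points quickly narrows an “anonymous” trace to one person:

```python
import random

random.seed(0)

# Toy parameters (assumptions for illustration only).
N_USERS, N_TOWERS, N_POINTS = 1000, 50, 40

# Each user's "anonymized" trace: a set of coarse (tower, hour) points.
traces = {
    u: {(random.randrange(N_TOWERS), random.randrange(24))
        for _ in range(N_POINTS)}
    for u in range(N_USERS)
}

def candidates(observed):
    """Users whose trace contains every observed (tower, hour) point."""
    return [u for u, trace in traces.items() if observed <= trace]

# Observe a few points from user 0's trace and count who else matches.
target = traces[0]
for k in (1, 2, 4):
    observed = set(random.sample(sorted(target), k))
    print(f"{k} observed points -> {len(candidates(observed))} candidate users")
```

Even with no names attached, each additional observed point shrinks the candidate set multiplicatively, which is why coarse “anonymized” location traces re-identify with such ease.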

Big Data and Privacy: Making Ends Meet Conference

Big Data and Privacy: Making Ends Meet

Solutions to many pressing economic and societal challenges may be found in better understanding data, from safer cities to cleaner air, but as the amount and variety of data collection continues to increase, our data-driven society also poses serious concerns about infringements on privacy.  The need for a way forward is evident, and both corporate executives and privacy experts see a solution in a mixture of new rules and new technological tools.

In the spirit of bringing together Big Data and Privacy, the Future of Privacy Forum (FPF) and the Stanford Center for Internet and Society (CIS) organized a day-long workshop on September 10th, 2013 at:

Microsoft Innovation and Policy Center

901 K Street NW

Washington, DC 20001

Rayid Ghani, co-founder of Edgeflip, an analytics startup building social media analytics products for non-profits and social good organizations, and the former Chief Scientist at the Obama for America 2012 campaign, provided a keynote address.

Jennifer Stoddart, Privacy Commissioner of Canada, provided brief closing remarks.

Tentative Schedule:

This event was preceded by a call for papers, which were selected into four topic areas.

9:00-9:05                           Quick Introduction

9:05-10:15 AM                  Framing Big Data and Privacy

General discussion of policy frameworks and thinking about Big Data.

 


10:15-10:30                      Coffee break

10:30-11:45 AM             Social Ramifications of Big Data

Discussion about how society and social order is being changed by ubiquitous data collection and use.

 

11:45-1:00 PM                   Lunch + Keynote

 

1:00-2:15 PM                     Government Use of Big Data

Discussion about government use of data—what limitations should be put in place?  How can government use data better?

 

2:15-2:30                            Refreshment break

2:30-3:45 PM                   New Legal Regimes for Data Governance

Discussion about how law/regulations can be adapted to what’s “new” about Big Data. 

  • Dennis Hirsch, Professor of Law, Capital University Law School (discussing The Glass House Effect)
  • Justin Brookman, Director, Consumer Privacy, Center for Democracy & Technology (discussing Why Collection Matters with G.S. Hans)
  • Michael Donohue, Senior Policy Analyst, Organisation for Economic Co-operation and Development
  • Christian Fjeld, Senior Counsel, Senate Commerce, Science and Transportation Committee
  • Felix Wu, Associate Professor, Benjamin N. Cardozo School of Law (discussing Big Data Threats)
  • Christopher Wolf, Co-Chair, Future of Privacy Forum (Moderator)

 


3:45-5:00 PM                      Technological Solutions – and Challenges

Discussion of Big Data’s challenges from a technical perspective.  How can technology and PETs be used to alleviate concerns about Big Data’s risks? And maximize its benefits?

 


5:00 PM                   Closing Remarks

 

5:15 PM                   Reception at Microsoft Innovation & Policy Center

Generous sponsorship provided by the following:

Intel

TPP

Nielsen

For sponsorship opportunities, please contact Barbara Kelly at [email protected].

 

Swimming in the Big Data Ocean

Even as it promises breakthroughs in healthcare, the environment, and how individuals understand the world, Big Data may also be a powerful tool in the national security space. On Wednesday, the Journal of National Security Law & Policy, along with the Georgetown Center on National Security and the Law, launched their first symposium by addressing the fundamentals of Big Data and looking at how to establish a policy framework for its use.  While video promises to be available soon, I thought summarizing how the event touched on general privacy concerns might be helpful.

Recognizing the tension between a conservative legal profession seeking guidelines and agile technological development, the day began with a panel free of lawyers and full of technologists to address what Big Data is. Palantir’s Matthew Gordon suggested that Big Data is actually a misnomer. Instead, “big data” is simply making insights accessible for the benefit of users rather than purely for the benefit of isolated, discrete databases. Echoing the strategy of Privacy by Design, Gordon recommended organizations “bake in privacy from the start” when working with data.

Comparing today’s Big Data to the revolution caused by the Ford Model T, Johns Hopkins’ Sean Fahey declared that Big Data was important because it had “democratized” data. Today, thanks to cheap storage, excess computational power, and open source initiatives, enterprises of any scale can work with large amounts of data, eliminating the specialized equipment and significant capital investment that were once required to do serious data analytics. This has encouraged the collection of more and more data. Providing an observation that many would return to during the symposium, Fahey noted that today’s technological environment has actually made it costlier to decide what data to throw away than simply to collect as much as possible.

While security guidelines are often discussed, particularly in the realm of national security, the challenge is that no one has addressed what sort of privacy protections we intend to implement. “We cannot ask a computer to answer whether some analytics violates privacy,” said Professor Daniel Weitzner, an FPF advisory board member, arguing that the problem of Big Data is determining how to inject “human judgment” into privacy questions. Unfortunately for the law–and for the lawyers advising clients about the law–this problem poses difficult ethical questions.

Graphic by Robert O’Harrow of The Washington Post.

So what sort of legal and policy framework should be established to protect the privacy issues raised by Big Data collection, storage, and analysis? The challenges standing in the way are significant.

Professor Paul Ohm suggested that the “democratization” of data has produced a fundamental shift from public data collection to private data collection. At the same time, Big Data has blurred the distinction between public and private. The Federal Trade Commission finds itself in the position of trying to police changes to industry privacy policies, while the national security space puts the federal government in the position of having to change its own privacy policies. Professor Laura Donohue wondered whether our federated system, divided between private industry and government, still helps protect privacy today. Pointing to cybersecurity challenges, she cautioned that walls between public and private data could actually harm individual privacy when breaches occur.

Jennifer Granick from Stanford’s Center for Internet and Society argued against any notion that Big Data had somehow created a level playing field for individuals. “There’s no parity for individuals accused of crimes,” she noted, and the benefits of all this information are weighted toward private companies and government rather than the individuals producing the new streams of data. In September, FPF will partner with the Center for Internet and Society to address this question and how to bring Big Data and privacy together at an event. More information is available here.

After considerable discussion about the national security implications of Big Data, the symposium concluded with a recognition that modern privacy law is about establishing a degree of control vis-a-vis third parties, as Marc Rotenberg of the Electronic Privacy Information Center put it. How that can best be done in a Big Data world is the big question. Citing Microsoft’s “Scroogled” campaign, Elisebeth Cook, a member of the Privacy and Civil Liberties Oversight Board and an attorney at WilmerHale, proposed market-based solutions to privacy infractions. Pointing to California’s stronger privacy rules, a number of privacy advocates suggested forum shopping could produce privacy gains. Moreover, in the face of the European Union’s proposed Data Protection Regulation, there was a recognition that privacy policies may effectively be offshored.

Of course, while it is often taken as a given that the EU is working to strengthen privacy protections, Prof. Weitzner noted that the EU has been less effective at enforcing its rules. Furthermore, while many Big Data players have their eyes on Europe, as privacy gets offshored, foreign legal systems may actually end up producing weaker privacy protections for individuals over the long term. “Perhaps an international solution is the answer?” someone in the audience offered. Either way, the ocean of Big Data continues to rise.

Mobile and the Connected Car

At this week’s DC Mobile Monday event, the potential of the connected car prompted enthusiastic discussion. As our cars get smarter, they promise not only seamless infotainment and geolocational tracking features but also the capacity to communicate with other connected cars and with our mechanics, improving safety and saving drivers time and money. While these are considered luxury features today, the sentiment in the room was that most of these technologies will soon be commonplace in all vehicles, aided and augmented by our current smartphones and tablets.

Telcos seem to agree, increasingly identifying cars as lucrative growth opportunities. “[Cars are] basically smartphones on wheels,” AT&T’s Glenn Lurie explains, and indeed, many automakers see smartphones as an integral part of creating connected cars. One topic missing from the discussion, however, was how the synthesis of mobile technology and our vehicles impacts individual privacy. We are still balancing the privacy challenges and data opportunities presented by smartphones, and we have only just begun to address the similar concerns presented by connected cars.

The consensus at the DC Mobile Monday event was that privacy is a secondary concern, taking a backseat to more practical hurdles like keeping drivers’ eyes on the road and legal liability issues. Indeed, panelists were quick (perhaps too quick) to suggest that privacy concerns are merely a generational problem, and that younger drivers simply do “not think deeply about privacy.” Moreover, panelists were optimistic that most consumers would willingly trade their privacy once they understood the benefits connected cars can bring, such as lower insurance costs and a safer driving experience.

In the long term, the benefits of connected cars are real and drivers may eagerly trade their privacy in exchange, but these benefits need to be clearly communicated. That is a discussion that has yet to be had in full.

Do Not Track May Be Back on Track

In November 2012, the World Wide Web Consortium (W3C) named FPF Senior Fellow Peter Swire as co-chair of the W3C’s Tracking Protection Working Group, a multistakeholder group working to create a meaningful Do-Not-Track (DNT) mechanism.  At the time that Peter was appointed, the DNT negotiations were described by the New York Times as having “devolved into acrimonious discussions, name calling, and witch hunts.”  Industry representatives were accusing privacy advocates of attempting to undermine the online advertising ecosystem, and advocates were accusing industry of acting in bad faith.

This week, Peter wrote a blog post reporting that the working group has developed a roadmap that may lead to the public release of a DNT proposal by summertime.  Stakeholders have agreed on four criteria for a DNT standard: 1) it must be created through the W3C; 2) it must be consistent with the working group’s charter (which includes expectations for technical specifications and compliance mechanisms); 3) the standard should be a significant change from the current DNT situation; and 4) the working group must be able to explain why a user’s choice to activate DNT would reduce tracking.

Click here for Swire’s post.

 

Big Data and Privacy: Same Old Concerns or Something New?

Even as Big Data is used to chart flu outbreaks and improve winter weather forecasts, it continues to generate important policy debates. Watching businesses and advocates argue over the use of “data” to measure human behavior in order to cut through both political ideology and personal intuition, David Brooks declares in The New York Times that the “rising philosophy of the day . . . is data-ism.” Writing for GigaOM, Derrick Harris responds that Brooks’s concerns over data-worship are “really just statistics, the stuff academicians and businesspeople have been doing for years.”

The basic collection of data is nothing new. However, Harris makes the point that there is a considerable difference between “just plain data” and the rise of Big Data. This raises the question of whether the privacy concerns swirling around Big Data differ from the privacy issues we have long faced in the collection of personally identifiable information in substance, or merely in scale. In other words, what technological changes presented by Big Data raise novel privacy concerns?

The substance of Big Data is its scale. As our ability to collect and store vast quantities of information has increased, so too has our capacity to process this data to discover breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Privacy advocates argue that it is the scale of data collection that can potentially threaten individual privacy in new ways. According to Jay Stanley, Senior Policy Analyst at the ACLU, Big Data amplifies “information asymmetries of big companies over other economic actors and allows for people to be manipulated.” Data mining allows entities to infer new facts about a person based upon diverse data sets, threatening individuals with discriminatory profiling and a general loss of control over their everyday lives. Noting that credit card limits and auto insurance rates can easily be crafted on the basis of aggregated data, tech analyst and author Alistair Croll cautions that individual personalization is just “another word for discrimination.” Advocates worry that over time, Big Data will have a chilling effect on individual behavior.

With its proposed new General Data Protection Regulation, European policymakers propose to advance privacy by limiting uses of Big Data when individuals are analyzed. The regulation’s most recent draft, prepared by Jan Philipp Albrecht, Rapporteur for the LIBE Committee, restricts individual profiling, which is defined as “any form of automated processing of personal data intended to evaluate certain personal aspects relating to a natural person or to analyse or predict in particular that natural person’s performance at work, economic situation, location, health, personal preferences, reliability or behaviour.” This sort of limit on “automated processing” would effectively make verboten much of the data analysis that scientists and technologists see as the future of Big Data.

The fundamental problem is that neither individuals nor business, nor government for that matter, has developed a comprehensive understanding of Big Data. As a result, no one has actually balanced the costs and benefits of this new world of data. Individuals are still largely uninformed about how much data is actually being collected about them. They do not read or understand lengthy privacy policies, but worry that their information is being used against them rather than on their behalf. Meanwhile, business is struggling to balance new economic opportunities against the “creepy factor,” or concerns that data is somehow being misused. Sometimes consumers adjust to a new stream of data (Facebook’s News Feed), and other times they simply do not (Google Buzz).

Kord Davis, a digital strategist and co-author of The Ethics of Big Data, notes that there is no common vocabulary or framework for the ethical use of Big Data. As a result, individuals and business, along with advocates and government, are speaking past one another.  The result is a regime where entities collect data first and ask questions later. Thus, when Big Data opportunities and privacy concerns collide, important decisions are made ad hoc.

The Future of Privacy Forum’s Omer Tene and Jules Polonetsky have previously called for the need to develop a model where Big Data’s benefits, for businesses and research, are balanced against individual privacy rights. To continue to advance scholarship in this area, FPF and the Stanford Center for Internet and Society invite authors to submit papers discussing the legal, technological, social, and policy implications of Big Data. Selected papers will be published in a special issue of the Stanford Law Review Online and presented at an FPF/CIS workshop, which will take place in Washington, DC, on September 10, 2013.  More information is available here.

FPF Breakfast & Conversation with Dr. Ann Cavoukian

At a time when policy makers around the world are supporting the concept of Privacy by Design, how can companies operationalize this key standard?

Future of Privacy Forum invites you to join us for breakfast and conversation with Dr. Ann Cavoukian, Information and Privacy Commissioner for Ontario, and leading privacy experts.

Agenda

8:30-9:00 a.m.     Breakfast

9:00-9:45 a.m.     Introduction by Karen Zacharia, Verizon

Remarks by Commissioner Cavoukian

9:45-10:30 a.m.   Panel Discussion followed by Q&A

Participation is free of charge, but please register here as space is limited.

Government employees attending the event should ensure that applicable government ethics rules permit them to accept these refreshments.  If you need to make arrangements to pay the cost of the refreshments, please contact: [email protected]

The momentum behind Privacy by Design (PbD), the international standard for privacy and data protection, has grown exponentially over the past two years. Dr. Ann Cavoukian has partnered with leading organizations across diverse sectors, including telecommunications, healthcare, transportation, mobile, and energy, to operationalize Privacy by Design and make it real. PbD’s seven Foundational Principles have now been widely implemented, clearly demonstrating the effectiveness of taking a proactive approach to privacy. In November 2012, for the first time, she brought together these real-world experiences, lessons learned, and success stories in one guidance document, Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Policies. This resource will serve as a model for others to take up the challenge and continue to build upon the growing knowledge base of PbD best practices.

As an important next step in the evolution of Privacy by Design, Commissioner Cavoukian is co-chairing a new Technical Committee of the standards body OASIS: Privacy by Design Documentation for Software Engineers, to facilitate privacy governance processes in organizations that conduct software development. The resulting documentation will serve to guide software development without diminishing any system functionality.

 

Privacy by Design: From Concept to Reality Breakfast and Conversation with Dr. Ann Cavoukian and Privacy Leaders

At a time when policy makers around the world are supporting the concept of Privacy by Design, how can companies operationalize this key standard?

Future of Privacy Forum invites you to join us for breakfast and conversation on Tuesday March 5, 2013 with Dr. Ann Cavoukian, Information and Privacy Commissioner for Ontario, and leading privacy experts. 

 

Agenda

8:30-9:00 a.m.     Breakfast

9:00-9:45 a.m.     Introduction by Karen Zacharia, Verizon followed by remarks by Commissioner Cavoukian

9:45-10:30 a.m.   Panel Discussion followed by Q&A