Swimming in the Big Data Ocean

Even as it promises breakthroughs in healthcare, the environment, and how individuals understand the world, Big Data may also be a powerful tool in the national security space. On Wednesday, the Journal of National Security Law & Policy, along with the Georgetown Center on National Security and the Law, launched its first symposium by addressing the fundamentals of Big Data and looking at how to establish a policy framework for its use. While video of the event should be available soon, I thought it might be helpful to summarize how the event touched on general privacy concerns.

Recognizing the tension between a conservative legal profession seeking guidelines and agile technological development, the day began with a panel free of lawyers and full of technologists to address what Big Data is. Palantir’s Matthew Gordon suggested that Big Data is actually a misnomer: “big data” is simply about making insights accessible to users rather than leaving them locked in isolated, discrete databases. Echoing the strategy of Privacy by Design, Gordon recommended that organizations “bake in privacy from the start” when working with data.

Comparing today’s Big Data to the revolution caused by the Ford Model T, Johns Hopkins’ Sean Fahey declared that Big Data was important because it had “democratized” data. Today, thanks to cheap storage, excess computational power, and open source initiatives, enterprises of any scale can work with large amounts of data, eliminating the specialized equipment and significant capital investment once required for serious data analytics. This has encouraged the collection of more and more data. Providing an observation that many would return to during the symposium, Fahey noted that today’s technological environment has actually made it costlier to decide which data to throw away than simply to collect as much as possible.

While security guidelines are often discussed, particularly in the realm of national security, the challenge is that no one has addressed what sort of privacy protections we intend to implement. “We cannot ask a computer to answer whether some analytics violates privacy,” said Professor Daniel Weitzner, an FPF advisory board member, arguing that the problem of Big Data is determining how to inject “human judgment” into privacy questions. Unfortunately for the law, and for the lawyers advising clients about it, this problem poses difficult ethical questions.

Graphic by Robert O’Harrow of The Washington Post.

So what sort of legal and policy framework should be established to protect the privacy issues raised by Big Data collection, storage, and analysis? The challenges standing in the way are significant.

Professor Paul Ohm suggested that the “democratization” of data has produced a fundamental shift from public data collection to private data collection, even as Big Data has blurred the distinction between public and private. The Federal Trade Commission finds itself policing changes to industry privacy policies, while the national security space puts the federal government in the position of having to change its own privacy policies. Professor Laura Donohue wondered whether our federated system, divided between private industry and government, remains helpful for protecting privacy today. Pointing to cybersecurity challenges, she cautioned that walls between public and private data could actually harm individual privacy when breaches occur.

Jennifer Granick of Stanford’s Center for Internet and Society argued against any notion that Big Data had somehow created a level playing field for individuals. “There’s no parity for individuals accused of crimes,” she noted, and the benefits of all this information are weighted toward private companies and government rather than toward the individuals producing the new streams of data. In September, FPF will partner with the Center for Internet and Society on an event addressing this question and how to reconcile Big Data with privacy. More information is available here.

After considerable discussion about the national security implications of Big Data, the symposium concluded with a recognition that modern privacy law is about establishing a degree of control vis-à-vis third parties, as Marc Rotenberg of the Electronic Privacy Information Center put it. How that can best be done in a Big Data world is the big question. Citing Microsoft’s “Scroogled” campaign, Elisebeth Cook, a member of the Privacy and Civil Liberties Oversight Board and an attorney at WilmerHale, proposed market-based solutions to privacy infractions. Pointing to California’s stronger privacy rules, a number of privacy advocates suggested that forum shopping could produce privacy gains. Moreover, in the face of the European Union’s proposed Data Protection Regulation, there was a recognition that privacy policies may effectively be offshored.

Of course, while it is often taken as a given that the EU is working to strengthen privacy protections, Prof. Weitzner noted that the EU has been less effective at enforcing its rules. Furthermore, while many Big Data players have their eyes on Europe, as privacy gets offshored, foreign legal systems may actually end up producing weaker privacy protections for individuals over the long term. “Perhaps an international solution is the answer?” someone in the audience offered. Either way, the ocean of Big Data continues to rise.

Mobile and the Connected Car

At this week’s DC Mobile Monday event, the potential of the connected car prompted enthusiastic discussion. As our cars get smarter, they promise not only seamless infotainment and geolocation features but also the capacity to communicate with other connected cars and with our mechanics, improving safety and saving drivers time and money. While these are considered luxury features today, the sentiment in the room was that most of these technologies would soon be commonplace in all vehicles, aided and augmented by our current smartphones and tablets.

Telcos seem to agree, increasingly identifying cars as lucrative growth opportunities. “[Cars are] basically smartphones on wheels,” AT&T’s Glenn Lurie explains, and indeed, many automakers see smartphones as an integral part of creating connected cars. One topic missing from the discussion, however, was how the synthesis of mobile technology and our vehicles impacts individual privacy. We continue to balance the privacy challenges and data opportunities presented by smartphones, and we have only just begun to address the similar concerns presented by connected cars.

The consensus at the DC Mobile Monday event was that privacy was a secondary concern, taking a backseat to more practical hurdles like keeping drivers’ eyes on the road and legal liability issues. Indeed, panelists were quick (perhaps too quick) to suggest that privacy concerns were merely a generational problem, and that younger drivers simply do “not think deeply about privacy.” Moreover, panelists were optimistic that most consumers would willingly trade their privacy once they understood the benefits connected cars can bring, such as lower insurance costs and a safer driving experience.

In the long term, the benefits of connected cars are real and drivers may eagerly trade their privacy in exchange, but these benefits need to be clearly communicated. That is a discussion that has yet to be had in full.

Do Not Track May Be Back on Track

In November 2012, the World Wide Web Consortium (W3C) named FPF Senior Fellow Peter Swire as co-chair of the W3C’s Tracking Protection Working Group, a multistakeholder group working to create a meaningful Do-Not-Track (DNT) mechanism.  At the time that Peter was appointed, the DNT negotiations were described by the New York Times as having “devolved into acrimonious discussions, name calling, and witch hunts.”  Industry representatives were accusing privacy advocates of attempting to undermine the online advertising ecosystem, and advocates were accusing industry of acting in bad faith.

This week, Peter wrote a blog post reporting that the working group has developed a roadmap that may lead to the public release of a DNT proposal by summertime.  Stakeholders have agreed on four criteria for a DNT standard: 1) it must be created through the W3C; 2) it must be consistent with the working group’s charter (which includes expectations for technical specifications and compliance mechanisms); 3) the standard should be a significant change from the current DNT situation; and 4) the working group must be able to explain why a user’s choice to activate DNT would reduce tracking.

Click here for Swire’s post.

 

Big Data and Privacy: Same Old Concerns or Something New?

Even as Big Data is used to chart flu outbreaks and improve winter weather forecasts, it continues to generate important policy debates. Watching businesses and advocates argue over the use of “data” to measure human behavior in order to cut through both political ideology and personal intuition, David Brooks declares in The New York Times that the “rising philosophy of the day . . . is data-ism.” Writing for GigaOM, Derrick Harris responds that Brooks’s concerns over data-worship are “really just statistics, the stuff academicians and businesspeople have been doing for years.”

The basic collection of data is nothing new. However, Harris makes the point that there is a considerable difference between “just plain data” and the rise of Big Data. This raises the question of whether the privacy concerns swirling around Big Data differ from the privacy issues we have long faced in the collection of personally identifiable information in substance, or merely in scale. In other words, which technological changes presented by Big Data raise genuinely novel privacy concerns?

The substance of Big Data is its scale. As our ability to collect and store vast quantities of information has increased, so too has our capacity to process this data, yielding breakthroughs ranging from better health care and a cleaner environment to safer cities and more effective marketing. Privacy advocates argue that it is this scale of data collection that can threaten individual privacy in new ways. According to Jay Stanley, Senior Policy Analyst at the ACLU, Big Data amplifies “information asymmetries of big companies over other economic actors and allows for people to be manipulated.” Data mining allows entities to infer new facts about a person from diverse data sets, threatening individuals with discriminatory profiling and a general loss of control over their everyday lives. Noting that credit card limits and auto insurance rates can easily be crafted on the basis of aggregated data, tech analyst and author Alistair Croll cautions that individual personalization is just “another word for discrimination.” Advocates worry that over time, Big Data will have a chilling effect on individual behavior.

With its proposed new General Data Protection Regulation, European policymakers aim to advance privacy by limiting uses of Big Data that analyze individuals. The regulation’s most recent draft, prepared by Jan Philipp Albrecht, Rapporteur for the LIBE Committee, restricts individual profiling, which is defined as “any form of automated processing of personal data intended to evaluate certain personal aspects relating to a natural person or to analyse or predict in particular that natural person’s performance at work, economic situation, location, health, personal preferences, reliability or behaviour.” This sort of limit on “automated processing” would effectively prohibit much of the analysis that scientists and technologists see as the future of Big Data.

The fundamental problem is that neither individuals nor business, nor government for that matter, has developed a comprehensive understanding of Big Data. As a result, no one has actually balanced the costs and benefits of this new world of data. Individuals are still largely uninformed about how much data is being collected about them. They neither read nor understand lengthy privacy policies, yet they worry that their information is being used against them rather than on their behalf. Meanwhile, business is struggling to balance new economic opportunities against the “creepy factor,” the concern that data is somehow being misused. Sometimes consumers adjust to the new stream of data (Facebook’s News Feed), and other times they simply do not (Google Buzz).

Kord Davis, a digital strategist and co-author of The Ethics of Big Data, notes that there is no common vocabulary or framework for the ethical use of Big Data. As a result, individuals and business, along with advocates and government, are speaking past one another.  The result is a regime where entities collect data first and ask questions later. Thus, when Big Data opportunities and privacy concerns collide, important decisions are made ad hoc.

The Future of Privacy Forum’s Omer Tene and Jules Polonetsky have previously called for a model in which Big Data’s benefits, for businesses and research, are balanced against individual privacy rights. To continue to advance scholarship in this area, FPF and the Stanford Center for Internet and Society invite authors to submit papers discussing the legal, technological, social, and policy implications of Big Data. Selected papers will be published in a special issue of the Stanford Law Review Online and presented at an FPF/CIS workshop, which will take place in Washington, DC, on September 10, 2013. More information is available here.

FPF Breakfast & Conversation with Dr. Ann Cavoukian

At a time when policy makers around the world are supporting the concept of Privacy by Design, how can companies operationalize this key standard?

Future of Privacy Forum invites you to join us for breakfast and conversation with Dr. Ann Cavoukian, Information and Privacy Commissioner for Ontario, and leading privacy experts.

Agenda

8:30-9:00 a.m.     Breakfast

9:00-9:45 a.m.     Introduction by Karen Zacharia, Verizon

Remarks by Commissioner Cavoukian

9:45-10:30 a.m.   Panel Discussion followed by Q&A

Participation is free of charge, but please register here as space is limited.

Government employees attending the event should ensure that applicable government ethics rules permit them to accept these refreshments.  If you need to make arrangements to pay the cost of the refreshments, please contact: [email protected]

The momentum behind Privacy by Design (PbD), the international standard for privacy and data protection, has grown exponentially over the past two years. Dr. Ann Cavoukian has partnered with leading organizations across diverse sectors, including telecommunications, healthcare, transportation, mobile, and energy, to operationalize Privacy by Design and make it real! PbD’s seven Foundational Principles have now been widely implemented, clearly demonstrating the effectiveness of taking a proactive approach to privacy. In November 2012, for the first time, she brought together these real-world experiences, lessons learned, and success stories in one guidance document – Operationalizing Privacy by Design: A Guide to Implementing Strong Privacy Policies. This new resource will serve as a model for others to take up the challenge and continue to build upon the growing knowledge base and PbD best practices.

As an important next step in the evolution of Privacy by Design, Commissioner Cavoukian is co-chairing a new Technical Committee of the standards body OASIS: Privacy by Design Documentation for Software Engineers, to facilitate privacy governance processes in organizations that conduct software development. The resulting documentation will serve to guide software development without diminishing any system functionality.

 

Privacy by Design: From Concept to Reality Breakfast and Conversation with Dr. Ann Cavoukian and Privacy Leaders


Future of Privacy Forum invites you to join us for breakfast and conversation on Tuesday March 5, 2013 with Dr. Ann Cavoukian, Information and Privacy Commissioner for Ontario, and leading privacy experts. 

 


 

1/30/2013 – Privacy by Design allows secure third party access, energy innovations to grow – Electric Light & Power

Ontario’s Information and Privacy Commissioner, Dr. Ann Cavoukian, and Jules Polonetsky examine the benefits, as well as the potential risks, of third-party access to CEUD. The new product will support conversation and new market opportunities.

To learn more about CEUD and the Privacy by Design approach click here.