Controlling the Future of Privacy

Last week, I was fortunate enough to see several cool new applications of location technology and social data at two conferences which bookended my week. Privacy issues were addressed at the end of each conference, which I understand: a lecture about privacy is the last thing entrepreneurs and researchers want to hear. Unfortunately, privacy can be something of a bummer — just today, Forrester Research made headlines with a report predicting a privacy “crackdown” in 2016.

Privacy has an image problem. It doesn’t help that while some specific privacy laws and regulations are quite clear, “privacy” as a concept remains amorphous and ill-defined. Rather than an ideology or a value, privacy is a tool in pursuit of those things. For many years, privacy has been defined along the lines of offering individuals control over their information. The 2012 White House Consumer Privacy Bill of Rights places a principle of individual control front and center, before any other consumer right, declaring that “[c]onsumers have a right to exercise control over what personal data companies collect from them and how they use it.”

Yet, if privacy is about control, what does it mean when people feel completely out of control when it comes to their digital footprint? Last fall, a Pew survey found that 91% of Americans believe they “have lost control over how personal information is collected and used by companies.” Our current conception of privacy as control doesn’t really work anymore. Provocateurs at these conferences noted that modern society has made it impossible to survive, let alone thrive, without a smartphone. Going off the grid can, by itself, be suspicious. Having a LinkedIn profile can be a professional requisite.

At the Future of Privacy Forum, we have long called for efforts to “featurize” privacy. After spending a week seeing the many ways innovators could deploy data, we need to embrace creative, out-of-the-box ways to get consumers to think about how they can use and take advantage of their data online. Advances in web design and, more recently, app development have made everything from tracking personal finances to reading the text-heavy Harvard Law Review more enjoyable. There’s no reason design and functionality can’t also be used to make privacy more engaging.

Even small tweaks go far. Facebook, for example, recently featured a blue privacy dinosaur to help its users with a “privacy check-up.” More than 86% of Facebook users who saw the tool completed the entire privacy check-up, and Facebook suggested that the dinosaur “helped make the experience a little more approachable and a little more engaging.” Presenting users with a privacy check-up is easier than asking them to wade through a myriad of privacy settings of their own volition. Putting these simple tools right in front of users’ eyeballs not only makes privacy more approachable, but perhaps also more salient.

Getting privacy right will only become more important. A trust deficit already exists when it comes to innovative uses of information: according to a presentation by Susan Etlinger from Altimeter Research, 45% of American consumers report having little or no trust in how organizations use their data. Getting users engaged with their data is only half the privacy battle, however.

Beyond featurizing or “appifying” privacy in ways that give users more control and ownership over their information, much more work needs to be done to provide transparency and to build accountability measures around data use. As Steve Hegenderfer at Bluetooth described it, companies need to offer both good product and good policy. This Friday, I’ll be discussing what this looks like, and what the future of privacy holds in general, at the Privacy & Access 20/20 Conference in Vancouver.

A Way Forward for Social Media Research

Few would deny that technology and social media are changing the way we interact. People today can stay in touch with friends on Facebook, share vacation photos on Instagram, follow trends on Twitter, grow their networks on LinkedIn, and explore communities on Reddit. And people are staying connected wherever they go. The Pew Research Center recently reported that the percentage of U.S. adults who own smartphones has nearly doubled from four years ago to 68%. For U.S. adults between the ages of 18 and 49, the figure is over 80%.

In her new book, Reclaiming Conversation: The Power of Talk in the Digital Age, MIT professor Sherry Turkle examines how technology is eroding our ability to meaningfully connect with one another. She argues that technology’s presence disrupts our conversations in subtle ways, and in the process we lose out on opportunities for deeper human interaction. She cites experimental research showing that the mere presence of a smartphone on a table when people are talking decreases their feelings of empathy and the amount of personal information they share. Professor Turkle does not argue that we boycott our devices entirely—she describes herself as pro-technology. Instead, she suggests that we make space for genuine conversation, uninterrupted by our devices.

Professor Turkle is not alone in warning of technology’s impact on human interaction. Studies have examined the possible link between the use of Facebook or other social media and envy or depression. But others argue that concerns are exaggerated. Madeleine George and Candice Odgers of Duke University looked at the effects of social media on teens and found little evidence of a negative impact—other than a disruptive effect on sleeping habits. Clearly, researchers have yet to reach a consensus on the effects of social media. If we hope to realize the benefits of digital connectivity while avoiding its potential harms, more research will be necessary. This research presents its own concerns, however. Many will remember the debate about Facebook’s collaboration with researchers to study the effects of showing users more sad stories from their connections.

But Facebook is not alone, as companies increasingly conduct data-driven research and consumer testing to tailor products and practices in the Internet’s crowded landscape. Such research could help us understand the impact of social media on its users and on society. At the same time, social media research often involves intimate details about people’s lives. Because companies need not follow federal standards governing human subjects experiments for internal research, this research has little ethical oversight and is rarely published. Thus, the ethical review that studies undergo is not clear and the findings they produce do not contribute to public knowledge.

To prevent unethical data research or experimentation, experts have proposed a range of solutions, including the creation of “consumer subject review boards,” formal privacy review boards, private IRBs, and other ethical processes implemented by individual companies. Organizations and researchers are increasingly encouraged to pursue internal or external review mechanisms to vet, approve and monitor data experimentation and research. However, many questions remain concerning the desirable structure of such review bodies as well as the content of ethical frameworks governing data use. In addition, considerable debate lingers around the proper role of consent in data research and analysis, particularly in an online context; and it is unclear how to apply basic principles of fairness to selective populations that are subject to research.

To address these challenges, the Future of Privacy Forum is hosting an invitation-only academic workshop supported by the National Science Foundation and the Alfred P. Sloan Foundation on Dec. 10, 2015, which will discuss ethical, legal, and technical guidance for organizations conducting research on personal information. Leading papers will also be selected for presentation at the workshop and for publication in the online edition of the Washington and Lee Law Review.

Sherry Turkle writes that we have “become accustomed to seeing life as something we can pause in order to document it, get another thread running in it, or hook it up to another feed.” Perhaps, with a way forward for meaningful research on social media and digital connectivity, we can decide whether the world Professor Turkle describes should be fought, navigated, or embraced.

EdSurge Carries Piece on Best Practices for Student Privacy

October 30 – EdSurge, an independent online resource on education technology, has a piece on student data privacy written by FPF’s own Jules Polonetsky and Brenda Leong. Polonetsky and Leong comment on a “trust gap” between parents and schools when it comes to the collection, use, and security of student data, and discuss the best practices schools and their affiliated service providers can employ to bridge that gap. They write that it is a gap “…companies and schools can work together to close by demonstrating increased understanding of parental concerns, strong policies and practices, and good communications among all stakeholders.”

 

To read the piece in its entirety, click HERE: https://www.edsurge.com/news/2015-10-30-getting-student-data-security-right-the-first-time

FPF and Washington & Lee University Law School Announce Partnership

DC-BASED PRIVACY THINK TANK FUTURE OF PRIVACY FORUM PARTNERS WITH WASHINGTON AND LEE UNIVERSITY SCHOOL OF LAW TO CREATE UNIQUE ACADEMIC-PROFESSIONAL PARTNERSHIP

Affiliation to Advance Privacy Scholarship, Create Business/Academic Ties, and Incubate Tomorrow’s Privacy Lawyers

WASHINGTON, D.C. & LEXINGTON, Va. – Thursday, October 29, 2015 – The Future of Privacy Forum (FPF) and Washington and Lee University School of Law today announced a unique strategic partnership designed to enrich the legal academic experience and to enhance scholarship and conversations about privacy law and policy.

The FPF/W&L Law collaboration will:

“This partnership is such a great opportunity to combine the resources and talent of a top-tier law school with the mission and objectives of a privacy-focused think tank,” said Christopher Wolf, co-chair of FPF. “FPF policy staff and fellows and W&L Law students and faculty already are working together on issues such as the privacy of data collected by connected cars and the ethical review processes for big data. As a 1980 graduate of W&L Law, I am so pleased to have brought together my law school with the Future of Privacy Forum, the think tank I founded in 2008.”

W&L Law Dean Brant Hellwig said, “Through this partnership, we will expand our footprint in Washington, creating even more opportunities for our students in Lexington and in the D.C. program. It also leverages our growing faculty expertise in privacy and national security law, so we can have a larger impact on policy deliberations.”

FPF Executive Director Jules Polonetsky added: “We are thrilled that as another feature of the partnership, W&L Law professors Margaret Hu and Joshua Fairfield will serve on the FPF Advisory Board. Professor Hu is well-known for her research on national security, cyber-surveillance and civil rights, and her recent writing on government use of database screening and digital watch listing systems to create ‘blacklists’ of individuals based on suspicious data. Professor Fairfield is an internationally recognized law and technology scholar, specializing in digital property, electronic contract, big data privacy, and virtual communities.”

On Thursday, November 5, FPF and W&L Law are celebrating the partnership, along with the opening of FPF’s new headquarters in Washington, with a panel discussion addressing the future of Section 5 of the FTC Act. Former FTC Consumer Bureau Director David Vladeck and James Cooper, former Acting Director of the FTC Office of Policy Planning, will discuss key Section 5 issues, such as materiality, harm, the role of cost-benefit analysis, and other issues raised in the FTC’s privacy and data security actions. The program will take place from 5:00 p.m. to 7:00 p.m. and will be followed by an open house reception at FPF offices, 1400 Eye Street, N.W., Suite 450, Washington, D.C. 20005.

About Future of Privacy Forum

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board composed of leading figures from industry, academia, law, and advocacy groups.

About Washington & Lee School of Law

Washington and Lee University School of Law in Lexington, Virginia is one of the smallest of the nation’s top-tier law schools, with an average class size of 22 and a 9-to-1 student-faculty ratio. The Law School’s commitment to student-centered legal education, emphasis on legal writing, and dedication to professional development is reflected in the impressive achievements of its graduates, which include seven American Bar Association presidents, 22 members of the U.S. Congress, numerous state and federal judges, and Supreme Court Justice Lewis F. Powell.

Media Contacts

Nicholas Graham, for FPF: 571-291-2967, [email protected]

Peter Jetton, for W&L Law: 540-461-1326, [email protected]

 

FPF Panel will Shine Light on FTC's Authority

Data privacy and security regulators don’t always agree. That’s no surprise to those observing the discussions that have followed the European Court of Justice’s decision to invalidate the adequacy of the EU-U.S. Safe Harbor framework.  But the disputes aren’t always global.  Sometimes regulators from the same country, working in the same agency, disagree about how to regulate data privacy and security issues.

To attend FPF’s event, click here: https://www.eventbrite.com/e/fpf-and-washington-lee-law-open-house-the-future-of-section-5-tickets-18886732726

Read the full release here: http://www.lexology.com/library/detail.aspx?g=01cc1996-41ec-4a78-b4f4-d12b78c9ba84

Cross-Device: Understanding the State of State Management

On Friday, October 16, the Future of Privacy Forum filed comments with the FTC in advance of the FTC’s Cross-Device Workshop on Nov. 16, 2015. Based on revisions to those comments, Jules Polonetsky and Stacey Gray have prepared a report, Cross-Device: Understanding the State of State Management, which describes how and why the advertising and marketing industries are using emerging technologies to track individual users across platforms and devices.
 
In the first decades of the Internet, the predominant method of state management–the ability to remember a unique user over time–was the “cookie.” However, because of how cookies operate, via the web browser placing a data file onto a user’s hard drive, this model is becoming increasingly ineffective at tracking user behavior across different browsers and devices. The fact that modern users are now accessing online content and resources through a broadening spectrum of devices–e.g. laptop, smartphone, tablet, watch, wearable fitness tracker, television, and other internet-connected home appliances–is creating a real challenge for advertisers and marketers who seek to holistically analyze consumer behavior. In these comments, we explain the challenges and some of the emerging technological solutions, each of which presents nuanced differences in privacy benefits and concerns.
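To make the cookie mechanism concrete, here is a minimal sketch, in Python, of cookie-based state management: the server mints a unique ID on a user’s first visit and recognizes it when the browser sends it back. The cookie name `uid` and the one-year lifetime are illustrative choices, not drawn from the comments.

```python
from http.cookies import SimpleCookie
from typing import Optional, Tuple
import uuid

def handle_request(cookie_header: Optional[str]) -> Tuple[str, Optional[str]]:
    """Return (user_id, set_cookie_header).

    If the incoming request carries a 'uid' cookie, reuse that ID;
    otherwise mint a new one and build the Set-Cookie header value
    that asks the browser to store it on the user's hard drive.
    """
    if cookie_header:
        jar = SimpleCookie(cookie_header)
        if "uid" in jar:
            # Returning user: the browser replayed the stored ID.
            return jar["uid"].value, None
    new_id = uuid.uuid4().hex
    jar = SimpleCookie()
    jar["uid"] = new_id
    jar["uid"]["max-age"] = 60 * 60 * 24 * 365  # persist for one year
    return new_id, jar["uid"].OutputString()

# First visit: no cookie, so the server mints an ID and sets a cookie.
uid, set_cookie = handle_request(None)
# Later visit: the browser sends the cookie back; the same ID is recovered.
uid_again, _ = handle_request(f"uid={uid}")
```

Because the ID lives in a single browser’s cookie store, the same person on a phone, a tablet, and a laptop looks like three unrelated users, which is exactly the cross-device gap the report describes.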
FPF will be continuing our work in this area, and we welcome any comments or reactions. The full paper is available to read here.
Be sure to check out a few of our helpful ad tech flow charts:
Data Matching Via Websites and Apps
Ad Networks and Home Wifi
Understanding Ad Effectiveness

CEA releases guiding privacy & security principles for wearable technologies

Yesterday, October 26th, the Consumer Electronics Association (CEA) announced voluntary guidelines for organizations that manage personal and health-related data, particularly as generated by consumer wearable technologies. This step illustrates CEA’s attempt to promote consumer trust in technology companies producing and supporting health trackers and other wearable technologies. The Future of Privacy Forum applauds CEA’s efforts to advance the interests of privacy and security in this new and rapidly expanding consumer technology. FPF is separately working with a number of companies and other key stakeholders to produce detailed guidelines for wearables; these address many of the key principles identified by CEA, as well as additional sections that focus on research and secondary use of consumer-generated data.

Click here to read CEA’s full list of guiding principles.

 

 

The Hill Features FPF's Comments on Safe Harbor


Today the US political news website The Hill carried an opinion piece by Future of Privacy Forum staff on the European Court of Justice’s Safe Harbor ruling. Executive Director Jules Polonetsky and Legal & Policy Fellow Bénédicte Dambrine write of the challenges the ruling creates for European companies, workers, students, and educational institutions, and ask that policymakers soon find a solution that strikes a balance between privacy and economic interests.

You can read the full piece HERE.

NHTSA & FTC Critical of House Vehicle Safety Proposal

October 14, 2015 — The House Energy and Commerce Subcommittee on Commerce, Manufacturing, and Trade met to discuss proposals to improve motor vehicle safety. Much of the hearing focused on a recent proposal by committee staff to incentivize the adoption of new technologies to improve vehicle safety, which raises several privacy issues. Specifically, privacy and cybersecurity protections were highlighted, as well as the roles of the National Highway Traffic Safety Administration (“NHTSA”) and the Federal Trade Commission (“FTC”). Chairman Fred Upton (R-MI) set the tone for this conversation with his opening statements, stating that “what was once science fiction, is today a reality.” Representative Marsha Blackburn (R-TN) emphasized that there will be a quarter billion connected cars by 2020.

The first panel consisted of testimony by representatives of NHTSA and the FTC. Both representatives were critical of certain aspects of the draft.

Dr. Mark R. Rosekind, Administrator of NHTSA, raised several concerns regarding the draft proposal during his testimony. First, he argued the proposal gives car manufacturers the power to create best practices that may undermine NHTSA’s mission. Though the draft requires automakers to submit these best practices to NHTSA, it does not give NHTSA the authority to approve them or propose changes. Additionally, the draft does not provide NHTSA enforcement authority. Instead, the draft allows for fines of up to $1 million for violations of privacy policies; Rosekind argued, however, that these fines do not have a sufficient deterrent effect. Second, Rosekind suggested that best practices should be developed in conjunction with NHTSA, not by an industry majority. The proposal establishes a council for the development of best practices that would be composed of at least 50 percent car manufacturer representatives.

Finally, he argued that NHTSA is better positioned to address vehicle technology concerns because of its experience in this area. NHTSA has a council for vehicle electronics, which has been looking at these issues since 2012. Moreover, NHTSA and the FTC have met, as recently as last year, to discuss Vehicle-to-Vehicle (“V2V”) technology and to coordinate their efforts. He supported a collaborative approach with industry on vehicle technologies, pointing to the Automatic Emergency Braking Systems (“AEB”) model as one technology where industry successfully cooperated with NHTSA to create and institute industry best practices.

Maneesha Mithal from the Division of Privacy and Identity Protection at the FTC’s Bureau of Consumer Protection identified three problems with the draft proposal. First, there is no FTC enforcement authority over manufacturers who do not follow adopted best practices or who engage in deceptive practices. The draft provides manufacturers with a safe harbor, placing them outside of the FTC’s authority once they have adopted best practices. Ms. Mithal argued this safe harbor is too broad and offered an example: because the privacy requirements of the draft only apply to vehicle data collected from owners, renters, or lessees, a manufacturer could misrepresent how it collects consumer data on its website and the FTC could not bring an action. Second, the draft contains a provision denying researchers the ability to identify vulnerabilities in vehicle systems, imposing penalties of up to $100,000 for hacking; that is, the draft de-incentivizes research and other positive uses of data. Third, she suggested that allowing car manufacturers to independently develop best practices might lead to weaker regulations.

Elaborating on the third point, Ms. Mithal stated that though the draft identifies eight areas to be considered in the development of best practices, these are optional. Further, there is no requirement to update best practices as new technologies enter the market, and NHTSA is given too little discretion. In her concluding remarks, Ms. Mithal stated the staff draft provides substantial liability protection to car manufacturers in exchange for weak practices developed by an industry majority.

After the panel of regulators, representatives from the two major automotive trade associations provided testimony. Several key points were made on the subject of privacy and cybersecurity.

Mitch Bainwol, President and CEO of the Alliance of Automobile Manufacturers, emphasized the great investments being made by manufacturers in the development and adoption of safety measures. Mr. Bainwol cited a statistic estimating smart vehicle investments by manufacturers of new vehicles at $175 million per year. Additionally, Mr. Bainwol pointed to a claim by NHTSA that developments in technologies that reduce driver error could prevent up to 80% of driver error related accidents.

John Bozzella, President and CEO of Global Automakers, began his testimony by listing the issues that presently command his attention, the adoption of connected cars in particular. He emphasized that (i) the benefits of connected car technologies significantly outweigh the challenges; (ii) automakers regularly engage with privacy advocates; and (iii) there is a conscious effort in the industry to stay ahead of privacy and cybersecurity concerns. Speaking to the last point, he spoke of the auto industry’s Information Sharing and Analysis Center (“ISAC”) and the positive effect it will have on many of these concerns.

The Future of Privacy Forum continues to monitor legislative efforts for privacy in connected cars and related technologies. We look forward to working with automakers on how to protect privacy and ensure the adoption of safe and innovative vehicle technologies.

 

Hector DeJesus, Legal Extern, Future of Privacy Forum

Big Presidential Campaigns Raise Big Privacy Issues

As the election season gets into full swing, campaigns are eager to collect and use information about voters everywhere. A recent study by the Online Trust Alliance found major failings with the campaigns’ privacy policies, and beyond the nuts and bolts of having an online privacy notice, the political hunger for data presents very real challenges for voters and, perhaps more provocatively, for democracy. FPF’s own Evan Selinger, Joseph Jerome, and Elliott Murray discuss some of the privacy challenges that campaigns may face in a piece for the Christian Science Monitor’s Passcode.