White House Steps Up APEC-EU Interoperability Push
Former FPF’er Josh Harris provides some insights on the APEC-EU privacy interoperability project: “…the White House released a fact sheet detailing the outcomes of this year’s APEC meetings and highlighting the interoperability project as a key outcome that should be continued and expanded.” Check out the full article here, and if you want to know even more, Hilary Wandall, Chief Privacy Officer at Merck, and Melinda Claybaugh, Counsel for International Consumer Protection at the Federal Trade Commission, will join Josh Harris to discuss this topic in a webinar on December 9th.
What Privacy Papers Should Policymakers be Reading in 2016?
Each year, FPF invites privacy scholars and authors to submit articles and papers to be considered by members of our Advisory Board, with an aim toward showcasing those articles that should inform any conversation about privacy among policymakers in Congress, as well as at the Federal Trade Commission and in other government agencies. For our sixth annual Privacy Papers for Policymakers, we received submissions on topics ranging from mobile app privacy, to location tracking, to drone policy.
Our Advisory Board selected papers that describe the challenges and best practices of designing privacy notices, ways to minimize the risks of re-identification of data by focusing on process-based data release policy and taking a precautionary approach to data release, the relationship between privacy and markets, and bringing the concept of trust more strongly into privacy principles.
Our top privacy papers for 2015 are, in alphabetical order:
These papers illuminate concerns that will continue to drive privacy debates in 2016. We look forward to celebrating the formal release of FPF’s Privacy Papers for Policymakers digest at an event with the authors the evening of January 13th, 2016. Save the date–more details to come!
We also want to thank Microsoft, EY, AT&T, and TUNE for their special support of this project. And we thank the scholars, advocates, and Advisory Board members who engage with us to explore the future of privacy.
Panelists Debated Materiality and Privacy Harms under the FTC’s Section 5
On November 5, the Future of Privacy Forum and Washington & Lee University School of Law co-hosted a panel on the Future of Section 5 of the FTC Act. The Federal Trade Commission Act permits the agency to bring civil enforcement actions under Section 5 against companies that engage in “unfair or deceptive acts or practices.” Our panel of esteemed academics and professionals included David Vladeck (Professor of Law, Georgetown University Law Center), James Cooper (Director of Research and Policy, George Mason University School of Law, Law and Economics Center), Joshua Fairfield (Professor of Law, Washington & Lee University School of Law), and Margaret Hu (Assistant Professor of Law, Washington & Lee University School of Law). The panelists engaged in an hour of lively discussion about the nature of recent FTC rulings under this authority, and expectations for companies in the future.
Much debated among the panelists was the issue of materiality, or the requirement that a deceptive trade practice be “material” to consumers before the FTC can bring an enforcement action. Only days after the Supreme Court heard oral arguments in Spokeo, Inc. v. Robins—a case about whether a man was harmed by having false information published about him online—this topic of privacy-related harm was on everyone’s mind. The Schrems Safe Harbor case from the European Court of Justice was also mentioned as it might relate to determining harm in the future.
The panelists diverged in their reactions to the FTC’s recent enforcement action against Nomi Technologies, a consumer analytics company that provided retailers with the technology to track in-store consumers by collecting their cell phone MAC addresses. At issue was the fact that Nomi’s privacy policy promised consumers the ability to opt out of the tracking—either online or in-store—but did not provide the in-store option. While Professor Vladeck called the case a “classic right to lie,” James Cooper called for empirical economic studies to determine when and how consumers are harmed. The key questions—how can something we all agree no one reads (a privacy policy) influence consumer behavior, and does that matter in terms of enforcement of companies’ public statements?—didn’t have an easy answer.
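To make the technology at issue concrete: in-store analytics systems of this kind work by logging the MAC address a phone broadcasts as it searches for Wi-Fi, and an opt-out is typically honored by checking each sighting against a registry before recording it. The sketch below is a hypothetical illustration of that pattern, not a description of Nomi's actual system; all names and design choices here are assumptions.

```python
import hashlib

# Hypothetical sketch of MAC-address-based in-store analytics with an
# opt-out registry. Addresses are hashed before storage, and sightings
# of devices on the opt-out list are discarded rather than recorded.

def hash_mac(mac: str) -> str:
    """Normalize a MAC address's formatting, then hash it for storage."""
    normalized = mac.lower().replace("-", ":")
    return hashlib.sha256(normalized.encode()).hexdigest()

opt_out_registry: set = set()  # hashes of MACs whose owners opted out

def opt_out(mac: str) -> None:
    """Register a device so future sightings are dropped."""
    opt_out_registry.add(hash_mac(mac))

def record_sighting(mac: str, store_log: list) -> bool:
    """Log a device sighting unless its owner has opted out."""
    h = hash_mac(mac)
    if h in opt_out_registry:
        return False  # respect the opt-out: do not record
    store_log.append(h)
    return True

log: list = []
assert record_sighting("AA:BB:CC:DD:EE:FF", log) is True
opt_out("aa-bb-cc-dd-ee-ff")  # same device, different formatting
assert record_sighting("AA:BB:CC:DD:EE:FF", log) is False
assert len(log) == 1
```

The point of the sketch is that an opt-out promise like Nomi's is a concrete engineering commitment: the registry check has to exist at every point where data is collected, in-store as well as online.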
On the subject of types of cases the FTC brings and will continue to bring, Professor Vladeck noted that the FTC brings cases to make a point, and to keep the marketplace free of deception and unfair practices for the sake of both consumers and businesses. In the winning analogy of the night, he stated: “The FTC’s principal role is to be the gym teacher at the prom.”
Following the panel, the Future of Privacy Forum was delighted to host an Open House Reception to welcome everyone to its new offices, and to celebrate its new partnership with Washington & Lee University School of Law. Thank you to everyone who joined, and we hope to see you again soon!
Panelists (left to right): Professor Margaret Hu, James Cooper, Professor David Vladeck, and Professor Joshua Fairfield.
Left to right: Jules Polonetsky (Executive Director and Co-chair, Future of Privacy Forum), Dean Brant Hellwig (Washington & Lee School of Law), and Christopher Wolf (Founder and Co-chair, Future of Privacy Forum)
FTC's Cross-Device Workshop to be held on Monday, Nov. 16th
On Monday, the FTC will be holding a workshop on cross-device tracking: how and why the advertising and marketing industries are using emerging technologies to track individual users across platforms and devices.
In the first decades of the Internet, the predominant method of state management (the ability to remember a unique user over time) was the cookie. However, because cookies work by having the web browser store a small data file on a single device, this model is becoming increasingly ineffective at tracking user behavior across different browsers and devices. The fact that modern users now access online content and resources through a broadening spectrum of devices (laptop, smartphone, tablet, watch, wearable fitness tracker, television, and other internet-connected home appliances) creates a real challenge for advertisers and marketers who seek to analyze consumer behavior holistically. In this report, we explain the challenges and some of the emerging technological solutions, each of which presents nuanced differences in privacy benefits and concerns.
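Cookie-based state management can be sketched in a few lines: the server mints an identifier for any browser that shows up without one, and the browser replays it on later requests. Because the identifier lives in one browser's cookie jar, the same person on a second device looks like a brand-new user, which is the limitation driving the cross-device techniques discussed at the workshop. This is a minimal illustrative sketch; the function and field names are hypothetical.

```python
import uuid

# Server-side sketch of cookie-based state management.
sessions: dict = {}  # cookie value -> per-user state

def handle_request(cookies: dict) -> tuple:
    """Return (user_id, response_headers) for one incoming request."""
    user_id = cookies.get("uid")
    if user_id is None or user_id not in sessions:
        user_id = uuid.uuid4().hex           # mint a fresh identifier
        sessions[user_id] = {"visits": 0}
        headers = {"Set-Cookie": f"uid={user_id}"}
    else:
        headers = {}                         # browser already known
    sessions[user_id]["visits"] += 1
    return user_id, headers

# Same browser, two requests: recognized as one returning user.
uid1, hdrs = handle_request({})
uid2, _ = handle_request({"uid": uid1})
assert uid1 == uid2

# A second device shares no cookie jar: treated as a new user.
uid3, _ = handle_request({})
assert uid3 != uid1
```

The last two lines capture the problem in one place: nothing links `uid1` and `uid3` even if both belong to the same person.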
SCHOOL VENDORS LEGALLY COMMIT TO USE STUDENT DATA ONLY FOR APPROVED EDUCATIONAL USES
WASHINGTON, D.C. – Thursday, November 12, 2015 – The Future of Privacy Forum (FPF) and Software & Information Industry Association (SIIA) today announced that the Student Privacy Pledge, endorsed by President Obama, the National PTA and the National School Boards Association, now has the support of 200 companies serving millions of students.
The legally binding commitments in the Pledge can be enforced by the Federal Trade Commission and State Attorneys General. All participating companies and organizations are listed online at www.studentprivacypledge.org.
The Pledge is a list of 12 commitments school service providers have made to affirm that K-12 student data is maintained in a secure, private, and responsible framework.
The Pledge was developed by the FPF and SIIA in October 2014 with guidance from school service providers, educator organizations, and parent groups following collaboration with U.S. Representatives Jared Polis (CO) and Luke Messer (IN).
“Companies that serve students understand that they must maintain the trust of parents, students and teachers,” said Jules Polonetsky, Executive Director, FPF. “Although many states are passing new laws to govern student privacy, the Pledge plays a key role in setting a national standard for protecting student data and ensures companies are aware of the central restrictions in statutes such as FERPA and COPPA.”
“This milestone for the Student Privacy Pledge further demonstrates the industry’s strong commitment to protecting student data privacy. These best practices safeguarding student information are a staple of the industry and provide a legally enforceable commitment to students, parents and schools,” said Brendan Desetti, Director, Education Policy, SIIA.
In addition to the Pledge campaign, the FPF has run a series of student privacy ‘bootcamp’ training sessions for ed-tech, hosted the first-ever National Student Privacy Symposium, and issued a privacy guide for parents in partnership with the National PTA.
About FPF
The Future of Privacy Forum (FPF) is a Washington, D.C.-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board of leading figures from industry, academia, law and advocacy groups. For more information, visit www.fpf.org.
About SIIA
SIIA is the leading association representing the software and digital content industries. The Education Technology Industry Network (ETIN) of SIIA serves and represents more than 200 of SIIA’s 800 member companies worldwide that provide educational software applications, digital content, online learning services and related technologies across the K-20 sector. SIIA-ETIN shapes and supports the industry by providing leadership, advocacy, government relations, corporate education, intellectual property protection, business development opportunities and critical market information.
Last week, I was fortunate enough to see several cool new applications of location technology and social data at two conferences that bookended my week. Privacy issues were addressed at the end of each conference, which I understand: a lecture about privacy is the last thing entrepreneurs and researchers want to hear. Unfortunately, privacy can be something of a bummer — just today, Forrester Research made headlines with a report predicting a privacy “crackdown” in 2016.
Privacy has an image problem. It doesn’t help that while some specific privacy laws and regulations are quite clear, “privacy” as a concept remains amorphous and ill-defined. Rather than an ideology or a value, privacy is a tool in pursuit of those things. For many years, privacy has been defined along the lines of offering individuals control over their information. The 2012 White House Consumer Privacy Bill of Rights places a principle of individual control front and center, before any other consumer right, declaring that “[c]onsumers have a right to exercise control over what personal data companies collect from them and how they use it.”
Yet, if privacy is about control, what does it mean when people feel completely out of control when it comes to their digital footprint? Last fall, a Pew survey found that 91% of Americans believe they “have lost control over how personal information is collected and used by companies.” Our current conception of privacy as control doesn’t really work anymore. Provocateurs at these conferences noted that modern society has made it impossible to survive, let alone thrive, without a smartphone. Going off the grid can, by itself, be suspicious. Having a LinkedIn profile can be a professional requisite.
At the Future of Privacy Forum, we have long called for efforts to “featurize” privacy. After spending a week seeing the many ways innovators could deploy data, we need to embrace creative, out-of-the-box ways to get consumers to think about how they can use and take advantage of their data online. Advances in web design and, more recently, app development have made everything from tracking personal finances to reading the text-heavy Harvard Law Review more enjoyable. There’s no reason design and functionality can’t also be used to make privacy more engaging.
Even small tweaks go far. Facebook, for example, recently featured a blue privacy dinosaur to help its users with a “privacy check-up.” More than 86% of Facebook users who saw the tool actually completed the entire privacy check-up, and Facebook suggested that the dinosaur “helped make the experience a little more approachable and a little more engaging.” Presenting users with a privacy check-up is easier than asking them to wade through a myriad of privacy settings of their own volition. Putting these simple tools right in front of user eyeballs not only makes privacy more approachable, but perhaps more salient.
Getting privacy right will only become more important. A trust deficit already exists when it comes to innovative uses of information. According to a presentation by Susan Etlinger from Altimeter Research, 45% of American consumers report having little or no trust in how organizations use their data. Getting users engaged with their data is only half the privacy battle, however.
Beyond featurization or “appifying” privacy in ways that give users more control and ownership over their information, much more work needs to be done to provide more transparency and to build accountability measures around data use. As Steve Hegenderfer at Bluetooth described it, companies need to offer both good product and good policy. This Friday, I’ll be discussing my thoughts on what this looks like, and what the future of privacy holds in general, at the Privacy & Access 20/20 Conference in Vancouver.
A Way Forward for Social Media Research
Few would deny that technology and social media are changing the way we interact. People today can stay in touch with friends on Facebook, share vacation photos on Instagram, follow trends on Twitter, grow their networks on LinkedIn, and explore communities on Reddit. And people are staying connected wherever they go. The Pew Research Center recently reported that the percentage of U.S. adults who own smartphones has nearly doubled from four years ago to 68%. For U.S. adults between the ages of 18 and 49, the figure is over 80%.
In her new book, Reclaiming Conversation: The Power of Talk in the Digital Age, MIT professor Sherry Turkle examines how technology is eroding our ability to meaningfully connect with one another. She argues that technology’s presence disrupts our conversations in subtle ways, and in the process we lose out on opportunities for deeper human interaction. She cites experimental research showing that the mere presence of a smartphone on a table when people are talking decreases their feelings of empathy and the amount of personal information they share. Professor Turkle does not argue that we boycott our devices entirely—she describes herself as pro-technology. Instead, she suggests that we make space for genuine conversation, uninterrupted by our devices.
Professor Turkle is not alone in warning of technology’s impact on human interaction. Studies have examined the possible link between the use of Facebook or other social media and envy or depression. But others argue that concerns are exaggerated. Madeleine George and Candice Odgers of Duke University looked at the effects of social media on teens and found little evidence of a negative impact—other than a disruptive effect on sleeping habits. Clearly, researchers have yet to reach a consensus on the effects of social media. If we hope to realize the benefits of digital connectivity while avoiding its potential harms, more research will be necessary. This research presents its own concerns, however. Many will remember the debate about Facebook’s collaboration with researchers to study the effects of showing users more sad stories from their connections.
But Facebook is not alone, as companies increasingly conduct data-driven research and consumer testing to tailor products and practices in the Internet’s crowded landscape. Such research could help us understand the impact of social media on its users and on society. At the same time, social media research often involves intimate details about people’s lives. Because companies conducting internal research need not follow the federal standards governing human subjects research, such studies receive little ethical oversight and are rarely published. Thus, the ethical review that studies undergo is unclear, and the findings they produce do not contribute to public knowledge.
To prevent unethical data research or experimentation, experts have proposed a range of solutions, including the creation of “consumer subject review boards,” formal privacy review boards, private IRBs, and other ethical processes implemented by individual companies. Organizations and researchers are increasingly encouraged to pursue internal or external review mechanisms to vet, approve and monitor data experimentation and research. However, many questions remain concerning the desirable structure of such review bodies as well as the content of ethical frameworks governing data use. In addition, considerable debate lingers around the proper role of consent in data research and analysis, particularly in an online context; and it is unclear how to apply basic principles of fairness to selective populations that are subject to research.
To address these challenges, the Future of Privacy Forum is hosting an invitation-only academic workshop supported by the National Science Foundation and the Alfred P. Sloan Foundation on Dec. 10, 2015, which will discuss ethical, legal, and technical guidance for organizations conducting research on personal information. Leading papers will also be selected for presentation at the workshop and for publication in the online edition of the Washington and Lee Law Review.
Sherry Turkle writes that we have “become accustomed to seeing life as something we can pause in order to document it, get another thread running in it, or hook it up to another feed.” Perhaps, with a way forward for meaningful research on social media and digital connectivity, we can decide whether the world Professor Turkle describes should be fought, navigated, or embraced.