Future of Privacy Forum Releases New Survey on Privacy and Trust Issues in the "Sharing Economy"

Whitepaper Examines Benefits and Challenges of Reputation Management in Peer-to-Peer Services and Provides an Overview of Market Leaders in Key Sharing Economy Sectors

WASHINGTON, D.C. – Monday, June 8, 2015 – As peer-to-peer services comprising the “Sharing Economy” continue to gain wide acceptance with U.S. consumers, the Future of Privacy Forum (FPF) today released a timely whitepaper that focuses on the reputational, trust and privacy challenges users and providers face concerning the management and accuracy of shared information.

Released in advance of a June 9 workshop focused on the sharing economy, the FPF paper – titled “User Reputation: Building Trust and Addressing Privacy Issues in the Sharing Economy” – comes at a time when the sharing economy, especially in the hospitality and transportation sectors, is growing in popularity at breakneck speed. The total value of global sharing economy transactions was estimated at $26 billion in 2013 and is projected to reach as much as $110 billion in coming years.

At the same time, consumers are recognizing the benefits of shared services: a recent study notes 86 percent of adults in the U.S. believe such services make life more affordable, while 83 percent believe they make life more convenient and efficient.

Sharing economy services – such as Uber, Airbnb, Etsy, and TaskRabbit, among others – rely heavily on online and mobile platforms for transactions and the peer-to-peer sharing of critical, ‘reputational’ information. This includes data regarding recommendations, ratings, profile access, review challenges, account deletion, and more. How access to and control of this data is managed by sharing economy brands and services is essential to building user trust, and has important privacy implications as well.

“Uber’s new option that provides riders with access to their ratings is an important step forward,” said Jules Polonetsky, FPF’s Executive Director. “If consumer access to services is dependent on ratings and reviews, consumers need transparency into their scores and into how these systems work.”

The FPF survey provides an overview of how reputation-building and trust are frequently essential assets to a successful peer-to-peer exchange, and how ratings, peer reviews, and user comments serve as core functions of such services. It examines the commonly used mechanisms to build reputation, as well as issues surrounding identity and anonymity, and the role of social network integration.

The highlight of the group’s study is a section entitled “Maintaining Reputation: Privacy Challenges of Rating Systems.” How sharing economy and peer-to-peer platforms implement Fair Information Practices for user-generated data, especially access and correction capabilities for users and providers, has tangible privacy implications.

As a result, FPF surveyed a number of market leaders in the sharing economy sectors of transportation (Lyft, Sidecar, Uber), hospitality (Airbnb, HomeAway, Couchsurfing), retail goods (Etsy, NeighborGoods, eBay), and general services (TaskRabbit, Instacart, Handy) to review how these platforms implement access and correction capabilities. Brands were examined to see how they implement access rights, correction and response mechanisms, and whether they provide clear guidance for deleting account information.

The report concludes with a call to action for many companies in the sharing economy marketplace, encouraging them to strive to provide more guidance to users about reputation and to be more transparent about access and control over information. Such moves will not only amplify consumer trust, but also help ensure fair treatment of consumers.

This is especially important for the future growth of the sharing economy sector, as the report notes:

“While platforms need to have good and reliable reputational systems in place in order to create trust between users, they will also have to ensure their users trust them. It is very likely that…users will rely on the platform’s reputation, in addition to user reputation alone.”

The survey was authored by FPF staffers Joseph Jerome, Benedicte Dambrine, and Ben Ambrose.

About Future of Privacy Forum

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board of leading figures from industry, academia, law, and advocacy groups. For more information, visit fpf.org.

Media Contact

Nicholas Graham, for Future of Privacy Forum

[email protected]

571-291-2967

Balancing Free Expression and Social Media Monitoring

Last week, central Florida’s largest school district announced that it would begin monitoring a number of social media sites for posts “that may impact students and staff.” As more and more school districts look to social media to monitor and track students, the practice raises big privacy questions. Certainly, many schools have reacted to school shootings, student suicides, and bullying concerns by connecting with social-media-monitoring companies to help them identify problems that may require action by school personnel, parents, or even law enforcement. In fact, when tragedies have taken place, the first reaction has often been to scour social media to see whether there were clues that should have led to action or intervention.

Parents appear to have largely accepted this general practice, but the limits of schools’ tracking and monitoring of their students remain unclear. As Jules and I explored in an op-ed for Education Week last month, we don’t yet know how to strike the right balance between monitoring students and allowing individuals to vent, blow off steam, and otherwise express themselves freely online without feeling surveilled.

While this story deals largely with monitoring students, the public at large holds contradictory and conflicting views about social listening. According to a 2013 Netbase study, 51% of consumers want to be able to talk about companies without those companies listening, yet 58% want companies to respond to complaints and 64% want companies to respond when spoken to. To avoid the notorious “creepy” label, schools – and indeed any organization – ought to be open and transparent about why they’re listening and what they’re listening for.

We hope to explore this issue further, and we welcome any thoughts and feedback from anyone out there . . . listening.

-Joseph Jerome, Policy Counsel

Talking Cars and the Internet of Things at TRUSTe's IoT Privacy Summit

Future of Privacy Forum is excited to partner with TRUSTe to provide attendees with a full day of case studies, workshops and panels at the second IoT Privacy Summit on June 17th in Menlo Park, California. This year’s Summit focuses on practical solutions to the privacy challenges brought on by the Internet of Things, with topics focusing on key FPF priorities like connected cars, smart cities and homes, wearable devices, and more.

FPF’s Joseph Jerome will participate in a panel titled “How the Automobile Industry Took the Lead in Industry Self-Regulation,” along with representatives from General Motors and Hogan Lovells. The panel will discuss how carmakers came together to address privacy issues head-on as vehicles become increasingly connected – and data-fueled. The group will also discuss how a set of automotive privacy principles was developed, and what the industry is doing to implement them ahead of their 2016 start date. Click here to view a current list of other speakers.

Ahead of the Summit, on June 16th, FPF will also participate in the IoT Privacy Tech Working Group. The group will meet to identify both the technical standards and best practices necessary to help enhance consumer privacy in the IoT. More information about the IoT Privacy Summit 2015 is available here.

Most importantly, to register, click here. We look forward to discussing the Internet of Things next month!

Communicating with Parents about Student Privacy

Recently, a number of bills aimed at protecting student data privacy have been proposed at both the federal and state levels. Laws, codes of conduct, better contracts, and training all play key roles in ensuring that student data is used responsibly. However, the most important effort, and one that has yet to be well addressed, is communication between schools and parents.

At FERPA|SHERPA, we highlight school districts that have succeeded in offering parents a clear description of the technologies they use and the data they collect. We were pleased to recently come across another great example of parent communication at the Smithfield Public School District in Smithfield, RI. Smithfield provides a webpage that informs parents about the applications used in the district, the purposes for which they are used, and the information they collect. The webpage also links to the privacy policy, terms and conditions, and other relevant information for each educational application.

Kudos as well to Smithfield for designating two leading school officials, Paul Barrette, the school department’s director of technology, and Craig Levis, special education director, as the privacy leads for the district. It is essential that schools appoint privacy officers and institute appropriate training if we expect compliance with laws and policies.

Please let us know of any other schools that have provided helpful communications for parents, so that we can share great examples that the school community can learn from.

Pew Tackles the Future of Privacy

On Wednesday, the Pew Research Center released its third report on Americans’ attitudes toward privacy and surveillance. While the report confirms previous findings that, no, privacy is not dead, it takes a broader look at Americans’ views on privacy in public and control over information. It finds that privacy values are particularly heightened with respect to “having a sense of control over who collects information and when and where activities can be observed.”

Nearly all adults report that who is gathering information, and what information is gathered, are essential dimensions of privacy control. Strong majorities believe, with 74% believing “very strongly,” that it is important to be in control of who can get information about you. The home continues to be viewed as a “do not disturb” zone, which may present interesting implications for the emerging Internet of Things. And by a 2-to-1 margin, Americans believe in limits on employer monitoring of employees.

One particularly interesting set of findings concerns Americans’ views on data retention broadly. Most Americans believe that only “a few months” or less is long enough for companies to store most records of their activities. Different industry sectors get more or less leeway. For example, majorities support credit card companies retaining their data, but even here, the lengths of time people consider reasonable retention periods vary. Once again, strong majorities were skeptical of the need for online advertisers to “save any info” about them for lengthy periods of time, if at all.

The Future of Privacy Forum’s Capitol-Area Academic Network was privileged to discuss the Pew privacy project with the report’s authors last fall, and Pew’s series continues to demonstrate not only the value of privacy but also the strong need to think about better ways to offer privacy controls and communicate practices to consumers.

-Joseph Jerome, Policy Counsel

NYC Taxi & Limousine Commission Proposal Raises Privacy Concerns for Apps

On Monday, the Future of Privacy Forum joined with the Bill of Rights Defense Committee/Defending Dissent Foundation, Center for Democracy & Technology, The Constitution Project, and Electronic Frontier Foundation to write the NYC Taxi and Limousine Commission (TLC) about its proposed rules regarding For-Hire Vehicle dispatch apps.

We were especially concerned with the requirement that apps be automatically capable of “collecting and transmitting” a wide array of data including the requested pick-up time, date, and location, which could be collected even in the event that the passenger later cancelled the trip. The proposed rules provide no guidance with regard to when and how such transmission would occur, suggesting this data could be requested at the sole discretion of TLC.

This sort of broad data collection by a government agency presents important privacy issues. In particular, it raises key Fourth Amendment concerns and permits wide swaths of sensitive data to potentially be released publicly through state Freedom of Information laws. Several news reports have previously demonstrated how even allegedly anonymized taxicab data can be “reverse engineered” to reveal passenger names and pick-up and drop-off location information.
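The best-known example of this weakness involved New York City trip records in which taxi medallion numbers were “anonymized” with an unsalted MD5 hash. Because medallions come from a small, structured namespace, an attacker can hash every possible value in advance and reverse the pseudonyms by lookup. The sketch below illustrates the idea in Python; the single digit-letter-digit-digit medallion format is a simplifying assumption for illustration, but real medallion formats are similarly tiny search spaces.

```python
import hashlib
import itertools
import string

def build_rainbow_table():
    """Precompute MD5 hashes for every medallion of the assumed form
    digit-letter-digit-digit (e.g. "5D43") -- only 26,000 candidates."""
    table = {}
    for d1, letter, d2, d3 in itertools.product(
        string.digits, string.ascii_uppercase, string.digits, string.digits
    ):
        medallion = f"{d1}{letter}{d2}{d3}"
        table[hashlib.md5(medallion.encode()).hexdigest()] = medallion
    return table

table = build_rainbow_table()

# A "pseudonymized" record is instantly reversible by dictionary lookup:
hashed = hashlib.md5(b"5D43").hexdigest()
print(table[hashed])  # -> 5D43
```

Because the entire candidate space can be enumerated in well under a second, hashing alone provides no meaningful anonymity here; techniques such as keyed hashing or aggregation are needed instead.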

Everyone understands the TLC’s need to regulate FHVs, and that mobile apps are increasingly the mechanisms that govern these services. FPF in particular has been a strong proponent of smart city initiatives and of using trip data to optimize traffic flows, improve the environment, and advance safety.

Nonetheless, we urge the TLC to seriously consider the privacy challenges posed by its proposal. Our letter encourages the Commission to engage in a more in-depth consultative process with privacy experts, organizations and the public in order to determine how to achieve TLC’s goals to guide FHV apps without unnecessarily placing passengers’ privacy at risk. The full letter is available to read here.

A Historical Primer on Section 215 Bulk Collection

Over on the IAPP’s Privacy Tracker blog, FPF Senior Fellow Peter Swire explains how the past week saw two significant events concerning Section 215 of the USA PATRIOT Act. First, on May 7, the Second Circuit ruled that “the telephone metadata program exceeds the scope of what Congress has authorized and therefore violates” Section 215. And yesterday, the House of Representatives approved the USA FREEDOM Act by a vote of 338-88, which could limit by statute the collection of domestic telephone metadata and other records under Section 215. According to Swire, this week’s activities will likely have important long-term effects on surveillance policy.

Parents’ Rights to Student Data Privacy

FUTURE OF PRIVACY FORUM INTRODUCES DIGITAL GUIDE TO EQUIP PARENTS WITH KNOWLEDGE, UNDERSTANDING OF LAWS GOVERNING STUDENT DATA USE AND PRIVACY

New Website Developed in Partnership with National PTA, ConnectSafely.org

WASHINGTON, D.C. – Monday, April 27, 2015 – As the digital revolution continues to transform students’ learning, how teachers instruct in classrooms, and the way schools gather pupil data and information to improve education, the Future of Privacy Forum (FPF) today unveiled a new, timely web resource to provide clear and straightforward overviews of parents’ rights to student data under the various relevant federal laws in effect.

“A Parent’s Guide to Student Data Privacy Rights”, developed and published in partnership with the National PTA and ConnectSafely.org, is a valuable tool for parents seeking answers and guidance related to major federal laws on education privacy, such as the Family Educational Rights and Privacy Act (FERPA) and the Children’s Online Privacy Protection Act (COPPA), as well as other laws and policies. It is also designed to help parents better communicate on these topics with teachers, schools and school districts.

“As data and technology use expands to improve educational outcomes, the focus needs to stay on parents and students, and it is essential for parents to understand how their children’s information is being handled and used,” said Jules Polonetsky, Executive Director, FPF. “We developed this guide with the National PTA and ConnectSafely to help parents understand the laws that protect student data and their rights under these laws.  We live in an increasingly connected world, but whether the information is online or on paper, the basic privacy and access rights remain the same.”

The free, digital guide also includes a list of additional resources on student privacy topics from leading organizations.

The guide also answers common questions parents have about how student data is collected, used, and protected.

“Technology and the Internet are powerful tools for teaching and learning, but at the same time, it is imperative that students’ academic and personal information is protected,” said Otha Thornton, Jr., President, National PTA. “It is a top priority of National PTA to safeguard children’s data and make certain that parents have appropriate notification and consent as to what and how data is collected and used. National PTA is pleased to collaborate with the Future of Privacy Forum and ConnectSafely.org to bring the Parents’ Guide to Student Data Privacy to families nationwide to ensure they are knowledgeable about the laws that protect student data as well as students’ and parents’ rights under the laws.”

“As schools increase the use of technology and student data, parents have privacy concerns regarding their children’s information,” said Olga Garcia-Kaplan, parent and advocate for student data privacy. “This guide provides a clear and concise explanation for parents to understand their rights and laws that are in place to protect their privacy.”

The guide is part of FPF’s “FERPA|SHERPA” website, which provides service providers, parents, school officials, and policy makers easy access to laws, best practices, and guidelines that are essential to understanding privacy issues in education and how to responsibly use student data.


Rise of the Drones

This morning, the Center for Strategic and International Studies presented a panel conversation on some of the challenges – and opportunities – around domestic drone use. After years of following the issue, it appears that drone policy’s day has finally arrived. According to the FAA, nearly 4,500 comments were submitted in response to the agency’s proposed rulemaking for drones, or unmanned aircraft systems (UAS), and the NTIA received over 50 comments specifically on privacy issues around drones.

While the Future of Privacy Forum continues to think about how best to address these issues, there is little question that domestic and commercial drone use offer tremendous societal benefits. At the panel, Brian Wynne, President of AUVSI, a leading robotics trade association, explained that drones will create over $83 billion in economic activity in their first decade, and promise to generate tens of thousands of jobs. Wynne suggested that it is “almost impossible to anticipate all the different ways we can utilize UAS moving forward.”

Even the ACLU’s Jay Stanley, who remains concerned about law enforcement’s eagerness to use drone technology, admitted that drones could be a “generative technology” in the private sector.  He noted that privacy issues on the commercial side are incredibly complicated, not only implicating the First Amendment but perhaps lacking the sort of privacy-invasive incentives that could exist in law enforcement. Indeed, his “nightmare scenario” is a world where drones could be used for persistent surveillance, while members of the public, including journalists and entrepreneurs, will be hamstrung in their ability to use UAS technologies.

Stanley applauded the “outpouring” of interest in drone privacy. He echoed the notion that privacy concerns around drones are especially salient to the public. “It’s not hard to see the privacy issues with a drone with a camera on it,” he said. “Things like big data are more abstract.”

Adam Cox, from CSIS and an advisor to DHS’ Advanced Research Projects Agency (HSARPA), suggested that privacy and drones present a “technically sexy problem.” “A lot of people are interested [in the technology],” he explained, and because drones allow everyone to engage in flight, “people are going to want to put new things on this.” He encouraged technologists to work hand-in-hand with policymakers, recommending both geofencing solutions and education efforts aimed at both manufacturers and operators of drones.

On that front, the Future of Privacy Forum is eager to engage. Last week, we filed comments with the NTIA on privacy and domestic drone use. A wide variety of individuals and organizations also submitted comments ahead of a new drone privacy multistakeholder effort, and it is clear that there are a number of ideas in play for how to address data collection and use by drones, as well as how to address public concern about a loss of privacy from above.

In addition to several procedural recommendations, our comments focus on the value of transparency and training in addressing privacy concerns. We recognize that drones present different types of transparency challenges, both for general practices and for individual drone flights. In both instances, we support conversations about what sort of information could be communicated to consumers in a way that does not place significant burdens on individual UAS operators. Further, while drone operation will require at least some degree of safety training, we are hopeful that some type of privacy training can be incorporated as well.

As today’s panel and all these comments suggest, there is much work to be done to figure out how general privacy principles can be applied to a diverse array of UAS technologies. While we support a technology-neutral approach, it is clear that consumers, businesses, and policymakers all need a voice in determining how commercial drones can and should take flight.

-Joseph Jerome, Policy Counsel

FPF Senior Fellow Peter Swire Provides Comments to the FCC on Broadband Consumer Privacy

Later today, Peter Swire, FPF Senior Fellow, will participate in the FCC’s public workshop on broadband consumer privacy. He has also prepared written comments expanding on his thoughts. Professor Swire summarizes his research as follows:

First, I examine the effect of the Section 222(a) definition of “proprietary information” as compared with the Section 222(c) definition of “customer proprietary network information” (CPNI). My conclusion, based on some analogous provisions from HIPAA and GLBA, is that the Commission should be cautious about founding any additional regulatory requirements under this proceeding on the language in 222(a).

Second, I examine the intersection of privacy and competition law, drawing on my previous writings in the area. New entry into online advertising, including by broadband providers, could be a new source of competition on privacy attributes. My recommendation to the Commission is to consider the effects of this potential competition on privacy and other non-price aspects of competition, along with price aspects of competition, as part of the overall assessment of how to govern the use of CPNI for broadband providers.

Third, I address priority uses of information that I believe should be permitted in the CPNI context. Although I do not seek to create a complete list of possible exceptions to the general CPNI rule of consumer opt-out, I do emphasize three areas where an opt-out is not generally appropriate – anti-fraud, cybersecurity, and research on network usage. I also analyze the role of de-identification and aggregate information under Section 222, suggesting strategies to preserve the utility of de-identified and aggregate information while protecting privacy. In this discussion, I do not take a position on whether a rules-based, principles-based, or other approach should be adopted by the Commission. Instead, I emphasize that important interests such as anti-fraud and cybersecurity should be taken into careful consideration in whatever approach the Commission pursues.

He concludes that translating Section 222 privacy protections to the broadband sector is far from a simple task, noting the “considerable technical and market differences from the telephone market governed by the 1996 CPNI rules.”

Professor Swire’s full written comments are available here.