Moving the Internet of Things Forward Without Hard Numbers on Risks

Today’s release of the FTC’s long-awaited report on the Internet of Things concludes that connected devices are “likely to meld the virtual and physical worlds together in ways that are currently difficult to comprehend.” Many of the revolutionary benefits – and the more abstract risks – of connectivity seem to lie in this great unknown. While the Future of Privacy Forum was largely supportive of the report, the separate statement of Commissioner Ohlhausen and the dissent from Commissioner Wright reflect the reality that it is still early days for the Internet of Things, and the path forward is not crystal clear.

Both commissioners caution the FTC against focusing on speculative harms and urge a more rigorous cost-benefit analysis. For Commissioner Ohlhausen, this perspective dictates that the Commission approach new technologies with a degree of regulatory humility, while Commissioner Wright is more blunt: “Paying lip service to the obvious fact that the various best practices and proposals discussed in the Workshop Report might have both costs and benefits, without in fact performing such an analysis, does nothing to inform the recommendations made in the Workshop Report.”

There is some merit to this criticism. At the end of the day, I always want hard numbers to back up my policy positions and lament the fact that I often don’t have them. The problem is that when it comes to privacy – particularly privacy harms – evaluating costs is difficult. Academics have struggled for years, through efforts both serious and silly, to quantify the value of privacy, and the Internet of Things appears to have only added to the perception that there are new, more abstract threats to privacy. Existing risk assessment frameworks are well suited to identifying and addressing tangible harms, such as financial loss or security vulnerabilities, but it is much harder to measure the chilling effects, if any, of smart thermostats and televisions.

It is not sufficient to simply dismiss concerns as too abstract or inchoate. By now, it should be increasingly clear that the public’s sense that individuals have lost control over their information is problematic. It’s unclear whether the FTC alone is in a position to ensure that privacy and security concerns about connectivity do not undermine public confidence in the wider Internet of Things. So what can be done?

In some spaces, the Internet of Things seems to finally open the door for companies to compete on privacy in a way that consumers can understand. The Association of Global Automakers, for example, views its recently released automotive privacy principles as a floor for its members and fully expects car companies to compete on privacy, both because of our close relationships with our cars and because security concerns loom so large. There are plenty of opportunities for companies to be proactive on privacy. Industry efforts are already underway to establish a baseline for how wearable devices should be treated and how algorithms should be governed. Recently, FPF has called for companies to engage in a more detailed and concrete analysis of the benefits of their data projects. Part of the aim of this effort is to encourage industry to develop more elaborate and more structured processes, such as review panels and “consumer” IRBs, that can seriously consider the ethical challenges posed by some innovative uses of data.

Beyond that, the Internet of Things promises new opportunities for users to engage with their information. Connectivity tools that offer users controls will be important not just for privacy but for basic functionality. I was pleased to see the FTC report highlight our calls for consumer profile management portals. Implemented correctly, better user dashboards will give individuals the ability to decide for themselves what information about them is collected, used, and shared. These tools respect user autonomy, letting individuals choose the benefits they want – and the risks they’re willing to tolerate. Better, more individualized control obviously doesn’t resolve collective privacy challenges, but it is one option to look at alongside codes of conduct, traditional benefit-risk assessments, and review mechanisms. Until we can find a better way to measure and analyze the societal value of privacy, this multi-pronged approach is the best way forward on the Internet of Things.
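To make the dashboard idea concrete, here is a minimal sketch of how a profile management portal might represent a user’s per-category choices. The categories, field names, and default-deny rule are our own illustrative assumptions, not anything specified in the FTC report.

from dataclasses import dataclass, field

# Hypothetical per-category privacy choices a dashboard might expose.
# Category and field names are illustrative only.
@dataclass
class CategoryChoice:
    collect: bool = False   # may the device collect this data at all?
    use: bool = False       # may the provider use it for its own features?
    share: bool = False     # may it be shared with third parties?

@dataclass
class PrivacyProfile:
    user_id: str
    choices: dict = field(default_factory=dict)  # category name -> CategoryChoice

    def allows(self, category: str, action: str) -> bool:
        # Default-deny: anything the user has not affirmatively enabled stays off.
        choice = self.choices.get(category)
        return bool(choice and getattr(choice, action, False))

profile = PrivacyProfile("user-123")
profile.choices["thermostat.occupancy"] = CategoryChoice(collect=True, use=True)
print(profile.allows("thermostat.occupancy", "share"))  # False: never enabled

The point of the default-deny check is that silence never counts as consent: a category the user has not touched is treated as off.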

-Joseph Jerome, Policy Counsel

FPF Statement on FTC Internet of Things Report

The Report, which appropriately does not call for IoT-specific legislation, reflects the fact that the Internet of Things is in its infancy, and it strongly supports context as a way to assess appropriate uses. The staff recognized concerns that a notice-and-choice approach could restrict unexpected new uses of data with potential societal benefits, and sensibly incorporated certain elements of the use-based model into its approach, linking choices to context so that they take into account how the data will be used.

However, the report is overly cautious: it recognizes that there are beneficial uses that can strain data minimization or warrant out-of-context uses, but it worries that allowing companies alone to judge the bounds of such uses, without legislation, would lead to mistakes. In many cases, the FTC already has the ability to use its deception or unfairness authority to take action when a company creates risk to consumers without countervailing benefit. We hope the Administration’s soon-to-be-released Consumer Bill of Rights charts options that can frame the parameters for out-of-context uses or data retention, by looking to codes of conduct and consumer subject review boards.

-Jules Polonetsky & Christopher Wolf, Future of Privacy Forum

Student Privacy Boot Camp for Ed Tech Companies

The Future of Privacy Forum – which recently spearheaded the White House-endorsed “Student Privacy Pledge” with SIIA – has now partnered with ReThink Education to present a new and timely “Student Privacy Boot Camp” for ed tech companies.

The goal of the training program is to gather ed tech companies – startups, small- and medium-sized companies – for detailed legal and policy presentations to help them understand the regulatory requirements and industry best practices to properly handle student educational data in a complex and rapidly changing environment.

Presenters and panelists come from individual schools or districts, the FTC, the Department of Education, and a variety of education-focused advocacy and policy organizations.

The Boot Camp will take place February 17-18 in Washington, D.C. at 1776dc, with an afternoon session, a morning session, and a networking dinner the first evening. Because attendance is limited, companies must apply for the opportunity to participate. Companies that are selected and then commit to attend will have their expenses covered for their participation.

The preliminary schedule and the application for companies desiring to attend are available here. The schedule will continue to be updated as presenters are identified and times are refined. For additional information (other than to track your application), please contact Brenda Leong, Legal and Policy Fellow at the Future of Privacy Forum, at [email protected] or 202-792-8801.

Student Privacy Pledge Hits 90 Signers!

We are pleased to announce that as of today, additional signatories to the FPF/SIIA Student Privacy Pledge include Aegis Identity, Agency for Student Health Research, Avepoint, besimpler, CPSI Ltd., Google, Khan Academy, Kidhoo, makkajai, MMS, National Student Clearinghouse, Navvie, Ripple Effects, Student Lap Tracker, and Tools4Ever.

Beacons in Airports Provide Information for Travelers

Readers know we support responsible beacon technology practices. Today’s story illustrates how airports can use beacons to provide travelers with real-time updates about travel plans, accommodations, and flights.

According to Luxury Daily, recent surveys show that 53 to 77 percent of travelers in the United States would like airports to send real-time updates on gate changes and flight times to their mobile phones. Companies like Swirl believe that because beacons can offer travelers a valuable and personalized mobile experience, travelers would be willing to share some level of personal information. However, this is an opt-in service: a customer or traveler would first need to download a mobile app and opt in to receive these beacon-triggered messages and content.
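As a rough sketch of what that consent gate might look like in practice – the field names, categories, and logic here are our own illustrative assumptions, not Swirl’s or any airport app’s actual design – a beacon-triggered message would be delivered only after checking the traveler’s opt-in state:

from dataclasses import dataclass, field

# Hypothetical model of a traveler's beacon consent state; names are
# illustrative, not any real app's schema.
@dataclass
class Traveler:
    has_app_installed: bool = False
    opted_in_to_beacons: bool = False
    allowed_categories: set = field(default_factory=set)

def should_deliver(traveler: Traveler, category: str) -> bool:
    """Deliver a beacon-triggered message only behind an explicit opt-in."""
    if not traveler.has_app_installed:    # no app, no channel to the traveler
        return False
    if not traveler.opted_in_to_beacons:  # opt-in, never opt-out by default
        return False
    return category in traveler.allowed_categories  # per-category choice

traveler = Traveler(has_app_installed=True, opted_in_to_beacons=True,
                    allowed_categories={"gate_change"})
print(should_deliver(traveler, "gate_change"))  # True
print(should_deliver(traveler, "store_deal"))   # False: never enabled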

Some airports, like Miami International Airport, have rolled out programs that use beacons to help travelers find the correct gate and to send push notifications about restaurant and store deals as travelers walk around. Airports and travel brands that are interested in implementing beacon technology must be willing to invest in some infrastructure or technological changes, and to develop partnerships with retail brands and mobile payment providers.

This opt-in use of beacons in airports is just one example of the many ways beacon technologies are growing this year and providing value to mobile users in a privacy-friendly manner.

President Obama Backs FPF-SIIA Student Privacy Pledge

The Pledge Is A Strong Means Of Protection For Student Personal Information

Supporters include Microsoft, Apple, Amplify, Houghton Mifflin Harcourt, Edmodo, Lifetouch, Knewton, Code.org, Shutterfly, Clever, eScholar, Class Dojo, DreamBox Learning

Washington, D.C. – Monday, January 12, 2015 – President Obama today strongly endorsed the Student Privacy Pledge, calling for more companies to make a firm commitment to using student data only for educational purposes.

“We developed the Pledge to provide a way for school service providers to clearly explain to parents, students and teachers how data is being used to support student education,” explained FPF Executive Director Jules Polonetsky. “And, in a gridlocked Congress where federal legislation faces challenges, the Pledge creates an immediate and enforceable legal code for companies that sign on. The Administration was instrumental in helping to get the word out, and its early support was important in getting many companies interested.”

In October 2014, the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) announced the Student Privacy Pledge for K-12 school service providers to safeguard student privacy, outlining a dozen commitments regarding the collection, maintenance, and use of student personal information.

Seventy-five companies have now signed the Pledge to publicly declare their commitment to student privacy and to make a legal promise to follow the principles. The Pledge is effective as of January 1, 2015.

The Pledge commitments detail ongoing industry practices to ensure responsible, fair handling of student data. The Pledge applies to all student personal information whether or not it is part of an “educational record” as defined by federal law, and whether collected and controlled by a school or directly through student use of a mobile app or website assigned by their teacher. It also applies to school service providers whether or not they have a formal contract with the school.

Signers of the Student Privacy Pledge have committed to a dozen principles governing the collection, maintenance, and use of student personal information.

The Pledge was developed by FPF and SIIA with guidance from school service providers, educator organizations, and other stakeholders, following a convening by U.S. Representatives Jared Polis (CO) and Luke Messer (IN). The Pledge has been endorsed by the National PTA and the National School Boards Association.

“Congressmembers Polis and Messer helped kick off the idea of a pledge and were critical to hammering out a privacy-friendly set of rules that ensure data is protected and used to benefit student education,” said Jules Polonetsky, Executive Director of the Future of Privacy Forum.

The initial leadership group of companies that launched the pledge in October included Amplify, Code.org, DreamBox Learning, Edmodo, Follett, Gaggle, Houghton Mifflin Harcourt, Knewton, Knovation, Lifetouch, Microsoft, MIND Research Institute, myON (a business unit of Capstone), and Think Through Math.

The full text of the Pledge and more information about how to support it, including a list of current signatories, are available at http://studentprivacypledge.org/.

A Practical Privacy Paradigm for Wearables

A Practical Privacy Paradigm for Wearables is available to read here.

 * * * * * *

Only a week into 2015, and already it looks to be the year of wearable technologies. At this year’s International Consumer Electronics Show (CES), wearables and the Internet of Things have dominated the conversations and the exhibition halls. With 900 Internet of Things exhibitors at the conference, it’s clear that consumers will be offered many new ways to immerse themselves in connected life. More than just fitness bands and smartwatches, consumers will soon be reaching for “smart” tennis rackets, coffee makers, pacifiers, stovetops, and pet accessories.

However, as FTC Chairwoman Edith Ramirez reminded us all yesterday in a speech at CES, the Internet of Things is a complex system with “the potential to provide enormous benefits for consumers,” but also “significant privacy and security implications.” The Chairwoman’s speech focused on three key privacy challenges arising from the IoT, as well as three key steps companies can take to enhance consumer trust and ensure that consumers continue to adopt these new technologies. The three core privacy risks she highlighted were: “(1) ubiquitous data collection; (2) the potential for unexpected uses of consumer data that have adverse consequences; and (3) heightened security risks.” To help mitigate these risks, Ramirez believes that companies should: (1) adopt “security by design”; (2) engage in data minimization; and (3) increase transparency and provide consumers with notice and choice for unexpected data uses.

The Future of Privacy Forum’s new paper, A Practical Privacy Paradigm for Wearables, addresses these same concerns. The paper examines how wearable technologies are challenging traditional applications of the Fair Information Practice Principles (FIPPs) and why policymaking in this area requires a forward-thinking, flexible approach. The FIPPs have long provided the foundation for consumer privacy protection in this country, and they still embody core privacy values. However, a rigid application of them may not always be feasible in the fast-paced world of wearables and the nascent IoT. Both the technologies and the social norms around these devices are developing quickly, and holding innovative new designs and data uses to privacy standards developed for other industries could stymie the next technological revolution.

We certainly agree with Chairwoman Ramirez that the IoT creates new and challenging privacy risks, and that traditional privacy principles like notice and choice and data minimization will play an important role in these spaces. However, we urge policymakers to take a nuanced approach to the application of these principles to wearable devices, particularly in these early days of their development. As the Chairwoman noted, it will take “ingenuity, design acumen, and technical know-how” to provide consumers with useful notice and choice for their wearables. Wearables come in as many shapes and sizes as consumers do, and one-size-fits-all solutions will not be feasible.

Another challenging issue our paper examines is the intersection of wearables and Big Data: wearables’ capacity for granular, ubiquitous data collection opens the door both to new and important health, efficiency, and personal benefits and to significant privacy risks. We agree with the Chairwoman’s caution against collecting and holding consumer information on the mere off-chance that it could become valuable someday. However, we also believe that novel data uses do sometimes develop from data collection based on speculative or “pure research” purposes, and that allowing for these uses is essential. In order to unleash the benefits of big data, researchers and organizations need the opportunity to look for unanticipated insights in datasets like those created by consumers using their wearables to track their own activities.

Rather than immediately imposing restrictions on data collection, we believe organizations should engage in comprehensive analyses weighing both the potential risks and the potential rewards of putting data to a particular use. FPF has previously published a methodology for this sort of serious assessment in our paper Benefit-Risk-Analysis for Big Data. By identifying and quantifying both the risks and the benefits of handling data in a particular manner, companies can more rationally determine when a certain use is appropriate or when to scale back data collection. “Trust us” is not a sufficient rationale for careless handling of consumer data, and comprehensive risk-benefit analysis prevents thoughtless decision-making. By engaging in case-by-case balancing, we can allow for novel data uses and big data breakthroughs only when and where the benefits to individuals and society outweigh the risks.
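To illustrate the shape of that case-by-case balancing – with factors, weights, and a threshold that are entirely made up for this example, and far simpler than the methodology in the paper – one might score a proposed data use like this:

# Toy sketch of benefit-risk balancing for a proposed data use.
# All factors and numbers below are invented for illustration only.

def weighted_score(factors):
    """Sum of likelihood * magnitude over (likelihood, magnitude) pairs."""
    return sum(likelihood * magnitude for likelihood, magnitude in factors)

def proceed_with_use(benefits, risks, margin=1.0):
    """Allow the use only when expected benefits outweigh expected risks."""
    return weighted_score(benefits) > weighted_score(risks) * margin

# Example: sharing de-identified wearable data for health research.
benefits = [(0.8, 5.0),  # likely, large health-research benefit
            (0.5, 2.0)]  # possible product-improvement benefit
risks    = [(0.2, 4.0),  # unlikely but serious re-identification risk
            (0.6, 1.0)]  # likely but minor consumer annoyance

print(proceed_with_use(benefits, risks))  # True under these made-up numbers

The value of writing the balancing down, even this crudely, is that the assumptions behind a “go” decision become explicit and reviewable rather than implicit.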

In addition to examining the need for common sense applications of the FIPPs, the paper presents a variety of industry solutions necessary to support such a framework. We wholeheartedly support the Chairwoman’s call for security by design, recommending that “organizations be prepared to defend consumers’ personal data against both internal threats, such as curious employees, and external threats, such as hackers or scammers,” as well as her recommendation that organizations engage in practical de-identification practices. We also suggest companies respect the context of data collection, be transparent about how they use consumers’ personal information, provide reasonable individual access to data, and help develop binding codes of conduct for wearables.

2015 may be the year that wearables go mainstream, both for consumers and for privacy professionals and policymakers. While companies continue to develop new ways to connect our digital and physical worlds, there are many more discussions to be had about how these devices will fit into our lives. The wearables industry needs time to mature, and users need time to learn what they want and expect their wearables to do for them and with their personal information. Already, platforms such as Apple HealthKit and Google Fit are developing baseline rules to protect consumers’ privacy. Moving forward, we urge policymakers to adopt a forward-thinking, common sense application of the FIPPs in the wearables space.

– Kelsey Finch