Future of Privacy Forum Awarded National Science Foundation Grant to Support Industry-Academic Collaboration on National Privacy Research Priorities
The Future of Privacy Forum (FPF) has received a $300,000, two-year grant from the National Science Foundation (NSF) to establish a Privacy Research and Data Responsibility Research Coordination Network (RCN). The RCN will build a community of academic researchers and industry practitioners to support industry-academic cooperation on the research priorities identified in the Administration’s recently released National Privacy Research Strategy (NPRS). The NPRS identifies several priorities for privacy research, including increasing transparency of data collection, sharing, use, and retention; ensuring that information flows and uses are consistent with privacy rules; and reducing the privacy risks of analytical algorithms.
The NSF grant will provide FPF with the necessary support to launch and regularly convene the RCN, drawing on its relationships with industry chief privacy officers, academic researchers, regulators, lawyers, social scientists, civil rights advocates, social philanthropists, and government actors, to facilitate communication and sharing of information and ideas. The RCN will prompt industry action and advocate for privacy-aware approaches to collecting and processing personal information in a manner that respects individual privacy, equality and fairness.
RCN activities will include:
a series of academic workshops and convening opportunities, incorporating both new and established academic programs; and
network communications, including the introduction of a Privacy Scholarship Reporter.
“The overarching goal of the National Privacy Research Strategy is to produce knowledge and technology that will enable individuals, commercial entities, and the government to benefit from transformative technological advancements, enhance opportunities for innovation, and provide meaningful protections for personal information and individual privacy,” said Jules Polonetsky, FPF’s CEO. “At a time when industry actors are the custodians of a wide range of consumer data, bringing together corporate, academic, and advocacy constituents is critical to making practical privacy progress.”
The RCN will inform the public debate on privacy, provide useful information to policymakers, and contribute to the development of systems and products used to help society realize the benefits of networked information technology without sacrificing privacy and individual rights.
For more information, and to learn how to become involved with the efforts of the RCN, visit fpf.org/rcn.
First Take: Privacy in the Federal Automated Vehicles Policy
In the federal guidance for autonomous vehicles issued yesterday, the Department of Transportation and the National Highway Traffic Safety Administration have wisely recognized that privacy will play a key role in promoting trust in connected vehicles. This guidance and its emphasis on privacy is an important first step in building that trust.
The highly anticipated federal guidance calls for companies to complete annual 15-point safety assessments, and outlines model state policies and regulatory tools that will enable adoption of autonomous vehicles on U.S. roads. As FPF has highlighted previously, new vehicle technologies have the potential to greatly reduce motor vehicle deaths while increasing overall safety and convenience: 94% of the 35,000 annual motor vehicle deaths are caused by human error, many of which could be prevented by new accident-avoidance technologies. These life-saving technologies increasingly rely on the new types of data and sensors built into connected cars.
The DOT guidance calls for entities involved in manufacture, design, testing or sale of highly automated vehicle systems in the United States to submit a safety assessment letter that outlines their adherence to the guidelines on 15 specific topics, including privacy and cybersecurity. The letters should be issued at least four months before active public road testing begins on a new automated feature, and again after significant updates. The assessments will initially serve as an optional exercise, but the agencies indicated that they may become mandatory after a public comment period kicked off by yesterday’s guidance and a potential future rulemaking.
The core privacy component of the guidance highlights privacy principles set forward in the White House Consumer Privacy Bill of Rights (CPBR) and the Alliance of Automobile Manufacturers and Global Automakers’ Privacy Principles. It calls for manufacturers, as well as other entities, to take steps to protect consumer privacy by focusing on the Fair Information Practice Principles of transparency; choice; respect for context; data minimization, de-identification and retention; security; integrity and access; and accountability. The specific application of these concepts for satisfactory completion of the safety assessments will be negotiated in the public comment and review periods to come.
Unlike other aspects of the assessment, privacy is a focus for vehicles operating at lower levels of automation, not just for Highly Automated Vehicles. Much of the document applies only to levels 3-5 of autonomy, but the “Cross-Cutting Areas of Guidance,” under which the core privacy sections of the document fall, apply to all connected vehicles. While highly automated vehicles rely more heavily on data than others, cars at all automation levels increasingly incorporate new data-reliant technologies.
Data sharing is also a core component of the guidance, based on the understanding that safety and product improvements may be best achieved through sharing de-identified vehicle data among industry and regulators. The document also makes clear that NHTSA and the DOT increasingly see themselves relying on new vehicle data to implement their safety and oversight missions, through either special or general authority, and potentially by calling for enhanced data collection tools such as enhanced data recorders (EDRs).
The call for de-identification in data sharing leads the DOT to articulate its reliance on the definition of “personal data” in the Consumer Privacy Bill of Rights, as “data that are under the control of a covered entity, not otherwise generally available to the public through lawful means, and are linked, or as a practical matter linkable by the covered entity, to a specific individual, or linked to a device that is associated with or routinely used by an individual” (emphasis added). Linkability to an individual is considered key, as the guidance cites both the “as a practical matter linkable” standard from the CPBR and the “reasonably linkable” standard set forth by the FTC. This definition is discussed in a footnote under the “Data Recording and Sharing” section. Although the definition does not appear in the privacy section, this standard may represent the DOT’s current operational definition of “personal data.”
A future in which new kinds of mobility will expand transportation opportunities for all segments of society will depend on broad collection and use of data to ensure maximum safety and convenience for consumers. This framework certainly allows that use, and creates accountability guidelines that ensure data drives benefits for consumers and society. We look forward to being engaged in the comment and expert-driven processes to refine and implement this guidance.
Supporting Parental Choice for Student Data
Today, the Future of Privacy Forum (FPF) released Supporting Parental Choice for Student Data. The paper discusses the importance of trusting parents to make the final decision on when and where to share their child’s educational information outside of the school environment.
The paper acknowledges valid concerns: parents may not realize or understand everything they’ve been asked to share, to whom the data will be sent, or all the purposes for which the data can be used. However, it asserts that the right solution is not to prohibit parental consent outright, but to make it rigorous and informed, and to ensure any data shared will be used only for expressly authorized purposes.
Parents, as those most in-tune with their individual child’s needs, have the right to be an active partner and make the final decision about additional sharing and use of their child’s information.
Today, the Department of Transportation released the Federal Automated Vehicles Policy, which provides guidelines for the safe deployment of automated safety technology. Lauren Smith, FPF Policy Counsel, made the following statement:
“NHTSA has wisely recognized that privacy will play a key role in promoting trust in connected vehicles. Today’s guidance is an important first step in building that trust.
A future in which new kinds of mobility will expand transportation opportunities for all segments of society will depend on broad collection and use of data to ensure maximum safety and convenience for consumers. Today’s framework certainly allows that use, and creates accountability guidelines that ensure data drives benefits for consumers and society.”
Student Privacy Pledge Reaches Milestone of 300 Signatories
FOR IMMEDIATE RELEASE
September 12, 2016
Contact: Melanie Bates, Director of Communications, [email protected]
STUDENT PRIVACY PLEDGE REACHES MILESTONE OF 300 SIGNATORIES
The Student Privacy Pledge is a set of legally enforceable commitments companies can make to affirm that they safeguard student data
Launched in 2014 by the Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA), the Pledge has now reached over 300 signatories
Endorsed by President Obama, the National PTA, and the National School Boards Association
Washington, DC – The Future of Privacy Forum (FPF) and the Software & Information Industry Association (SIIA) are pleased to announce that the Student Privacy Pledge has passed a new milestone – over 300 ed tech company signatories. The Pledge is a list of commitments that school service providers can make to affirm that K-12 student information is kept private and secure. It has been endorsed by President Obama, the National PTA, and the National School Boards Association.
Companies that take the Student Privacy Pledge commit to twelve legally enforceable obligations, including that they will not sell student personal information and will not collect or use student personal information beyond what is needed for the authorized educational purposes. The commitments in the Pledge concisely detail existing federal law and regulatory guidance regarding the collection and handling of student data, and encourage service providers to more clearly articulate these practices.
The Pledge was introduced by FPF and SIIA in October 2014 with 14 signatory companies, and it took effect in January 2015 as a legally enforceable agreement for signing companies that provide services to schools. Since then, the number of Pledge signatories has substantially increased, reaching 200 in November 2015, and now passing 300.
“As students return to school for the Fall and teachers develop their curricula to incorporate the benefits of data and technology, companies that take the Pledge are ensuring that they are accountable for how they safeguard student data,” said Jules Polonetsky, CEO, Future of Privacy Forum.
“The continued strength in growth of this Pledge is indicative of the recognition within the industry of our duty to safeguard students and their personal information,” said Brendan Desetti, SIIA’s Director of Education Policy. “The Pledge’s enforceable provisions have also driven a rapid growth of the privacy-minded culture within companies today that places privacy first in the development process alongside functionality.”
The process for becoming a Pledge signatory is often an opportunity for companies to review their own privacy policies and make helpful updates—for example, to make the policy clearer or more understandable to parents and teachers. When a company requests to be added to studentprivacypledge.org, the team at FPF first reads that company’s privacy policy, and although FPF does not certify compliance, companies are only added as Pledge signatories if their policies do not contain any obvious inconsistencies with the text of the Pledge. Some frequent issues include the following:
Companies can easily avoid the most common issue: the policy for changing the policy. The provision most often flagged when companies request to take the Pledge concerns future changes to the privacy policy itself. Often, policies include language like, “we reserve the right to change this policy at any time, and we will notify you by posting the date of revision at the top of this page.” Because both the Pledge and the Federal Trade Commission require notice and choice for material changes to a privacy policy, a company will not be added as a Pledge signatory until this kind of provision is updated.
Companies should ensure that their privacy policies aren’t written more broadly than necessary. Many privacy policies are written by attorneys who construct the language to be as encompassing as possible. Almost invariably, the company isn’t actually doing anything that conflicts with the Pledge commitments – but an overbroad policy may nonetheless conflict with the Pledge.
FPF and SIIA are proud to facilitate the efforts of education technology companies who demonstrate industry leadership in protecting student privacy by signing the Student Privacy Pledge. Companies and organizations wishing to review the full text of the Pledge and consider participation are invited to visit www.studentprivacypledge.org or email [email protected].
About FPF
The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The forum is led by Internet privacy experts and includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. For more information, visit www.fpf.org.
About SIIA
SIIA is the leading association representing the software and digital content industries. The Education Technology Industry Network (ETIN) of SIIA serves and represents more than 200 of SIIA’s 800 member companies worldwide that provide educational software applications, digital content, online learning services and related technologies across the K-20 sector. SIIA-ETIN shapes and supports the industry by providing leadership, advocacy, government relations, corporate education, intellectual property protection, business development opportunities and critical market information. For more information, visit www.siia.net.
Data and the Future of Mobility: September 14 in San Jose, CA
Join the Future of Privacy Forum for a roundtable: “Data and The Future of Mobility”
Technology is transforming the safety and convenience of the vehicles in which we ride and drive. Along the way, Silicon Valley has become a major hub for auto manufacturers, technology companies, and other entities looking to innovate in the transportation space. Join us in San Jose for a roundtable discussion on data and the future of mobility.
Wednesday, September 14, 2016 from 2:00 PM to 5:00 PM (PT)
Speakers Include:
Jim Adler, Head of Data, Toyota Research Institute
Jonah Houston, Transportation and Mobility Lead, IDEO
Elliot Katz, Global Co-Chair, Connected and Self-Driving Car Practice, DLA Piper
Beth Hill, General Counsel, Chief Compliance Officer and Head of Purchasing, FordDirect
This afternoon event will include several short talks followed by an open, moderated discussion and a reception.
If you work in the connected car ecosystem on topics related to automotive privacy, contact Lauren Smith at [email protected] to request an invitation.
Future of Privacy Forum Releases Best Practices for Consumer Wearables and Wellness Apps and Devices
FOR IMMEDIATE RELEASE
August 17, 2016
Contact: Melanie Bates, Director of Communications, [email protected]
FUTURE OF PRIVACY FORUM RELEASES BEST PRACTICES FOR
CONSUMER WEARABLES AND WELLNESS APPS AND DEVICES
Document calls for restrictions on data sharing, enhanced notices, and informed consent for research
FPF also releases new study highlighting improvement in availability of app privacy policies, but gap for top health and fitness apps
Only 54% of sleep aid apps provide an easily accessible privacy policy
Washington, DC – Today, the Future of Privacy Forum (FPF) released Best Practices for Consumer Wearables and Wellness Apps and Devices, a detailed set of guidelines that responsible companies can follow to ensure they provide practical privacy protections for consumer-generated health and wellness data. The document was produced with support from the Robert Wood Johnson Foundation and incorporates input from a wide range of stakeholders including companies, advocates, and regulators.
“Fitness and wellness data from apps and wearables provide significant benefits for users, but it is essential that companies incorporate Fair Information Practice Principles to safeguard this data,” said Jules Polonetsky, FPF’s CEO.
“Overcoming privacy concerns associated with wearable technologies is necessary to ensure their equitable access and use by global populations,” said Derek Yach, Chief Health Officer & Gillian Christie, Health Innovation Analyst, Vitality. “The Future of Privacy Forum’s guidance on consumer wearables and wellness devices showcases these challenges and explicitly outlines best practices for companies engaged in designing and deploying these technologies.”
The Best Practices build on current legal protections and app platform guidelines by providing specific guidance to ensure consumer apps include appropriate privacy protections, as well as developing responsible guidelines for research and other secondary uses of consumer-generated wellness data. The U.S. Department of Health and Human Services (HHS) articulated significant gaps in regulating health information privacy and security in a report released last month. HHS recognized that while technological innovation has advanced at an extraordinary pace in recent years, privacy and security protections of health information have not kept up.[1] The Best Practices released today begin to build norms for such data by making recommendations for privacy practices that:
Provide consumers choices about the sharing and use of their data;
Support interoperability with global privacy frameworks and leading app platform standards; and
Elevate data norms around research, privacy, and security.
“Some data collected from wearables may be relatively trivial, but other data can be highly sensitive,” said Kelsey Finch, Policy Counsel, FPF. “These principles are tailored to provide appropriate protections calibrated to the nature and sensitivity of the data.”
In addition, a new FPF Mobile Apps Study underscores the necessity of strong Best Practices for health and wellness data. The App Study revealed that while the number of apps that provide privacy policies continues its upward trend from our previous surveys in 2011 and 2012, health and fitness apps – which may access sensitive, physiological data collected by sensors on a mobile phone, wearable, or other device – do worse than average at providing privacy policies. Only 70% of top health and fitness apps had a privacy policy (6% lower than overall top apps), and only 61% linked to it from the app platform listing page (10% lower than overall top apps).
The App Study also looked specifically at period tracking and sleep aid apps. Only 63% of period tracking apps provided a link to the privacy policy from the app platform listing page. More disappointingly, only 54% of sleep aid apps provided a link to the privacy policy from the app platform listing page.
“Even though a privacy policy is not the be all and end all for building consumer trust, there is no excuse for failing to provide one – doing so is the baseline standard,” said John Verdi, FPF’s Vice President of Policy. “App platforms have made it easier for developers to provide access to privacy policies. Consumers expect direct access to privacy policies, and users can review them before downloading an app.”
###
The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. FPF includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. Learn more by visiting www.fpf.org.
The Future of Privacy Forum (FPF) filed its report, Always On: Privacy Implications of Microphone-Enabled Devices, with the Federal Trade Commission (FTC) in response to the Commission’s request for public comments regarding the privacy implications of Smart TVs. On December 7, 2016, the FTC will be holding a Smart TV Workshop to explore the intricacies of tracking technologies and best practices for addressing consumer privacy on entertainment systems.
At FPF, we recognize that as the next generation of Smart TVs enters the market, there are increasing concerns about voice privacy and the role of speech recognition in home appliances. In our report, we examine these privacy questions and identify the emerging best practices of “always on” devices, including Smart TVs. Key questions include:
Whether data processing and storage occurs locally or externally (i.e., cloud-based);
Whether the device arrives with audio recording functionality pre-enabled;
Whether the device provides visual cues that indicate when it is transmitting information; and
Whether consumers are given the ability to access and delete stored audio files.
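The four questions above can be read as a simple device-evaluation checklist. The sketch below is purely illustrative (the class and field names are hypothetical, not drawn from the report or any real device API); it shows how each question maps to a concrete property a reviewer might check.

```python
from dataclasses import dataclass

# Hypothetical checklist mirroring the report's four key questions for
# "always on" devices. Field names are illustrative assumptions.
@dataclass
class AlwaysOnDeviceProfile:
    processes_audio_locally: bool       # local vs. external (cloud-based) processing
    recording_enabled_by_default: bool  # ships with audio recording pre-enabled?
    shows_transmission_indicator: bool  # visual cue while transmitting information
    users_can_delete_recordings: bool   # can users access/delete stored audio files?

def privacy_concerns(profile: AlwaysOnDeviceProfile) -> list:
    """Return the report's key questions that a device answers poorly."""
    concerns = []
    if not profile.processes_audio_locally:
        concerns.append("audio is processed externally (cloud-based)")
    if profile.recording_enabled_by_default:
        concerns.append("recording functionality is pre-enabled")
    if not profile.shows_transmission_indicator:
        concerns.append("no visual cue when transmitting information")
    if not profile.users_can_delete_recordings:
        concerns.append("users cannot access or delete stored audio")
    return concerns
```

A device that processes audio locally, ships with recording off, shows a transmission indicator, and lets users delete recordings would yield an empty concerns list; each missing safeguard adds one entry.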
Despite the fact that many TVs and other devices dubbed “always on” are not in fact “always listening,” microphones and voice data retain unique social and legal significance that should be taken into consideration in discussions of Smart TVs and privacy.
iOS 10 to Feature Stronger "Limit Ad Tracking" Control
The release of iOS 10, the newest version of Apple’s mobile operating system (coming this Fall), will bring an array of new features and upgrades, and a change to the functionality of the “Limit Ad Tracking” privacy setting.
The iPhone operating system allows app developers to target advertisements to app users by using a unique ID number called “Identifier for Advertising” (IDFA or IFA). This advertising identifier functions similarly to the way cookies are used by web browsers, allowing third parties to recognize a user over time and across different apps. In the iPhone’s Privacy Settings, the user can re-set that identifier at any time, and also has the option to select “Limit Ad Tracking.”
In iOS 9 and earlier versions, selecting “Limit Ad Tracking” meant that the OS would send a “flag” indicating that the user had enabled the feature. Upon collecting this flag, developers were required under the Apple Developer Terms not to use the identifier for “targeted” advertising. Most ad networks treated this flag as a user request to opt out of “behavioral advertising” or “interest-based advertising.” Some ad networks continued to target ads based on location or continued to use the ID to help enable cross-device tracking. Other companies treated the flag as a broader opt-out of any targeting and tracking. Apple specifically permitted companies to continue to use the ID for certain limited other uses when Limit Ad Tracking was enabled, including “frequency capping, attribution, conversion events, estimating the number of unique users, advertising fraud detection, and debugging.” (iOS Developer Library)
Beginning in iOS 10, when a user enables “Limit Ad Tracking,” the OS will send along the advertising identifier with a new value of “00000000-0000-0000-0000-000000000000.” This will more effectively ensure that users are no longer targeted or tracked by many ad networks across sites or over time. But it will also prevent the previously permitted uses of the ID: “frequency capping, attribution, conversion events, estimating the number of unique users, advertising fraud detection, and debugging.”
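The practical effect of the zeroed identifier can be sketched in a few lines. This is a hypothetical illustration of how an ad network's server-side code might interpret the IDFA it receives (the function name and sample identifier are assumptions, not Apple's API); under iOS 10, an all-zeros value is the opt-out signal itself, since the real identifier is never sent.

```python
# The all-zeros UUID that iOS 10 reports in place of the real IDFA
# when the user has enabled "Limit Ad Tracking".
ZEROED_IDFA = "00000000-0000-0000-0000-000000000000"

def tracking_allowed(idfa: str) -> bool:
    """Hypothetical ad-network check: a zeroed IDFA means the user has
    opted out, and the ID is useless for targeting, frequency capping,
    attribution, or other previously permitted uses."""
    return idfa != ZEROED_IDFA
```

Under iOS 9, by contrast, the real identifier arrived alongside a separate flag, and honoring the opt-out depended on the recipient checking that flag; zeroing the identifier removes that reliance on developer compliance.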
Roughly 17% of iPhone users now enable “Limit Ad Tracking,” down from earlier years. Some speculate this decline is due to users turning to ad blockers instead.
Apple previously barred tracking using other device identifiers, technically blocking access to most other fixed IDs and restricting use of others by policy. Apps still have access to a unique Vendor ID that they can use for internal purposes. And some apps that register or authenticate users may be able to continue to enable some kinds of tracking or targeting. But it is clear that Apple has turned the Limit Ad Tracking setting into a much more significant privacy option, for users that choose to use it.
FPF, Intel, and PrecisionHawk Release Drones and Privacy by Design: Embedding Privacy Enhancing Technology in Unmanned Aircraft
FOR IMMEDIATE RELEASE
August 2, 2016
Contact: Melanie Bates, Director of Communications, [email protected]
FUTURE OF PRIVACY FORUM, INTEL, AND PRECISIONHAWK RELEASE
DRONES AND PRIVACY BY DESIGN: EMBEDDING PRIVACY ENHANCING TECHNOLOGY IN UNMANNED AIRCRAFT
“FPF is proud to join with Intel and PrecisionHawk to release today’s report highlighting technologies and practices that help drone operators minimize collection and retention of personal data, obfuscate images of individuals collected from the air, and secure personally identifiable information,” said Jules Polonetsky, FPF’s CEO. “The widespread adoption of geo-fencing and other technologies is enabling drones to reduce privacy risks while they tackle important, often life-saving missions.”
Diana Marina Cooper, Senior Director of Legal and Policy Affairs for PrecisionHawk said, “We often hear about privacy concerns raised by drone technologies. This report shows how drone solutions can, and are, helping us solve real privacy concerns. I hope this report inspires developers of drone technologies to seek out their own ways of building privacy into their products.”
The report describes concrete examples of how drone manufacturers, operators, and others are employing Privacy-by-Design principles to help users respect privacy and promote trust in drone operations. Privacy-by-Design principles state that developers should ask questions about what data is collected, how it is used, with whom it is shared, how much of that data is retained, and how data is stored and protected. In doing so, they consider the benefits and risks of the use of data, and what steps can be taken to mitigate risk.
“The benefits drones promise are possible only if individuals trust that the technology will be used in ways that benefit them, their community and society” said Paula J. Bruening, Intel Senior Counsel. “They must also be confident that the information drones gather and process is protected and processed in ways that respect their privacy.”
Drones and Privacy by Design describes techniques that can be used to safeguard privacy and support responsible data practices, including practices described in the May 2016 stakeholder-drafted Voluntary Best Practices for UAS Privacy, Transparency, and Accountability. “The best practices were developed through hard-won consensus within the drone community,” said John Verdi, FPF’s Vice President of Policy. “They articulate important principles that should guide drone operators’ data practices, and today’s report describes practical safeguards that can help operators implement those principles.”
“Innovative commercial and government platforms and applications for UAS are helping to solve problems, save money, conserve critical resources, and even save lives,” said U.S. Chief Technology Officer Megan Smith. “The Administration will continue collaborating with public and private sector entities to further understand and explore safe and beneficial application of this emergent technology.”
###
The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. FPF includes an advisory board comprised of leading figures from industry, academia, law, and advocacy groups. Learn more about FPF’s work by visiting www.fpf.org.