Data and the Future of Mobility: September 14 in San Jose, CA

Join the Future of Privacy Forum for a roundtable: “Data and the Future of Mobility.”

Technology is transforming the safety and convenience of the vehicles in which we ride and drive. Along the way, Silicon Valley has become a major hub for auto manufacturers, technology companies, and other entities looking to innovate in the transportation space. Join us in San Jose for a roundtable discussion on data and the future of mobility.

Wednesday, September 14, 2016 from 2:00 PM to 5:00 PM (Pacific Time)

Speakers Include:

Jim Adler, Head of Data, Toyota Research Institute

Michelle Avary, VP Automotive Strategy & Products, Aeris

Jonah Houston, Transportation and Mobility Lead, IDEO

Elliot Katz, Global Co-Chair, Connected and Self-Driving Car Practice, DLA Piper

Beth Hill, General Counsel, Chief Compliance Officer and Head of Purchasing, FordDirect

This afternoon event will include several short talks followed by an open, moderated discussion and a reception.

If you work in the connected car ecosystem on topics related to automotive privacy, contact Lauren Smith at [email protected] to request an invitation.

Future of Privacy Forum Releases Best Practices for Consumer Wearables and Wellness Apps and Devices

FOR IMMEDIATE RELEASE

August 17, 2016

Contact: Melanie Bates, Director of Communications, [email protected]

FUTURE OF PRIVACY FORUM RELEASES BEST PRACTICES FOR

CONSUMER WEARABLES AND WELLNESS APPS AND DEVICES

Washington, DC – Today, the Future of Privacy Forum (FPF) released Best Practices for Consumer Wearables and Wellness Apps and Devices, a detailed set of guidelines that responsible companies can follow to ensure they provide practical privacy protections for consumer-generated health and wellness data. The document was produced with support from the Robert Wood Johnson Foundation and incorporates input from a wide range of stakeholders including companies, advocates, and regulators.

“Fitness and wellness data from apps and wearables provide significant benefits for users, but it is essential that companies incorporate Fair Information Practice Principles to safeguard this data,” said Jules Polonetsky, FPF’s CEO.

“Overcoming privacy concerns associated with wearable technologies is necessary to ensure their equitable access and use by global populations,” said Derek Yach, Chief Health Officer, and Gillian Christie, Health Innovation Analyst, at Vitality. “The Future of Privacy Forum’s guidance on consumer wearables and wellness devices showcases these challenges and explicitly outlines best practices for companies engaged in designing and deploying these technologies.”

The Best Practices build on current legal protections and app platform guidelines by providing specific guidance to ensure consumer apps include appropriate privacy protections, and by developing responsible guidelines for research and other secondary uses of consumer-generated wellness data. In a report released last month, the U.S. Department of Health and Human Services (HHS) identified significant gaps in the regulation of health information privacy and security, recognizing that while technological innovation has advanced at an extraordinary pace in recent years, privacy and security protections for health information have not kept up.[1] The Best Practices released today begin to build norms for such data by recommending appropriate privacy practices.

“Some data collected from wearables may be relatively trivial, but other data can be highly sensitive,” said Kelsey Finch, Policy Counsel, FPF. “These principles are tailored to provide appropriate protections calibrated to the nature and sensitivity of the data.”

In addition, a new FPF Mobile Apps Study underscores the necessity of strong Best Practices for health and wellness data. The App Study revealed that while the number of apps that provide privacy policies continues its upward trend from our previous surveys in 2011 and 2012, health and fitness apps – which may access sensitive, physiological data collected by sensors on a mobile phone, wearable, or other device – do worse than average at providing privacy policies. Only 70% of top health and fitness apps had a privacy policy (six percentage points lower than top apps overall), and only 61% linked to it from the app platform listing page (ten percentage points lower than top apps overall).

The App Study also looked specifically at period tracking and sleep aid apps. Only 63% of period tracking apps provided a link to the privacy policy from the app platform listing page; sleep aid apps fared even worse, at just 54%.

“Even though a privacy policy is not the be-all and end-all for building consumer trust, there is no excuse for failing to provide one – a policy is the baseline standard,” said John Verdi, FPF’s Vice President of Policy. “App platforms have made it easier for developers to provide access to privacy policies, and consumers expect direct access so they can review a policy before downloading an app.”

###

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. FPF includes an advisory board composed of leading figures from industry, academia, law, and advocacy groups. Learn more by visiting www.fpf.org.

[1] U.S. Department of Health and Human Services, Examining Oversight of the Privacy & Security of Health Data Collected by Entities Not Regulated by HIPAA, available at https://www.healthit.gov/sites/default/files/non-covered_entities_report_june_17_2016.pdf (July 19, 2016).

Examining the Privacy Implications of Smart TVs

The Future of Privacy Forum (FPF) filed its report, Always On: Privacy Implications of Microphone-Enabled Devices, with the Federal Trade Commission (FTC) in response to the Commission’s request for public comments regarding the privacy implications of Smart TVs. On December 7, 2016, the FTC will be holding a Smart TV Workshop to explore the intricacies of tracking technologies and best practices for addressing consumer privacy on entertainment systems.

At FPF, we recognize that as the next generation of Smart TVs enters the market, there are increasing concerns about voice privacy and the role of speech recognition in home appliances. In our report, we examine these privacy questions and identify the emerging best practices for “always on” devices, including Smart TVs.

Although many TVs and other devices dubbed “always on” are not in fact “always listening,” microphones and voice data carry unique social and legal significance that should be taken into account in any discussion of Smart TVs and privacy.

iOS 10 to Feature Stronger "Limit Ad Tracking" Control

The release of iOS 10, the newest version of Apple’s mobile operating system (coming this Fall), will bring an array of new features and upgrades – including a change to the functionality of the “Limit Ad Tracking” privacy setting.

The iPhone operating system allows app developers to target advertisements to app users by using a unique ID number called the “Identifier for Advertisers” (IDFA or IFA). This advertising identifier functions much like the cookies used by web browsers, allowing third parties to recognize a user over time and across different apps. In the iPhone’s Privacy Settings, the user can reset that identifier at any time, and also has the option to select “Limit Ad Tracking.”
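To make the mechanism concrete, here is a minimal Swift sketch (our illustration, not taken from Apple’s documentation) of how an app reads the IDFA through the AdSupport framework:

```swift
import AdSupport

// Read the advertising identifier (IDFA). The same value is visible to
// every app on the device, which is what lets ad networks recognize a
// user across apps, much as cookies do in a web browser.
let idfa = ASIdentifierManager.shared().advertisingIdentifier
print("IDFA: \(idfa.uuidString)")
```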

[Screenshots: iPhone Privacy Settings showing the “Limit Ad Tracking” option]

In iOS 9 and previous versions, selecting “Limit Ad Tracking” meant that the OS would send a “flag” indicating that the user had enabled the feature. Upon collecting this flag, developers were required under the Apple Developer Terms not to use the identifier for “targeted” advertising. Most ad networks treated this flag as a user request to opt out of “behavioral advertising” or “interest-based advertising.” Some ad networks continued to target ads based on location, or continued to use the ID to help enable cross-device tracking, while other companies treated the flag as a broader opt-out of any targeting and tracking. Apple specifically permitted companies to continue to use the ID for certain limited other purposes when Limit Ad Tracking was enabled, including “frequency capping, attribution, conversion events, estimating the number of unique users, advertising fraud detection, and debugging.” (iOS Developer Library)

Beginning in iOS 10, when a user enables “Limit Ad Tracking,” the OS will send along the advertising identifier with a new value of “00000000-0000-0000-0000-000000000000.” This will more effectively ensure that users are no longer targeted or tracked by many ad networks across sites or over time. But it will also prevent the previously permitted “frequency capping, attribution, conversion events, estimating the number of unique users, advertising fraud detection, and debugging” uses of this ID.
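As a rough sketch of what changed, an ad SDK that previously keyed off the opt-out flag can now simply compare the identifier against the zeroed value (a hypothetical check written for illustration):

```swift
import AdSupport

let manager = ASIdentifierManager.shared()
let zeroedIDFA = "00000000-0000-0000-0000-000000000000"

if !manager.isAdvertisingTrackingEnabled {
    // The user enabled Limit Ad Tracking. On iOS 9 and earlier the real
    // IDFA was still returned alongside this flag; beginning in iOS 10
    // the identifier itself is zeroed, so even the previously permitted
    // uses (frequency capping, attribution, fraud detection) lose their key.
    let idfa = manager.advertisingIdentifier.uuidString
    print(idfa == zeroedIDFA ? "IDFA is zeroed (iOS 10+)" : "IDFA still exposed: \(idfa)")
}
```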

Roughly 17% of iPhone users currently enable “Limit Ad Tracking,” a figure that has declined from earlier years; some speculate the drop is due to users turning to ad blockers instead.

Apple previously barred tracking via other device identifiers, technically blocking access to most fixed IDs and restricting use of the rest by policy. Apps still have access to a unique Vendor ID that they can use for internal purposes, and some apps that register or authenticate users may be able to continue to enable certain kinds of tracking or targeting. But it is clear that Apple has turned Limit Ad Tracking into a much more significant privacy option for users who choose to enable it.
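For comparison, a short sketch of reading the Vendor ID mentioned above; unlike the IDFA, it is scoped to apps from a single developer and is unaffected by the Limit Ad Tracking setting:

```swift
import UIKit

// The Vendor ID (IDFV) is shared only among apps from the same vendor,
// so it cannot be used to recognize a user across different companies' apps.
if let idfv = UIDevice.current.identifierForVendor?.uuidString {
    print("Vendor ID: \(idfv)")
}
```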

FPF, Intel, and PrecisionHawk Release Drones and Privacy by Design: Embedding Privacy Enhancing Technology in Unmanned Aircraft

FOR IMMEDIATE RELEASE              

August 2, 2016

Contact: Melanie Bates, Director of Communications, [email protected]

 

FUTURE OF PRIVACY FORUM, INTEL, AND PRECISIONHAWK RELEASE

REPORT ON DRONES AND PRIVACY BY DESIGN

Washington, DC – Today, in response to the Administration’s call to action on privacy protections related to drone operations, the Future of Privacy Forum (FPF), Intel, and PrecisionHawk released Drones and Privacy by Design: Embedding Privacy Enhancing Technology in Unmanned Aircraft. The report highlights examples of privacy-enhancing technologies and “Privacy-by-Design” applied to drones.

“FPF is proud to join with Intel and PrecisionHawk to release today’s report highlighting technologies and practices that help drone operators minimize collection and retention of personal data, obfuscate images of individuals collected from the air, and secure personally identifiable information,” said Jules Polonetsky, FPF’s CEO.  “The widespread adoption of geo-fencing and other technologies is enabling drones to reduce privacy risks while they tackle important, often life-saving missions.”

Diana Marina Cooper, Senior Director of Legal and Policy Affairs for PrecisionHawk, said, “We often hear about privacy concerns raised by drone technologies. This report shows how drone solutions can, and do, help us solve real privacy concerns. I hope this report inspires developers of drone technologies to seek out their own ways of building privacy into their products.”

The report describes concrete examples of how drone manufacturers, operators, and others are employing Privacy-by-Design principles to help users respect privacy and promote trust in drone operations. Privacy-by-Design principles state that developers should ask questions about what data is collected, how it is used, with whom it is shared, how much of that data is retained, and how data is stored and protected.  In doing so, they consider the benefits and risks of the use of data, and what steps can be taken to mitigate risk.

“The benefits drones promise are possible only if individuals trust that the technology will be used in ways that benefit them, their community, and society,” said Paula J. Bruening, Intel Senior Counsel. “They must also be confident that the information drones gather and process is protected and processed in ways that respect their privacy.”

Drones and Privacy by Design describes techniques that can be used to safeguard privacy and support responsible data practices, including practices described in the May 2016 stakeholder-drafted Voluntary Best Practices for UAS Privacy, Transparency, and Accountability.  “The best practices were developed through hard-won consensus within the drone community,” said John Verdi, FPF’s Vice President of Policy. “They articulate important principles that should guide drone operators’ data practices, and today’s report describes practical safeguards that can help operators implement those principles.”

FPF released Drones and Privacy by Design in conjunction with the White House’s first-ever Workshop on Drones and the Future of Aviation.

“Innovative commercial and government platforms and applications for UAS are helping to solve problems, save money, conserve critical resources, and even save lives,” said U.S. Chief Technology Officer Megan Smith. “The Administration will continue collaborating with public and private sector entities to further understand and explore safe and beneficial application of this emergent technology.”


###

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. FPF includes an advisory board composed of leading figures from industry, academia, law, and advocacy groups. Learn more about FPF’s work by visiting www.fpf.org.

Privacy Shield Starts, Now What About that Safe Harbor Statement in Your Policy?

As of today, companies can self-certify as members of the EU-US Privacy Shield. It may also be a good day to review the Safe Harbor language many companies have retained in their privacy policies.

Many companies have retained Safe Harbor language in their policies even after adopting model contracts, and for a number of logical reasons: they still follow the rules, previously collected data is covered under the Safe Harbor commitment, temporarily changing the policy could be confusing, and contractual obligations to be in Safe Harbor continue. Keeping the Safe Harbor language has been the advice of many outside counsel.

But some EU regulators may not agree. The CNIL’s claim last week against Microsoft follows a similar claim against Facebook regarding Safe Harbor language in its privacy policy. In both cases, the CNIL pointed to the Safe Harbor language in the privacy policy to claim that the company had no legal basis to transfer data to the US.

In each case, the companies had model contracts in place. The Microsoft policy even specifically explains that the company uses “a variety of legal mechanisms, including contracts, to help ensure your rights and protections travel with your data.”

The CNIL charge is difficult to understand. But given the CNIL’s position, perhaps companies should add language to their policies explaining why the Safe Harbor text remains and expressly noting that it is not the current legal basis for data transfers. Perhaps removing the language altogether is a better idea. Or perhaps the CNIL and other Data Protection Authorities will raise this issue only with big companies, as an aside to other issues they are contesting.

Read the CNIL Complaint

Read Microsoft’s Privacy Policy