With much media attention focused on new Apple hardware, including new iPhones, Apple also released updated versions of its mobile and desktop operating systems for public download this week. The software upgrades (iOS 12 for iPhones, and macOS 10.14 Mojave for desktop Macs) bring many new features, such as Group FaceTime, options to customize notifications, and aesthetic changes such as an optional desktop “Dark Mode.”
Amidst these upgrades, what’s new for data privacy? Consumers are increasingly aware of privacy issues, and Apple has articulated the company’s commitment to privacy “as a human right.” Meanwhile, regulators are entering the consumer privacy debate, with this week’s Senate hearing bringing further attention to the data practices and policy positions of leading technology companies.
In its fall updates, Apple improves several existing privacy controls for iPhone users, and macOS 10.14 brings privacy-focused technical modifications. Several of these updates were first announced at Apple’s June 2018 Worldwide Developer Conference (WWDC), which we discussed (along with major updates to Google’s Android P) earlier this summer, and have now been released to the public.
Below, we provide round-ups of privacy updates in iOS 12, macOS 10.14 (Mojave), and the App Store Review Guidelines.
Privacy Updates in iOS 12
The following privacy updates are included in iOS 12, which can be downloaded on devices as old as the iPhone 5s and iPad Air.
USB Restricted Mode: In July 2018, Apple released iOS 11.4.1 and introduced USB Restricted Mode. This feature requires iPhone users to input their passcode to unlock the phone when connecting it to a USB accessory if the phone has been locked for an hour or more. iOS 12 implements additional USB restrictions, including disabling USB connections immediately after the device locks if more than three days have passed since the last USB connection. This increases protection for users who don’t often make such connections. Overall, USB Restricted Mode makes it much more difficult for an unauthorized person or entity, such as a stalker or phone thief, to unlock a user’s phone without permission.
On-Device Machine Learning for Siri Suggestions: A new feature called Siri Suggestions uses machine learning to decide what apps and shortcuts to surface as a banner on the iPhone home screen. Siri Suggestions will be based on users’ patterns from signals like location, time of day and type of motion (e.g., walking, running, or driving). iOS 12 analyzes this data locally on the device rather than on remote servers, providing users with personalized experiences while limiting access to the underlying information.
Privacy Updates in macOS 10.14 (Mojave)
iOS-Style Permissions for Desktop Apps: macOS apps will now be required to request the user’s permission to access certain device sensors, such as the camera or microphone. These permissions have long been standard on iOS and other mobile operating systems.
Intelligent Tracking Prevention 2.0: Building on a feature introduced last year in the Safari browser, Apple is introducing Intelligent Tracking Prevention 2.0 (ITP 2.0) as a default feature for Safari in macOS Mojave. Earlier versions of ITP used machine learning to prevent websites from placing cookies that were identified as having “tracking abilities” after a 24-hour window. ITP 2.0 expands on this feature by immediately partitioning such third-party cookies. As a result, Safari will now prevent most website tracking from social media “Share” and “Like” buttons and other embedded content, unless the user consents to the data collection in a browser-prompted notification.
Obfuscation of Device Fingerprints: Safari in macOS Mojave will contain updates designed to prevent device fingerprinting. As FPF described in a 2015 report on cross-device tracking, devices and browsers can be identified with a degree of probability through metadata sent in web traffic – such as the system fonts, screen size, installed plug-ins, etc. This kind of digital “fingerprinting,” often referred to as server-side recognition, is often used for short-term advertising attribution and measurement. In Mojave, the Safari web browser will present websites with a “simplified system configuration,” in order to make many users’ “fingerprints” appear identical or very similar – reducing the efficacy of server-side recognition technologies.
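The core idea behind fingerprinting can be sketched in a few lines of Python. This is an illustration of the general technique only, not Apple’s or any browser’s actual implementation, and the attribute names and values are hypothetical: a “fingerprint” is simply a stable hash over the configuration details a browser exposes, so reporting a simplified, common configuration makes many users’ hashes collide.

```python
import hashlib

def fingerprint(attrs: dict) -> str:
    """Hash a canonical, sorted rendering of configuration attributes into a stable ID."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attrs.items()))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Two users with distinctive configurations produce distinct fingerprints.
alice = fingerprint({"fonts": "Arial,Helvetica,Zapfino", "screen": "2880x1800", "plugins": "pdf,quicktime"})
bob = fingerprint({"fonts": "Arial,Helvetica", "screen": "1920x1080", "plugins": "pdf"})
assert alice != bob

# If the browser reports only a simplified, shared configuration, both users
# hash to the same value and can no longer be distinguished by this method.
simplified = {"fonts": "Arial,Helvetica", "screen": "default", "plugins": "none"}
assert fingerprint(simplified) == fingerprint(simplified)
```

This is why presenting websites with a “simplified system configuration” blunts server-side recognition: the hash stays stable for an individual, but it is no longer distinctive across the population.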
Privacy Updates for Developers (App Store Review Guidelines)
In addition to technical updates to its operating systems, Apple has also made significant changes to its App Store Review Guidelines, the rules for how developers may collect and use personal information from users. These Guidelines, which apply to the third-party developers who provide apps through the App Store, can be very influential when paired with robust oversight. In May, Apple began removing apps from the App Store for violations of policies against sharing location data with third-party advertisers without users’ consent. In August, Apple removed apps from the App Store that violated its policies against collecting data to build user profiles or contact databases.
Updates to the Guidelines include:
Developers are not permitted to create databases from users’ address book information (contact lists and photos). (5.1.2)
Developers must clearly describe new features and changes in the “What’s New” section of the App Store. (2.3.12)
Developers must request explicit user consent and provide a “clear visual indication when recording, logging, or otherwise making a record of user activity.” (2.5.14)
Developers must allow users to view, without leaving the app, all information used to target them with an ad. (3.1.3(b))
All apps must include a link to their privacy policy in the App Store Connect metadata field and within the app. (5.1.1)
Developers must include a mechanism to revoke social network credentials and disable data access between the app and social network from within the app. (5.1.1)
Developers must respect the user’s permission settings and not attempt to manipulate, trick, or force people to consent to unnecessary data access. (5.1.1)
New language in the Developer Code of Conduct states:
“Customer trust is the cornerstone of the App Store’s success. Apps should never prey on users or attempt to rip-off customers, trick them into making unwanted purchases, force them to share unnecessary data, raise prices in a tricky manner, charge for features or content that are not delivered, or engage in any other manipulative practices within or outside of the app.” (5.5)
Summary
Amidst new hardware and design features, Apple has introduced important technical updates to iOS 12 and macOS that aim to empower users to better manage their information. Developers should also take note of significant changes to the App Store Review Guidelines, which help determine the ways in which apps and app partners can collect and handle user data. These changes help provide users with the ability to make more informed decisions that reflect their privacy preferences.
FPF Releases Understanding Facial Detection, Characterization, and Recognition Technologies and Privacy Principles for Facial Recognition Technology in Commercial Applications
These resources will help businesses and policymakers better understand and evaluate the growing use of face-based biometric technology systems when used for consumer applications. Facial recognition technology can help users organize and label photos, improve online services for visually impaired users, and help stores and stadiums better serve customers. At the same time, the technology often involves the collection and use of sensitive biometric data, requiring careful assessment of the data protection issues raised. Understanding the technology and building trust are necessary to maximize the benefits and minimize the risks.
These Principles define a benchmark of privacy requirements for those commercial situations where technology collects, creates, and maintains a facial template that can be used to identify a specific person – enabling the beneficial applications and services, while providing the necessary protections for individuals.
Government use of facial recognition technology has drawn a great deal of recent attention: in border control, in law enforcement, and in combatting terrorism. Others have published extensive reports on these topics. Some companies have called for regulation of how governments use these technologies, and others have refused to license their facial recognition systems to the government until racial disparities and other accuracy challenges are overcome. Many have criticized the US government’s failure to ensure the fairness of its own biometric systems and protested law enforcement agencies that seek exemptions from long-standing accuracy requirements.
FPF agrees that we need public discussion and thoughtful regulatory action regarding government use of facial recognition technologies. These important issues deserve careful consideration and action in the context and history of government surveillance, algorithmic fairness, and equity; they are beyond the scope of our Privacy Principles for commercial use. We do not make recommendations regarding these issues in the publications we release today, but look forward to participating in the important policy discussions with civil society and government that are necessary.
The consumer-facing applications of facial recognition technology continue to evolve, and the technology will certainly be used in new ways in the future. FPF’s Privacy Principles for Facial Recognition Technology in Consumer Applications includes seven core privacy principles that address the concerns surrounding personally identifiable information (PII) (templates of individual faces) collected by these systems. We expect these Principles will be used by companies as a resource for the development, refinement, and implementation of facial recognition technology in commercial settings.
The Privacy Principles include:
Consent
Use – Respect for Context
Transparency
Data Security
Privacy by Design
Integrity and Access
Accountability
As retailers and private companies increasingly employ various levels of facial scanning technology online and in person, and as applications for photo organizing, tagging, and sharing grow, it is time to push for greater consensus on appropriate privacy standards for commercial use cases and on the protections consumers should reasonably expect.
Equally relevant is the need to expand stakeholders’ awareness and understanding of the many types of facial scanning systems, as well as the impact of accuracy differences among the many systems available today. It is important to understand the distinctions between facial detection systems (which, when properly designed, neither create nor implicate any personally identifiable information) and full-scale facial identification programs (which match a person’s image to a database in order to identify the individual to a store clerk or stadium employee who otherwise wouldn’t recognize them).
FPF’s graphic Understanding Facial Detection, Characterization, and Recognition Technologies summarizes the key distinctions between facial scanning technologies for easy reference. Relating each technology to its common use cases, benefits, concerns, and risk of identifiability, we have also outlined the minimum recommended notice and consent requirements and the Operator’s responsibilities.
In the case of facial characterization, for example, no PII is typically retained. However, characterization technologies can inform businesses whether individuals are smiling or frowning, male or female, and old or young. That means people may be treated differently – men and women may see different ads displayed on a sign, or there may be customized offers for parents or older individuals – in which case it’s important to understand that the system has been tailored to identify characteristics but not a unique identity. In these cases, notice is the key requirement, as well as attention to ensure there is no discriminatory activity. But in the examples that do involve creation of facial templates and identification, the guidance calls for increasing levels of consent.
We are pleased to announce the launch of our Privacy Book Club! The FPF Privacy Book Club will provide members with the opportunity to read a wide range of books — on privacy, data, ethics, and other data-related issues, including academic works — and have an open discussion of the selected literature.
The FPF Privacy Book Club will be held on the last Wednesday of each month. A virtual conference dial-in, including video chat, a phone line, and online chat, will be sent to book club members. You can join the Privacy Book Club by registering here. Please feel free to share the sign-up link with friends and colleagues who may be interested in participating.
The first FPF Privacy Book Club meeting will be held Wednesday, September 26, 2018, at 2:00 pm ET. We are excited to share that FPF Advisory Board member and author, Professor Woodrow Hartzog, will be joining the discussion to introduce his book, Privacy’s Blueprint: The Battle to Control the Design of New Technologies, and to answer a few questions. After hearing from Woody, we will host an open discussion of the book for the remainder of the meeting.
To learn more about FPF’s Privacy Book Club or to provide suggestions for future readings, please contact Michelle Bae, FPF Berkower Memorial Fellow, at [email protected].
Video
If privacy principles are from Venus, then engineering rules are from Mars
FPF Advisory Board member Alisa Bergman, Vice President and Chief Privacy Officer at Adobe Systems, recently wrote an article in the IAPP Tech Privacy Advisor that we think is very useful. The article grew out of a presentation Bergman gave to Adobe engineers (the key slide from the presentation is above).
For those participating in FPF’s Privacy War Games, the article is particularly useful as it provides a practical look at how to operationalize privacy. Bergman is a member of the Advisory Committee for the PWG. To learn more about the event and to register, please visit www.privacywargames.com.
Bergman writes:
Whether you’re a privacy professional or a software engineer, you likely have many stories about the opportunities, challenges and spirited debates within your organization, particularly in the recent run-up to EU General Data Protection Regulation. Privacy and engineering teams ultimately share a common goal: to create great customer and user experiences. Getting there, however, can feel like the other team comes from a different planet. To engineers, privacy principles advanced by privacy professionals may appear to come from Venus, while to privacy pros, engineering rules may appear to come from Mars.
September 24th in Detroit, MI: Data and Privacy for Autonomous Vehicles
Join the Future of Privacy Forum and PwC for a roundtable: “Data and Privacy for Autonomous Vehicles”
Autonomous vehicles are positioned to transform the future of mobility—a change enabled by new on-board sensors that collect and transmit growing types and quantities of data. While the existence of data in vehicles is not entirely new, autonomous vehicles promise an explosion in the variety, connectivity, and volume of such data—raising new and unique considerations around what happens with it. As the automotive industry becomes more data-driven, getting consumer privacy right will become increasingly important.
Monday, September 24, 2018 from 1:30 PM to 4:30 PM in Detroit, Michigan
Industry leaders will discuss:
Mapping, Geolocation, and Video Data: What unique considerations may need to be applied to these compilations of high resolution and potentially sensitive data—of both the inside and outside of vehicles?
Monitoring of Vehicles, Drivers, and Passengers: What are best practices for fleet monitoring to ensure safety and cleanliness of vehicles while preserving passenger privacy?
Data Sharing: Given that many consider data sharing crucial to AV development, what considerations should be taken into account around how to do so while protecting privacy – among industry, with municipalities, and with researchers?
Law Enforcement Access: How do we balance access during exigent circumstances and through legal process with enhanced sensitivities around mobility data?
Changing Expectations of Privacy: Do consumers’ expectations of privacy diminish in the post-ownership, shared vehicle model? How should we think about privacy considerations for members of the public whose movements may be tracked with the externally-facing videos or mapping that may be common with AVs?
These are all early-stage questions, so we are convening industry leaders for an off-the-record discussion on how to tackle these issues moving forward.
Please contact Lauren Smith at [email protected] to request an invitation.
FPF Launches AI and Machine Learning Working Group and Releases New AI Resource Guides
The opportunities created by machine learning range across every sector of the economy and will impact every area of daily life. Better health care, safer transportation, and greater efficiencies in manufacturing, retail, and online services are already on the horizon. But concerns about the privacy impact, ethical consequences, and real-world harms of the technology are serious and may lead to dangers large and small if not addressed.
FPF has convened a leading group of its members to consider priority areas where technologies and companies can address ML privacy and ethics concerns. Our AI and Machine Learning Working Group, composed of FPF member companies with an interest in AI and machine learning privacy and data management challenges, meets monthly to discuss new developments, hear from experts on topics such as AI in the EU under the GDPR and the occurrence of and defenses against bias, and cover other timely issues. A subset of members is currently developing a set of Best Practices for internal governance procedures for companies implementing products or services that rely on machine learning technologies.
We recently released, in partnership with Immuta, a framework for practitioners to manage risk in artificial intelligence and machine learning (ML) models. The joint whitepaper, Beyond Explainability: A Practical Guide to Managing Risk in Machine Learning Models, provides business executives, data scientists, and compliance professionals with a strategic guide for governing the legal, privacy, and ethical risks associated with this technology.
Today we announce the launch of our Resource Guides, which collect current and leading news, academic publications, and multimedia training materials, providing in-depth sources for technical, educational, and policy-focused perspectives.
In October, we will be holding one of the official side events at the 40th International Conference of Data Protection and Privacy Commissioners in Brussels, providing an introduction to how AI and ML work, targeted to privacy professionals and regulators, with an emphasis on the unique privacy questions or impacts raised by these technologies.
To learn more about FPF’s AI and ML Working Group, and membership opportunities, contact Brenda Leong, [email protected].
John Verdi Appears on Comcast Newsmakers
FPF’s Vice President of Policy, John Verdi, joined Sheila Hyland on Comcast Newsmakers to talk about the importance of data privacy laws that aim to empower consumers, including new legislation in California. You can watch the full interview below.
Video
FPF Welcomes Inaugural Elise Berkower Memorial Fellow
The Fellowship
Earlier this year, FPF launched a new fellowship in memory of Elise Berkower. Elise was a senior privacy executive at global measurement and data analytics company Nielsen for nearly a decade and was a valued, longtime member of the FPF Advisory Board. FPF gratefully acknowledges the Berkower Family, the Nielsen Foundation, and the IAPP as founding sponsors of the Elise Berkower Memorial Fellowship.
The Elise Berkower Memorial Fellowship is a one-year, public interest position for recent law school graduates committed to the advancement of responsible data practices. Candidates are selected based on both academic qualifications and a commitment to the personal qualities exemplified by Elise – collaboration with co-workers, peers, and the broader privacy community, and a commitment to ethical conduct.
Michelle Bae
We are delighted to welcome the inaugural Elise Berkower Memorial Fellow, Michelle Bae!
Michelle will work on consumer and commercial privacy issues, from technology-specific areas such as advertising practices, drones, wearables, connected cars, and student privacy, to general data management and privacy issues related to ethics, de-identification, algorithms, and the Internet of Things. Focusing on the advancement of responsible data practices, she will be developing industry best practices and standards and filing comments on proposed regulatory actions.
During law school, Michelle worked as a data protection and privacy law intern at Ladas & Parry LLP and a legal intern for the Honorable Judge Timothy S. Driscoll at the Nassau County Commercial Division and the New York State Attorney General’s Office. Prior to law school, Michelle worked in the corporate governance team at Citigroup where she managed board agenda materials with respect to internal data control and developed an online platform.
Michelle earned her J.D. from University of Illinois College of Law and her B.A. in Law from Yonsei University.
Please join us in welcoming Michelle to the team!
Donate
FPF is inviting supporters to help us fully fund the Elise Berkower Memorial Fellowship. Donations of all sizes are welcome. Thank you for your support!
Privacy Best Practices for Consumer Genetic Testing Services
The Future of Privacy Forum, along with leading consumer genetic and personal genomic testing companies 23andMe, Ancestry, Helix, MyHeritage, and Habit, released Privacy Best Practices for Consumer Genetic Testing Services. These companies have been joined by African Ancestry, FamilyTreeDNA,* and Living DNA in supporting the Best Practices as a clear articulation of how leading firms can build trust with consumers.
Consumer genetic tests, tests that are marketed to consumers by private companies, have empowered consumers to learn more about their biology and take a proactive role in their health, wellness, ancestry, and lifestyle. When consumers expressly grant permission and provide an informed consent, they can choose to share their genetic data with responsible researchers to help them discover important breakthroughs in biomedical research, healthcare, and personalized medicine.
The Best Practices establish standards for genetic data generated in the consumer context by making recommendations for companies’ privacy practices that require:
Detailed transparency about how Genetic Data is collected, used, shared, and retained, including a high-level summary of key privacy protections posted publicly and made easily accessible to consumers;
Separate express consent for transfer of Genetic Data to third parties and for incompatible secondary uses;
Educational resources about the basics, risks, benefits, and limitations of genetic and personal genomic testing;
Access, correction, and deletion rights;
Valid legal process for the disclosure of Genetic Data to law enforcement and transparency reporting on at least an annual basis;
Ban on sharing Genetic Data with third parties (such as employers, insurance companies, educational institutions, and government agencies) without consent or as required by law;
Restrictions on marketing based on Genetic Data; and
Strong data security protections and privacy by design, among others.
Supporters of the Best Practices include: Ancestry, 23andMe, Helix, MyHeritage, Habit, African Ancestry, FamilyTreeDNA,* and Living DNA.
The Association for Molecular Pathology (AMP), the premier global molecular diagnostic professional society, recently revised its official position on consumer genomic testing. Among other conditions, AMP will support consumer genetic testing if test providers adhere to FPF’s Best Practices: Announcement and Position Statement.
*In January 2019, Family Tree DNA revealed an agreement with the FBI that conflicts with FPF’s Best Practices. FPF immediately removed Family Tree DNA as a supporter.
Future of Privacy Forum and Leading Genetic Testing Companies Announce
Best Practices to Protect Privacy of Consumer Genetic Data
23andMe, Ancestry, Helix, and other leading consumer genetic and personal genomic testing companies back strong protections, including express consent, transparency reports, and strong security requirements
Washington, DC – Today, Future of Privacy Forum, along with leading consumer genetic and personal genomic testing companies 23andMe, Ancestry, Helix, MyHeritage, and Habit, released Privacy Best Practices for Consumer Genetic Testing Services. The Best Practices provide a policy framework for the collection, protection, sharing, and use of Genetic Data generated by consumer genetic testing services. These services are commonly offered to consumers for testing and interpretation related to ancestry, health, nutrition, wellness, genetic relatedness, lifestyle, and other purposes.
“Supporting strong and transparent industry-wide guidelines that provide people with confidence that companies in this growing field will protect their privacy is critical to the continued success of this nascent business sector,” said Jules Polonetsky, CEO, FPF. “That is why we have been working with the industry leaders for the past year to develop privacy and data principles that we and our peers in the personal genomics industry can embrace. We believe that these best practices are essential to engendering trust so that all people can safely access their genetic information.”
Consumer genetic tests, tests that are marketed to consumers by private companies, have empowered consumers to learn more about their biology and take a proactive role in their health, wellness, ancestry, and lifestyle. When consumers expressly grant permission and provide an informed consent, they can choose to share their genetic data with responsible researchers to help support a better understanding of the role of genetic variation in our ancestry, health, well-being, and much more.
“Protecting our customers’ privacy is Ancestry’s highest priority,” said Eric Heath, Chief Privacy Officer, Ancestry. “As a leader in the direct to consumer DNA testing market, Ancestry recognizes the important role that our industry can play in protecting the privacy and data of all customers. We understand the sensitive nature of the information our industry handles and our responsibility as stewards. We are grateful for the Future of Privacy Forum’s leadership in working to get these Best Practices drafted, vetted and aligned, and look forward to seeing these Best Practices broadly adopted across the industry.”
Today, more consumer genetic testing services are available than ever before, prices for testing are becoming increasingly affordable, and the speed at which testing is completed is accelerating. As the industry continues to expand and the technology becomes more accessible, it is vital that the industry acknowledge and address the risks posed to individual privacy when Genetic Data is generated in the consumer context.
“We’re seeing such a rapid progression of the industry, owing to both the advances in technology and the increasing accessibility of genomic information for personal and research use,” said Elissa Levin, Head of Policy and Clinical Affairs, Helix. “We think it’s essential to take a leadership position to continue to grow the industry responsibly, in ways that keep consumer safety at the forefront of action, and pave the way for better experiences and learnings that ultimately help people lead better lives.”
“Everyone who participates in a genetic testing service deserves to have their information protected, no matter which service or product they use. It’s imperative that all consumer genetic testing companies adhere to comprehensive privacy protections, and clearly communicate their policies to consumers in a transparent manner,” said Kate Black, Global Privacy Officer, 23andMe. “With over a decade of experience as a leader in consumer genetic testing, we’ve built incredibly strong privacy practices. We are happy to now work with the industry and an organization like the FPF to solidify best practices, and help ensure proper protection of consumers’ genetic information more broadly.”
The Best Practices are also supported by other consumer genetic testing companies including African Ancestry and FamilyTreeDNA.*
The Best Practices establish standards for genetic data generated in the consumer context; the full set of recommendations is summarized in the list above.
“The Best Practices recognize that Genetic Data is sensitive information that warrants a high standard of privacy protection,” said Carson Martinez, Policy Fellow, FPF. “Genetic Data may be used to identify predispositions and potential risk for future medical conditions; may reveal information about the individual’s family members, including future children; may contain unexpected information or information of which the full impact may not be understood at the time of collection; and may have cultural significance for groups or individuals. It is therefore critical that the appropriate level of privacy protections is implemented.”
In producing the Best Practices, FPF and privacy leaders at the companies incorporated input from the FTC, a wide variety of genetics experts, and privacy and consumer advocates.
To request comment from FPF or the leading consumer genetic testing companies that were involved with these Best Practices released today, please find contact information below:
The Future of Privacy Forum is a non-profit organization that serves as a catalyst for privacy leadership and scholarship, advancing principled data practices in support of emerging technologies. Learn more about FPF by visiting www.fpf.org.