Privacy Features of iOS 12 and macOS Mojave

With much media attention focused on new Apple hardware, including new iPhones, Apple also released updated versions of its mobile and desktop operating systems for public download this week. The software upgrades (iOS 12 for iPhones, and macOS 10.14 Mojave for desktop Macs) bring many new features, such as Group FaceTime, options to customize notifications, and aesthetic changes such as an optional desktop “Dark Mode.”

Amidst these upgrades, what’s new for data privacy? Consumers are increasingly aware of privacy issues, and Apple has articulated the company’s commitment to privacy “as a human right.” Meanwhile, regulators are entering the consumer privacy debate, with this week’s Senate hearing bringing further attention to the data practices and policy positions of leading technology companies.

In its fall updates, Apple improves several existing privacy controls for iPhone users, and macOS 10.14 brings privacy-focused technical modifications. Several of these updates were first announced at Apple’s June 2018 Worldwide Developer Conference (WWDC), which we discussed (along with major updates to Google’s Android P) earlier this summer; they have now been released to the public.

Below, we provide round-ups of privacy updates in iOS 12, macOS 10.14 (Mojave), and the App Store Review Guidelines.

Privacy Updates in iOS 12 

The following privacy updates are included in iOS 12, which can be downloaded on devices as old as the iPhone 5s and iPad Air.

Privacy Updates in macOS 10.14 (Mojave)

Privacy Updates for Developers (App Store Review Guidelines)

In addition to technical updates to its operating systems, Apple has also made significant changes to its App Store Review Guidelines, the rules for how developers may collect and use personal information from users. These Guidelines, which apply to the third-party developers who provide apps through the App Store, can be very influential when paired with robust oversight. In May, Apple began removing apps from the App Store for violations of policies against sharing location data with third-party advertisers without users’ consent. In August, Apple removed apps from the App Store that violated its policies against collecting data to build user profiles or contact databases.

Updates to the Guidelines include:

“Customer trust is the cornerstone of the App Store’s success. Apps should never prey on users or attempt to rip-off customers, trick them into making unwanted purchases, force them to share unnecessary data, raise prices in a tricky manner, charge for features or content that are not delivered, or engage in any other manipulative practices within or outside of the app.” (5.5)

Summary

Amidst new hardware and design features, Apple has introduced important technical updates to iOS 12 and macOS Mojave that aim to empower users to better manage their information. Developers should also take note of significant changes to the App Store Review Guidelines, which help determine the ways in which apps and app partners can collect and handle user data. Together, these changes give users the ability to make more informed decisions that reflect their privacy preferences.

FPF Releases Understanding Facial Detection, Characterization, and Recognition Technologies and Privacy Principles for Facial Recognition Technology in Commercial Applications

Today, FPF publishes the infographic Understanding Facial Detection, Characterization, and Recognition Technologies along with Privacy Principles for Facial Recognition Technology in Commercial Applications.

These resources will help businesses and policymakers better understand and evaluate the growing use of face-based biometric technology systems when used for consumer applications.  Facial recognition technology can help users organize and label photos, improve online services for visually impaired users, and help stores and stadiums better serve customers.  At the same time, the technology often involves the collection and use of sensitive biometric data, requiring careful assessment of the data protection issues raised. Understanding the technology and building trust are necessary to maximize the benefits and minimize the risks.

These Principles define a benchmark of privacy requirements for those commercial situations where technology collects, creates, and maintains a facial template that can be used to identify a specific person – enabling the beneficial applications and services, while providing the necessary protections for individuals.

Government use of facial recognition technology has drawn a great deal of recent attention: in border control, in law enforcement, and in combatting terrorism. Others have published extensive reports on these topics.  Some companies have called for regulation of how governments use these technologies, and others have refused to license their facial recognition systems to the government until racial disparities and other accuracy challenges are overcome.  Many have criticized the US government’s failure to ensure the fairness of its own biometric systems and have protested law enforcement agencies that seek exemptions from long-standing accuracy requirements.

FPF agrees that we need public discussion and thoughtful regulatory action regarding government use of facial recognition technologies.  These important issues deserve careful consideration and action in the context and history of government surveillance, algorithmic fairness, and equity; they are beyond the scope of our Privacy Principles for commercial use.  We do not make recommendations regarding these issues in the publications we release today, but we look forward to participating in the important policy discussions with civil society and government that are necessary.

The consumer-facing applications of facial recognition technology continue to evolve, and the technology will certainly be used in new ways in the future.  FPF’s Privacy Principles for Facial Recognition Technology in Commercial Applications includes seven core privacy principles that address the concerns surrounding the personally identifiable information (PII), in this case templates of individual faces, collected by these systems.  We expect these Principles will be used by companies as a resource for the development, refinement, and implementation of facial recognition technology in commercial settings.

The Privacy Principles include:

As retailers and other private companies increasingly employ various levels of facial scanning technology online and in person, and as applications for photo organizing, tagging, and sharing grow, it is time to push for greater consensus on the appropriate privacy standards for commercial use cases and on the protections consumers should reasonably expect.

Equally relevant is the need to expand stakeholders’ awareness and understanding of the many types of facial scanning systems, as well as the impact of accuracy differences among the many systems available today. It is important to understand the distinctions between facial detection systems (which, when properly designed, neither create nor implicate any personally identifiable information) and full-scale facial identification programs (which match a person’s image to a database in order to identify the individual to a store clerk or stadium employee who otherwise wouldn’t recognize them).

FPF’s graphic Understanding Facial Detection, Characterization, and Recognition Technologies summarizes the key distinctions between facial scanning technologies for easy reference. It relates each technology to its common use cases, benefits, concerns, and risk of identifiability, and it also outlines the minimum recommended notice and consent requirements and the operator’s responsibilities.

In the case of facial characterization, for example, no PII is typically retained.  However, characterization technologies can inform businesses whether individuals are smiling or frowning, male or female, and old or young.  That means people may be treated differently – men and women may see different ads displayed on a sign, or there may be customized offers for parents or older individuals – in which case it’s important to understand that the system has been tailored to identify characteristics but not a unique identity. In these cases, notice is the key requirement, as well as attention to ensure there is no discriminatory activity.  But in the examples that do involve creation of facial templates and identification, the guidance calls for increasing levels of consent.

Understanding Facial Detection, Characterization and Recognition Technologies