Old Laws & New Tech: As Courts Wrestle with Tough Questions under US Biometric Laws, Immersive Tech Raises New Challenges
Extended reality (XR) technologies often rely on users’ body-based data, particularly information about their eyes, hands, and body position, to create realistic, interactive experiences. However, data derived from individuals’ bodies can pose serious privacy and data protection risks for people. It can also create substantial liability risks for organizations, given the growing volume of lawsuits under the Illinois Biometric Information Privacy Act (BIPA) and the Federal Trade Commission’s (“FTC” or “Commission”) scrutiny of biometric data practices, reflected in its recent Policy Statement. At the same time, there is considerable debate and little consensus about what counts as biometric data under existing state privacy laws, creating significant uncertainty for regulators, individuals, and organizations developing XR services.
This blog post explores the intersection of US biometric data privacy laws and XR technologies, particularly whether and to what extent the specific body-based data that XR devices collect and use may be considered “biometric” under various data protection regimes. We observe that:
- Face templates, hand scans, and iris scans used to authenticate an individual’s identity are regulated biometrics, so those XR use cases are covered by biometric laws.
- Laws with broad definitions of biometrics may apply to systems that use face detection, as seen in emerging case law from Illinois regarding virtual try-on XR applications.
- Organizations have taken steps to reduce their liability risk for face-based biometric systems, including by minimizing the collection of identifying data or by processing biometric data on individuals’ devices.
- Other body-based data not used for identification in XR, like eye-tracking and voice analysis, may also be considered “biometric” if the technology and data are capable of identifying an individual.
A. Face Templates, Hand Scans, and Iris Scans Used to Authenticate an Individual’s Identity Are Regulated Biometrics, So User Authentication in XR Is Covered by Biometric and Comprehensive Privacy Laws
In the US, there are three state laws dedicated to biometric data privacy: the Illinois Biometric Information Privacy Act (BIPA), the Texas Capture and Use of Biometric Identifier Act (CUBI), and the Washington Biometric Privacy Protection Act (BPPA). Other states like California, Connecticut, and Virginia also maintain comprehensive data privacy laws that regulate biometric data, and the FTC regulates “unfair and deceptive” biometric data practices. But because BIPA is the only biometric privacy law that provides a private right of action to any person “aggrieved” by a violation, Illinois cases abound, and Illinois courts play a fundamental role in determining the scope of what policymakers, companies, and consumers consider to be “biometric data.”
Except for CUBI (and, to a certain extent, BIPA), most biometric and comprehensive data privacy statutes tie their definitions of “biometric data” to identification, meaning the laws are intended to regulate unique physiological characteristics that entities use to identify an individual. Generally, each biometric and comprehensive law focuses on five forms of biometric data: retina or iris scans, fingerprints, voiceprints, hand scans, and face scans. BIPA, in particular, applies to “biometric identifiers,” defined as a “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry,” as well as “biometric information,” which includes “any information…based on an individual’s biometric identifier used to identify an individual.” Therefore, any entity that uses technology to scan an individual’s iris, finger, hand, or face to uniquely identify an individual (1:many) or authenticate their identity (1:1) must comply with BIPA’s requirements, unless it falls within one of BIPA’s exemptions or exclusions. The same conclusion applies to CUBI, BPPA, and comprehensive data privacy laws.
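To make the 1:many/1:1 distinction concrete, the sketch below uses the open-source face_recognition library to derive face templates and compare them. The image files and the enrolled gallery are hypothetical placeholders, not any vendor’s actual pipeline.

```python
# Minimal sketch of 1:1 authentication vs. 1:many identification using
# the open-source face_recognition library. File names and the gallery
# are hypothetical placeholders.
import face_recognition

# Enrollment: derive a numeric face template (a 128-value encoding),
# the kind of "scan of face geometry" that biometric laws regulate.
# (Assumes exactly one face is found in each image.)
enrolled = face_recognition.face_encodings(
    face_recognition.load_image_file("enrollment_photo.jpg"))[0]
probe = face_recognition.face_encodings(
    face_recognition.load_image_file("login_capture.jpg"))[0]

# 1:1 authentication: compare the probe against one claimed identity.
authenticated = face_recognition.compare_faces([enrolled], probe)[0]

# 1:many identification: compare the probe against an entire gallery
# of enrolled users to determine who this is.
gallery = {"user_a": enrolled}  # hypothetical enrolled-user gallery
identified = {
    name: face_recognition.compare_faces([template], probe)[0]
    for name, template in gallery.items()
}
```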
XR devices often use iris, face, or hand scans to authenticate a user’s identity, whether to log in to a profile or to enable in-app payments. Much like computers or smartphones, a single XR device may have more than one user, so authenticating the specific person using the device at a given time allows for more personalization and secure transactions. As a result, iris or face authentication systems in XR devices are likely covered by U.S. biometric and comprehensive data privacy laws. These laws typically require organizations to obtain user consent before enrolling individuals in this sort of biometric XR system, and BIPA has potentially thorny provisions requiring “written consent,” which can be challenging to implement for many XR applications. The face and eye scans that XR technologies use for authentication may also be considered “sensitive data” under comprehensive data privacy laws, such as the California Privacy Rights Act (CPRA) or the Connecticut Data Privacy Act (CTDPA), requiring organizations to provide individuals with opt-out rights, including the right to opt out of data sales and other transfers.
Most XR authentication technologies employ live capture of biometrics. Iris, face, or hand scans are captured in real time when an individual first enrolls, and subsequent scans are likewise captured in real time when the individual authenticates their identity to the device or app. These scenarios are typically covered by biometric laws as described above. However, there is some uncertainty regarding biometric laws’ application to XR devices that create a biometric template from non-real-time photos, videos, and audio recordings. Most biometric and comprehensive privacy laws exclude photos, videos, and audio recordings from the scope of “biometric data” to varying degrees (with the exception of CUBI and the CPRA). Utah’s and Virginia’s comprehensive privacy laws, for example, broadly exempt photographs “or any data generated therefrom” from coverage, making their biometric regulations perhaps less likely to apply to photographic scanning. But case law under BIPA shows that these provisions may not exclude “biometric templates” derived from non-real-time photos, videos, or audio recordings. In Monroe v. Shutterfly, the United States District Court for the Northern District of Illinois concluded that narrowly reading “biometric identifier” to mean only real-time scans would swallow the intent of the law, so photographic scanning to create “templates” was still within scope. Laws that do not exclude photos or data generated from them when an entity uses them for identification purposes, such as the CTDPA and the final rules for the Colorado Privacy Act (CPA), as well as CUBI, which contains no exemption for photographs at all, are likely to follow this analysis. The FTC’s conception of biometric information similarly and explicitly encompasses photos, videos, audio recordings, and certain data derived from these sources, making it likely that most regulators will still consider “biometric templates” created from photographic scanning subject to applicable biometric regulations.
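The technical intuition behind this case law is that template extraction is indifferent to its source: the same step yields the same kind of identifying template whether the input is a live camera frame or an old stored photo. A minimal sketch, again using face_recognition with hypothetical file names:

```python
# Sketch: the same template-extraction step runs identically on a live
# camera frame and on a stored photograph, which is why courts have
# declined to read "biometric identifier" as limited to real-time scans.
import face_recognition

def make_template(path: str):
    """Return a 128-value face encoding (a "biometric template")."""
    image = face_recognition.load_image_file(path)
    return face_recognition.face_encodings(image)[0]  # assumes one face

live_template = make_template("camera_frame.jpg")     # real-time capture
photo_template = make_template("uploaded_photo.jpg")  # non-real-time photo
# Both are equally usable for identification.
```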
B. Laws with Broad Definitions of Biometrics May Apply to Systems that Use Face Detection, as Seen in Emerging Case Law from Illinois Regarding Virtual Try-On XR Applications
Although most laws aim to regulate biometric data that is uniquely identifying, several statutes’ text can be interpreted to apply to biometric technologies that merely distinguish a face from other objects or analyze facial characteristics, without identifying a particular individual. Depending on a privacy law’s definition of “biometric data,” courts may hold that the term covers technologies that utilize data derived from an individual’s face, eyes, or voice even when they are not used for identification purposes. In XR, devices may use inward-facing cameras to conduct facial analysis for non-identification purposes, such as rendering expressive avatars. Augmented reality (AR) products like “virtual try-on” (VTO) may also use facial analysis to let people visualize how different products, like eyeglasses, might look on them. Like many other XR applications, VTO primarily uses facial scans to detect and correctly align the product with an individual’s physical features, rather than for identification purposes.
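To illustrate the distinction this section turns on, the sketch below performs face detection (locating a face so a virtual product can be aligned to it) without any recognition step, using OpenCV’s bundled Haar cascade. The image file is a hypothetical placeholder, and real VTO systems typically use richer facial-landmark models for alignment.

```python
# Sketch of face *detection* as distinct from face *recognition*:
# the detector returns bounding boxes only; no identity template is
# created or compared.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

frame = cv2.imread("try_on_frame.jpg")  # hypothetical camera frame
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# Bounding boxes are enough to anchor virtual eyeglasses to the face.
faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in faces:
    # A VTO app would render the product over this region.
    cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
```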
Some laws with broad definitions can apply to these non-identification technologies unless a specific exception applies. CUBI does not require “biometric identifiers” to uniquely identify an individual, which has prompted the Texas Attorney General to claim that CUBI applies to the capture of face geometry regardless of whether an entity uses these facial maps for individual identification. The FTC’s conception of biometric technologies is also broad, encompassing “all technologies that use or purport to use biometric information for any purpose.” Most notably, BIPA is complex because its definition of “biometric identifiers” does not explicitly require that the data be used for identification (in contrast to the statute’s definition of “biometric information,” which does). As a result, Illinois courts have largely found that any facial scan may create a “biometric identifier,” as in cases involving doorbell cameras, photo grouping, and Snapchat filters. This is true even when the technology’s facial scan feature was not used to identify the individual in the photo or video frame.
Recent BIPA lawsuits brought against companies that offer VTO illustrate how broad biometric laws might apply to XR devices that use facial analysis. In Theriot v. Louis Vuitton North America, Inc., a federal court permitted BIPA claims to proceed against Louis Vuitton’s VTO sunglasses application, finding that the technology’s use of facial scans was analogous to BIPA case law holding that face scans derived from photographs constitute biometric identifiers. Other VTO cases have had similar outcomes. Only VTO technology used for healthcare-related purposes, such as trying on prescription eyeglasses, has been found by courts to be outside the scope of BIPA. This result did not rest on BIPA’s overall definition of biometric data, but rather arose from a narrow exception for “information captured from a patient in a health care setting.” So while BIPA may not apply to medical providers’ use of XR apps or other immersive technologies, such as brain-computer interfaces (BCIs), for diagnostic purposes, BIPA’s coverage of non-identifying, non-medical uses remains a source of substantial confusion. This confusion undermines individuals’ understanding of their privacy rights and presents liability risks for organizations.
C. Organizations May Reduce Their Liability Risk by Minimizing Collection of Identifying Data or Processing Biometric Data on Individuals’ Devices
Some organizations have taken steps to limit their liability risks by minimizing the collection of identifying data or processing biometric data on individuals’ devices. Case law suggests that some facial detection technologies fall outside the scope of BIPA and other biometric regulations if (1) there is no mechanism for the technology to retain facial scans or link scans to a user’s individual identity or account; and/or (2) all of the data is stored on-device.
First, in Daichendt and Odell v. CVS Pharmacy, the Northern District of Illinois dismissed a case against CVS over its passport photo system, which scans facial geometry in photos to confirm that they meet government requirements for passports (e.g., the person’s eyes are open, their mouth is closed and not smiling, and no eyeglasses are present). The court held that the plaintiffs failed to allege that CVS’s photo system enabled CVS to determine their identities, or that they provided CVS “with any information, such as their names or physical or email addresses, that could connect the voluntary scans of face geometry with their identities.”
Separately, in Barnett v. Apple, an Illinois appellate court held that Apple was not subject to BIPA requirements for its Face ID feature on iPhone because the company was not “collecting” or “possessing” users’ biometric data: the data was stored entirely on the device and never on Apple’s servers. Thus, XR devices that do not retain facial scans that can be linked to users’ accounts, or that only store data on-device (such as Apple’s recently announced Vision Pro), may be out of scope of even some of the broadest biometric laws.
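A conceptual sketch of the design pattern Barnett credits appears below: the enrolled template lives only on the device, matching happens locally, and only a pass/fail result is ever exposed. The class, method names, and threshold are hypothetical illustrations, not any vendor’s actual implementation.

```python
# Hypothetical sketch of on-device biometric matching: the template is
# held only in local device storage and never transmitted; callers see
# only a boolean outcome.
import numpy as np

class OnDeviceAuthenticator:
    def __init__(self):
        self._enrolled = None  # template lives only on the device

    def enroll(self, template: np.ndarray) -> None:
        self._enrolled = template

    def authenticate(self, probe: np.ndarray, threshold: float = 0.6) -> bool:
        # Match locally; raw biometric data never leaves the device.
        if self._enrolled is None:
            return False
        return bool(np.linalg.norm(self._enrolled - probe) < threshold)

# An app or server sees only the True/False result, never the template.
```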
D. Eye Tracking and Voice Analysis May Also Be Considered “Biometric” if the Technology and Data Are Capable of Identifying an Individual
In addition to face-based biometric technologies, most XR devices also use other forms of body-based detection or characterization for device functionality, such as voice analysis and eye tracking. As with facial detection, these features are designed to detect or make predictions about bodily characteristics or behavior, but the subject is typically not identifiable and personally identifiable information (PII) is typically not retained. For example, XR devices often contain microphones that capture a user’s voice and surroundings, which can enable voice commands, verbal interactions with other users, spatial mapping, and realistic sound effects. XR devices may also maintain inward-facing cameras that collect data about a user’s gaze (where they look and for how long) to enable eye tracking. This may be used to improve graphics and allow for more expressive avatars, including avatars that can display microexpressions.
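For a sense of what this gaze data looks like, here is a hypothetical sketch of a single sample; the field names are illustrative, not any vendor’s actual telemetry schema.

```python
# Hypothetical representation of one gaze sample from an XR headset's
# inward-facing eye-tracking cameras.
from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp_ms: int          # when the sample was captured
    direction_x: float         # normalized horizontal gaze direction
    direction_y: float         # normalized vertical gaze direction
    fixation_ms: int           # how long the user has dwelled here
    pupil_diameter_mm: float   # often captured alongside gaze

# Streams of such samples power foveated rendering and expressive
# avatars, and are the raw material of the legal questions below.
```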
Whether these systems that collect voice or gaze data are covered by biometric or comprehensive data privacy laws may depend on whether an organization could use the technology to identify an individual, even if it is not used in that capacity. As seen in CVS Pharmacy, many Illinois courts focus on the capacity of the technology to identify an individual. As an initial matter, biometric and comprehensive privacy laws typically apply to “voiceprints,” not to voice recordings. As the Illinois Attorney General has stated, “a voiceprint, which is a record of mechanical measurement, is not the same as a simple recording of a voice.”
However, the line between a voice recording and a voiceprint is blurry, particularly in the gray area of natural language processing (NLP), a kind of artificial intelligence (AI) that can use audio to understand, interpret, and manipulate language. In Carpenter v. McDonald’s Corp., the U.S. District Court for the Northern District of Illinois found that McDonald’s drive-through voice assistant technology could be used for identification purposes, and thus could be considered a “voiceprint” under BIPA, because its patent application states that the technology may capture voice characteristics “like accent, speech pattern, gender, or age for the purpose of training the AI.” In a similar ongoing case against Petco, an Illinois federal judge permitted BIPA claims to proceed regarding employee voice data, stating “[w]hat matters [at the dismissal stage] is not how defendant’s software actually used plaintiffs’ data, but whether the data that Petco gathered was capable of identifying [the employees].” As a result, if an XR device captures vocal characteristics that are capable of unique identification, certain voice data may be considered a “voiceprint” under BIPA. This analytical framework will likely apply in jurisdictions that define biometric data to include biological characteristics with the potential to identify an individual, such as under the final rules for the Colorado Privacy Act regarding biometric identifiers or the FTC’s Policy Statement on Biometric Information.
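To make the recording/voiceprint distinction concrete, the sketch below derives a fixed-length measurement from a raw recording using the librosa audio library. MFCC features stand in for whatever vocal characteristics a real speaker-recognition system would use; the file name is a hypothetical placeholder.

```python
# Sketch of the recording-vs.-voiceprint distinction: the waveform is
# the "simple recording," while the derived vector is a "record of
# mechanical measurement" of the kind courts treat as a voiceprint.
import librosa
import numpy as np

# The raw recording, analogous to a photo or video.
waveform, sample_rate = librosa.load("voice_command.wav", sr=16000)

# A derived measurement: per-frame MFCCs summarized into a 20-value
# vector. Real speaker-ID systems use richer embeddings, but the legal
# point is the same: this is data derived from the recording that can
# characterize, and potentially identify, the speaker.
mfcc = librosa.feature.mfcc(y=waveform, sr=sample_rate, n_mfcc=20)
voiceprint_like = np.mean(mfcc, axis=1)
```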
Whether privacy laws apply to gaze data, however, is even less clear. BIPA lawsuits involving online exam proctoring services, autonomous vehicles, and “smart advertising screens” suggest that eye tracking could be a biometric identifier under BIPA, even if it is not used for identification. In each of these cases, the technology conducted eye tracking to determine where a user was looking, whether at the screen, the road, or a store display, but did not identify the individual. Instead, these technologies made inferences about whether someone might be cheating or not paying attention to the road, or about which product they were looking at. Plaintiffs in these cases argue that eye tracking is part of the technology’s collection and analysis of facial geometry, thus making it a “biometric identifier” under BIPA.
Unfortunately, state and federal courts in Illinois have not analyzed whether and to what extent eye tracking, without additional face analysis, constitutes a biometric identifier, nor whether it is a subset of facial analysis. Rather, most cases proceed based solely on the software’s overall facial analysis features, if at all. If courts are prone to equate facial detection scans with “facial geometry,” and voice analysis with “voiceprints,” they may also conflate eye tracking with “a retina or iris scan,” and thus treat eye tracking as a biometric identifier. Or they may follow the BIPA plaintiffs’ analysis, lumping eye tracking into facial analysis as “facial geometry.” Alternatively, courts could characterize eye tracking as altogether separate from BIPA’s “facial geometry” and “retina or iris scan” categories. In any event, as with voice analysis, if an XR device collects gaze data that could be used for identification purposes, laws with broad biometric definitions will apply, while laws with narrower definitions focused on the data or technology’s current use may exclude the technology.
Takeaways
Statutory language and court opinions vary in how they define biometric data and identifiers, and in how they apply those definitions. Though the plain text of most U.S. biometric and comprehensive data privacy laws ties the definition of a “biometric” to the identification of an individual, some laws may be applied more broadly to technologies that use body-based data for non-identification purposes. While most of the body-based data XR collects is not used for identification, litigation brought under BIPA and other state laws suggests that lawmakers and judges may consider certain kinds and uses of such data, for example AR “facial scans,” eye tracking, and voice, to be biometrics. Whether this will be the case (or continue to be the case) depends on how policymakers draft these laws, and how courts, enforcement bodies, and other parties to litigation interpret statutes regulating biometrics.