FPF Director of AI & Ethics Testifies Before Congress on Facial Recognition


WASHINGTON, D.C. – In a hearing today before the House Committee on Oversight and Reform, Future of Privacy Forum (FPF) Senior Counsel and Director of AI and Ethics Brenda Leong testified on the privacy and ethical implications of the commercial use of facial recognition technology.

“Technology has only accelerated the practice of identification and tracking of people’s movements, whether by governments, commercial businesses, or some combination thereof, leading to real concerns about an ultimate state of ubiquitous surveillance,” wrote Leong. “How our society faces these challenges will determine how we move further into the conveniences of a digital world, while continuing to embrace our fundamental ideals of personal liberty and freedom.”

In her testimony, Leong emphasized that “not every camera-based system is a facial recognition system,” and that the term “facial recognition” is often broadly and confusingly used in reference to other image-based technology that does not necessarily involve individual identification.

“Understanding how particular image-analysis technology systems work is a critical foundation for effectively understanding and evaluating the risks of facial recognition,” Leong noted in her written testimony. To help educate policymakers, consumers, and others about the varying levels of facial image software and associated benefits and risks, and privacy implications of each, FPF created the infographic, Understanding Facial Detection, Characterization, and Recognition Technologies.

Leong outlined a set of privacy principles created by FPF that should be considered the foundation of any facial recognition-specific legislation, writing, “consent remains the critical factor, and should be tiered based on the level of personal identification collected or linked, and the associated increasing risk levels.” Leong highlighted that the default standard for consent should be an “opt-in” or “affirmative consent” model consistent with existing FTC guidelines.

As educational institutions across the country, including colleges and public school districts, consider the use of facial recognition technology on campus, Leong pointed to guidance in the privacy principles that calls for policymakers to: “Give special consideration to the age, sophistication, or degree of vulnerability of those individuals, such as children, in light of the purposes for which facial recognition technology is used, including whether additional levels of transparency, choice, and data security are required.” She also testified that “there is no good justification for the use of facial recognition in a K-12 school.”

In 2019, FPF held a webinar about facial recognition in schools and wrote to the New York State Legislature in support of a well-crafted moratorium on facial recognition systems for security uses in public schools, while cautioning against overly broad bans or language that might have unintended consequences on other security programs.

In her written testimony, Leong cited controversial developments surrounding the implementation of passports and the requirement that they include a photo, resistance to calls for a federally issued national ID card, and REAL ID requirements for state licensing as precedent for policymakers seeking to balance individual rights and freedoms with efficiencies and security.

“These historical discussions reflect the ongoing need to determine the appropriate balance of technological, legal and policy standards and protections, along with the underlying threshold question of whether some systems are simply too high risk to implement regardless of perceived benefits,” wrote Leong.

To read Leong’s written testimony, click here. For an archived livestream of the committee hearing, visit https://oversight.house.gov/. To learn more about the Future of Privacy Forum, visit www.fpf.org, where you can also explore FPF’s facial recognition and biometrics work, including relevant resources, recommended reading, and other materials.

CONTACT: [email protected]