Biometric technology has long been used for security and law enforcement purposes such as national security watch lists, passport controls, criminal fingerprint databases, and immigration processing. Now, however, the private sector increasingly uses these systems as an authentication method where a PIN or password was previously required. Apple’s decision to include a fingerprint scanner in the iPhone in 2013 brought new public awareness to possible non-law-enforcement applications of biometric technologies, and the company’s shift to facial recognition access in the most recent models further normalized the concept. Biometric technology continues to be adopted in many sectors, including financial services, transportation, health care, computer systems and facility access, and voting. In many cases, this technology is more efficient, less expensive, and easier to use than traditional alternatives, while also eliminating the need for passwords, which are broadly recognized as an insufficiently secure safeguard for user data. However, as with any digital system, there are privacy concerns around the collection, use, storage, sharing, and analysis of the data generated by these systems.
Lawyers are trained to respond to risks that threaten the market position or operating capital of their clients. However, when it comes to AI, it can be difficult for lawyers to provide the best guidance without some basic technical knowledge. This article draws key insights from our collective experience to help lawyers feel more at ease responding to AI questions when they arise.
BCIs are computer-based systems that directly record, process, analyze, or modulate human brain activity in the form of neurodata, which is then translated into an output command from human to machine. Neurodata is data generated by the nervous system, composed of the electrical activity between neurons or proxies of that activity. When neurodata is linked, or reasonably linkable, to an individual, it is personal neurodata.
Digital identity systems vary in complexity. At its most basic, a digital ID would simply recreate a physical ID in a digital format, whereas a fully integrated digital identity system would provide a platform for a complete wallet and verification process, usable both online and in the physical world.
By Jeremy Greenberg, [email protected], and Katelyn Ringrose, [email protected] Key FPF-curated background resources – policy & regulatory documents, academic papers, and technical analyses regarding brain-computer interfaces – are available here. Recently, Elon Musk livestreamed an update for Neuralink, his startup centered on creating brain-computer interfaces (BCIs). BCI is an umbrella term for devices that detect, amplify, and translate […]
Authors: Hannah Schaller, Gabriela Zanfir-Fortuna, and Rachele Hendricks-Sturrup Around the world, governments, companies, and other entities are either using or planning to rely on thermal imaging as an integral part of their strategy to reopen economies. The announced purpose of using this technology is to detect potential cases of COVID-19 and filter out individuals in […]
By Stacey Gray, Pollyanna Sanderson, and Katelyn Ringrose Download a printable version of this report (pdf). As Congress continues to work toward drafting and passing a comprehensive national privacy law, state legislators are not slowing down. In Washington State, a new comprehensive privacy law is moving quickly: last week, the Washington Privacy Act (SSB 6281) […]
On Friday, June 14, FPF submitted a letter to the New York State Assembly and Senate supporting a well-crafted moratorium on facial recognition systems for security uses in public schools.
By Michelle Bae and Jeremy Greenberg Privacy professionals seeking clarity on compliance with the California Consumer Privacy Act (CCPA) are monitoring numerous amendment bills introduced in the California State Assembly and the California State Senate. Twelve bills garnered the votes needed to pass the Assembly and moved to the Senate for further revision and voting. […]
Today, the Future of Privacy Forum submitted comments to the Washington State Senate Ways & Means Committee on the proposed Washington Privacy Act, Senate Bill 5376. FPF takes a “neutral” position regarding the Bill and makes a few important points. FPF commends the Bill’s sponsors for addressing a broad set of individual data protection rights. […]
Understanding AI and its underlying algorithmic processes presents new challenges for privacy officers and others responsible for data governance in companies ranging from retailers to cloud service providers. In the absence of targeted legal or regulatory obligations, AI poses new ethical and practical challenges for companies that strive to maximize consumer benefits while preventing potential harms.