Organizations must lead with privacy and ethics when researching and implementing neurotechnology: FPF and IBM Live Event and Report Release
A New FPF and IBM Report and Live Event Explore Questions About Transparency, Consent, Security, and Accuracy of Data
The Future of Privacy Forum (FPF) and the IBM Policy Lab released recommendations for promoting privacy and mitigating risks associated with neurotechnology, specifically brain-computer interfaces (BCIs). The new report provides developers and policymakers with actionable ways this technology can be implemented while protecting the privacy and rights of its users.
“We have a prime opportunity now to implement strong privacy and human rights protections as brain-computer interfaces become more widely used,” said Jeremy Greenberg, Policy Counsel at the Future of Privacy Forum. “Among other uses, these technologies have tremendous potential to treat people with diseases and conditions like epilepsy or paralysis and make it easier for people with disabilities to communicate, but these benefits can only be fully realized if meaningful privacy and ethical safeguards are in place.”
Brain-computer interfaces are computer-based systems that are capable of directly recording, processing, analyzing, or modulating human brain activity. The sensitivity of data that BCIs collect and the capabilities of the technology raise concerns over consent, as well as the transparency, security, and accuracy of the data. The report offers a number of policy and technical solutions to mitigate the risks of BCIs and highlights their positive uses.
“Emerging innovations like neurotechnology hold great promise to transform healthcare, education, transportation, and more, but they need the right guardrails in place to protect individuals’ privacy,” said IBM Chief Privacy Officer Christina Montgomery. “Working together with the Future of Privacy Forum, the IBM Policy Lab is pleased to release a new framework to help policymakers and businesses navigate the future of neurotechnology while safeguarding human rights.”
FPF and IBM have outlined several key policy recommendations to mitigate the privacy risks associated with BCIs, including:
Rethinking transparency, notice, terms of use, and consent frameworks to empower people around uses of their neurodata;
Ensuring that BCI devices are not used to influence decisions about individuals that have legal effects, livelihood effects, or similar significant impacts, such as assessing the truthfulness of statements in legal proceedings; inferring thoughts, emotions, psychological state, or personality attributes as part of hiring or school admissions decisions; or assessing individuals’ eligibility for legal benefits;
Promoting an open and inclusive research ecosystem by encouraging the adoption of open standards for the collection and analysis of neurodata and the sharing of research data with appropriate safeguards in place.
Policymakers and other BCI stakeholders should carefully evaluate how existing policy frameworks apply to neurotechnologies and identify potential areas where existing laws and regulations may be insufficient for the unique risks of neurotechnologies.
FPF and IBM have also included several technical recommendations for BCI devices, including:
Providing hard on/off controls for users;
Allowing users to manage the collection, use, and sharing of personal neurodata on devices and in companion apps;
Offering heightened transparency and control for BCIs that send signals to the brain, rather than merely receive neurodata;
Utilizing best practices for privacy and security to store and process neurodata and use privacy enhancing technologies where appropriate; and
Encrypting sensitive personal neurodata in transit and at rest.
FPF-curated educational resources, policy & regulatory documents, academic papers, thought pieces, and technical analyses regarding brain-computer interfaces are available here.
Read FPF’s four-part series on Brain-Computer Interfaces (BCIs), providing an overview of the technology, use cases, privacy risks, and proposed recommendations for promoting privacy and mitigating risks associated with BCIs.
The Future is Open: The U.S. Turns to Open Banking
FPF is pleased to work with a broad set of stakeholders on concepts around privacy and open banking. For more information on our new Open Banking Working Group and related projects, please contact Jeremy Greenberg: [email protected].
Open banking is a concept that describes banks and other financial institutions, such as credit unions, providing rights to customers over their financial data, including the ability to access, share, or port data to third parties for various services.
The inherent tensions found in open banking between privacy, competition, and data portability requirements mirror similar concerns across the spectrum of Big Data.
Current challenges to a widespread and healthy open banking ecosystem in the U.S. include a lack of harmonized rules and principles for maintaining strong privacy protections involving financial data and an absence of standardized technical architecture.
The Consumer Financial Protection Bureau (CFPB) will take the lead on facilitating open banking in the U.S. and crafting rules regarding data protection and security; the CFPB should consider lessons learned from international approaches.
Open banking proponents and policymakers should be mindful of the unique sensitivity of financial information and the complex data protection risks raised by increased sharing of banking data—even when sharing is directed by consumers.
Introduction
In July 2021, President Biden signed the Executive Order on Promoting Competition in the American Economy. The Executive Order takes a “whole of government approach” to enforcing antitrust laws across the economy, with clear implications for data protection and privacy. Notably, the order encourages the Consumer Financial Protection Bureau (CFPB) to consider crafting rules under section 1033 of the Dodd-Frank Act in support of open banking with the goal of making it easier for consumers to safely switch financial institutions and use novel and innovative financial products while maintaining privacy and security.
The Order’s callout signals that the Biden administration views open banking as an important initiative for promoting consumer choice, fostering competition, and protecting consumers’ privacy. The debate around open banking highlights tensions between privacy and competition along with a number of privacy flashpoints including: data portability, access, sharing, transparency, control, and interoperability.
Open Banking Can Provide New Rights and Benefits to Consumers and Help Spur Competition, But Technical and Privacy Challenges Remain
Open banking is a concept that describes banks and other financial institutions, such as credit unions, providing rights to customers over their financial data, including the ability to share data or permissions over their data with third parties for various services. These rights include the right to access their financial data, port their data and switch financial institutions, and grant permission to third parties to carry out transactions and provide financial services to best meet a customer’s needs. For example, individuals could grant access to their financial data to a third party to complete an automated payment or provide tailored financial planning advice based on a consumer’s individual finances or credit history. Proponents of open banking argue that another benefit is increased competition among financial institutions. Firms entering the financial sector may offer novel services that spur competition across the industry.
One current challenge to a widespread and healthy open banking ecosystem in the U.S. is a lack of harmonized rules and principles for maintaining strong privacy protections involving financial data. As a result, some traditional banking institutions concerned with maintaining strong customer privacy might be hesitant to support an open banking ecosystem that lacks clear and strong privacy rules and principles that equal, or exceed, the current financial privacy and security protections afforded to consumers by regulations such as the Gramm-Leach-Bliley Act (GLBA) or the Fair Credit Reporting Act (FCRA).
Another roadblock to widespread and privacy-protective open banking is the need for standardized technical architecture, particularly interoperable APIs, to enable the safe portability of financial data. A standardized and interoperable API would allow third parties to carry out their services on behalf of customers without accessing certain personal information, such as login credentials. In the absence of widely adopted secure APIs, third parties sometimes turn to screen scraping to perform services, collecting customer login credentials and other personal information and creating privacy and security risks, such as exposing consumer information in a data breach or enabling consumer impersonation. While industry efforts such as the Financial Data Exchange (FDX) API are underway, the lack of standardized rules and technical standards, such as machine-readable file rules, can push third parties toward less privacy-protective methods of accessing data.
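The privacy difference between screen scraping and a permissioned API can be sketched in a few lines. The sketch below is purely illustrative (the class and method names are invented, not part of the FDX specification or any real bank API): the bank issues a scoped access token, and the third party retrieves only the data the customer permitted, without ever handling the customer's login credentials.

```python
import secrets

class Bank:
    """Toy bank that issues scoped, credential-free access tokens."""

    def __init__(self):
        # Hypothetical account data; "ssn" stands in for fields a third party should never see.
        self._accounts = {"alice": {"balance": 1200, "ssn": "do-not-share"}}
        self._tokens = {}  # token -> (customer, authorized scopes)

    def issue_token(self, customer, scopes):
        # The customer authorizes specific scopes; login credentials never leave the bank.
        token = secrets.token_urlsafe(16)
        self._tokens[token] = (customer, set(scopes))
        return token

    def get_data(self, token, scope):
        customer, scopes = self._tokens[token]
        if scope not in scopes:
            raise PermissionError(f"token not authorized for {scope!r}")
        return self._accounts[customer][scope]

bank = Bank()
# A budgeting app is granted read access to the balance only -- no SSN, no password.
token = bank.issue_token("alice", scopes=["balance"])
print(bank.get_data(token, "balance"))  # prints 1200
```

A screen scraper, by contrast, would log in with the customer's own credentials and see everything the customer can see, which is precisely the exposure scoped tokens avoid.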
The CFPB Will Continue Taking the Lead on Facilitating Open Banking in the U.S., While Considering Lessons Learned from International Approaches
Prior to the Executive Order, the CFPB had taken some preliminary steps to promote safe open banking in the U.S. In 2017, the agency released a set of broad non-binding principles intended “to help foster the development of innovative financial products and services, increase competition in financial markets, and empower consumers to take greater control over their financial lives.” Key areas of focus include: data access (enabling consumers to obtain financial information in a timely manner without being compelled to share account credentials with third parties); informed consent (in which consumers understand terms and conditions and can readily revoke authorizations granted to third parties); payment authorization (in which third parties are required to obtain specific authorization for distinct activities); and efficient and effective accountability mechanisms (incentivizing stakeholders to prevent, detect, and resolve unauthorized access, sharing, and payments), among several other areas.
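Two of the CFPB principles above, specific payment authorization and readily revocable consent, can be modeled concretely. This is a hypothetical sketch of my own, not drawn from any CFPB rule or reference implementation: each third-party grant names a distinct activity, and the consumer can revoke any grant at any time.

```python
class AuthorizationRegistry:
    """Toy registry of consumer-granted third-party authorizations."""

    def __init__(self):
        self._grants = set()  # (third_party, activity) pairs

    def grant(self, third_party, activity):
        # Payment-authorization principle: one grant per distinct activity.
        self._grants.add((third_party, activity))

    def revoke(self, third_party, activity):
        # Informed-consent principle: consumers can readily revoke access.
        self._grants.discard((third_party, activity))

    def is_authorized(self, third_party, activity):
        return (third_party, activity) in self._grants

registry = AuthorizationRegistry()
registry.grant("budget-app", "read-transactions")
print(registry.is_authorized("budget-app", "read-transactions"))  # prints True
registry.revoke("budget-app", "read-transactions")
print(registry.is_authorized("budget-app", "read-transactions"))  # prints False
```

Because each grant is keyed to a single activity, a third party authorized to read transactions cannot silently initiate payments under the same authorization.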
The CFPB next weighed in on the issue in 2020 when it held the CFPB Symposium: Consumer Access to Financial Records, where experts discussed many of the concepts highlighted in the agency’s principles. Following the symposium, in October 2020, the CFPB initiated an Advance Notice of Proposed Rulemaking on consumer access to financial records and how the agency might develop rules for implementing section 1033 of the Dodd-Frank Act; this is the same rulemaking effort highlighted in the Executive Order. The agency sought comments on the costs and benefits of open banking and on how it might handle many of the data protection-related concepts outlined in its 2017 principles, including access, control, privacy, security, and standard setting. The CFPB has not issued a final rule or concluded the rulemaking, but the agency recently listed data sharing in its current regulatory agenda. Beyond the CFPB, the Federal Reserve, the Federal Deposit Insurance Corporation (FDIC), and the Office of the Comptroller of the Currency (OCC) released a Proposed Interagency Guidance on Third Party Relationships: Risk Management, focusing on how banks manage risks in their third-party relationships with fintech companies, vendors, and other affiliates. While other regulators are involved in this space, the CFPB appears poised to return to its rulemaking effort as a near-term priority.
While the U.S. is serious about responsibly regulating and setting standards for open banking, other jurisdictions are already well down this path. In 2015, the EU adopted an updated Payment Services Directive (PSD2), which went into effect in 2018. PSD2 aims to promote competition, privacy, and data transfer between EU countries and institutions. However, some PSD2 requirements, such as rules around consent, differ significantly from requirements found in the GDPR and other European laws, leading to a lack of harmonization and confusion for consumers, regulators, and financial institutions. Other leading open banking approaches include recent efforts in the UK, Australia, Brazil, Israel, India, Canada, and Mexico, among others. The technical standards and requirements around open banking will likely have to be harmonized across regimes to accommodate the cross-border nature of the global economy.
Open Banking Highlights Broader Questions about Data Portability, Competition, and Cross-Border Data Flows
While the Executive Order sends a trumpet blast to regulators, consumers, and financial stakeholders that open banking is a priority area for the current administration, many of the data protection themes at play are much broader than open banking and touch multiple industries. The inherent tension in open banking between privacy and competition (the need to keep data private and in trusted hands versus new players obtaining access to or control over data for various purposes) exists across the spectrum of Big Data. Further, open banking helps animate the current debate and recent interest around data portability requirements from agencies such as the FTC. Ultimately, interoperable rules and technical measures are necessary not only for beneficial and safe open banking, but also for other international and cross-border data exchanges.
Brain-Computer Interfaces: Privacy and Ethical Considerations for the Connected Mind
A forthcoming FPF and IBM report focusing on BCI privacy and ethics will be published in November 2021.
FPF-curated educational resources, policy & regulatory documents, academic papers, thought pieces, and technical analyses regarding brain-computer interfaces are available here.
Introduction
Brain-computer interfaces (BCIs) are a prime example of an emerging technology that is spawning new avenues of human-machine interaction. Communication interfaces have developed from the keyboard and mouse to touchscreens, voice commands, and gesture interactions. As computers become more integrated into the human experience, new ways of commanding computer systems and experiencing digital realities have grown in popularity, with novel uses ranging from gaming to education.
Defining BCIs and Neurodata
BCIs are computer-based systems that directly record, process, analyze, or modulate human brain activity in the form of neurodata that is then translated into an output command from human to machine. Neurodata is data generated by the nervous system, composed of the electrical activities between neurons or proxies of this activity. When neurodata is linked, or reasonably linkable, to an individual, it is personal neurodata.
BCI devices can be either invasive or non-invasive. Invasive BCIs are installed directly into—or on top of—the wearer’s brain through a surgical procedure. Today, invasive BCIs are mainly used in the health context. Non-invasive BCIs rely on external electrodes and other sensors or equipment connected to the external surface of the head or body to collect and modulate neural signals. Consumer-facing BCIs primarily use various non-invasive methods, including headbands.
Key Applications and Top-of-Mind Privacy and Ethical Challenges
Some BCI implementations raise few, if any, privacy issues. For example, individuals using BCIs to control computer cursors might not reveal any more personal information than typical mouse users do, provided BCI systems promptly discard cursor data. However, some uses of BCI technologies raise important questions about how laws, policies, and technical controls can safeguard inferences about individuals’ brain functions, intents, or emotional states. These questions are increasingly salient in light of the expanded use of BCIs in:
Gaming – where BCIs augment existing gaming platforms and offer players new ways to play using devices that record and interpret their neural signals.
Employment – where BCIs monitor workers’ engagement to improve safety during high-risk tasks, alert workers or supervisors of dangerous situations, modulate workers’ brain activity to improve performance, and provide tools to more efficiently complete tasks.
Education – where BCIs can track student attention, identify students’ unique needs, and alert teachers and parents of student learning progress.
Neuromarketing – where marketers use BCIs to intuit consumers’ moods and to gauge product and service interest.
Military – where governments are researching the potential of BCIs to help rehabilitate soldiers’ injuries and enhance communication.
It is important for stakeholders in this space to delineate between the current and near future uses and the far-distant notions depicted by science fiction creators. The realistic view of capabilities is necessary to credibly identify urgent concerns and prioritize meaningful policy initiatives. While the potential uses of BCIs are numerous, BCIs cannot at present or in the near future “read a person’s complete thoughts,” serve as an accurate lie detector, or pump information directly into the brain.
As BCIs evolve and are more commercially available across numerous sectors, it is paramount to understand the real risks such technologies pose. BCIs raise many of the same risks posed by home assistants, medical devices, and wearables, but implicate new and heightened risks associated with privacy of thought, resulting from recording, using, and sharing a variety of neural signals. Risks include, but are not limited to:
Collecting, and potentially sharing, sensitive information related to individuals’ private emotions, psychology, or intent;
Combining neurodata with other personal information to build increasingly granular and sensitive profiles about users for invasive or exploitative uses, including behavioral advertising;
Making decisions that significantly impact patients, employees, or students based on information drawn from neurodata (with distinct risks whether the conclusions drawn are accurate or inaccurate);
Security breaches compromising patient health and individual safety and privacy;
A lack of meaningful transparency and personal control over individuals’ neurodata; and
Surveilling individuals based on the collection of sensitive neurodata, especially from historically and heavily surveilled communities.
These technologies also raise important ethical questions around fairness, justice, human rights, autonomy, and personal dignity.
A Mix of Technical and Policy Solutions Is Best for Maximizing Benefits While Mitigating Risks
To promote privacy-protective and ethical uses of BCIs, stakeholders should adopt technical measures including but not limited to:
Providing hard on/off controls whenever possible;
Providing granular user controls on devices and in companion apps for managing the collection, use, and sharing of personal neurodata;
Operationalizing best practices for security and privacy when storing, sharing, and processing neurodata including:
Encrypting sensitive personal neurodata in transit and at rest; and
Embracing appropriate security measures to combat bad actors.
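The "in transit and at rest" recommendation above can be illustrated with a minimal sketch. The cipher below is a toy XOR keystream built from SHA-256 and is for illustration only; a real BCI device should use a vetted authenticated cipher (e.g., AES-GCM via a library such as `cryptography`) for storage and TLS for transport. The EEG reading shown is invented sample data.

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key || nonce || counter blocks.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # Fresh nonce per message, so identical readings never produce identical ciphertext.
    nonce = secrets.token_bytes(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))

key = secrets.token_bytes(32)
reading = b'{"channel": "Fp1", "uV": -12.4}'  # hypothetical EEG sample
stored = encrypt(key, reading)                # what lands on disk or on the wire
assert decrypt(key, stored) == reading
```

The design point is that the key, not the device or the transport, is the boundary: anyone who obtains the stored blob without the key sees only the random-looking nonce and ciphertext.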
Stakeholders should also adopt policy safeguards including but not limited to:
Rethinking transparency, notice, terms of use, and consent frameworks to empower users with a baseline of BCI literacy around the collection, use, sharing, and retention of their neurodata;
Engaging IRBs, corporate review boards, ethical oversight, and other independent review mechanisms to identify and mitigate risks;
Facilitating participatory and inclusive community input prior to and during BCI development and rollout;
Creating dynamic technical, policy, and employee training standards to account for the gaps in current regulation; and
Promoting an open and inclusive research ecosystem by encouraging the adoption, where possible, of open standards for the collection and analysis of neurodata and the sharing of research data under open licenses and with appropriate safeguards in place.
Conclusion
Because the neurotechnology space is especially future-facing, developers, researchers, and policymakers will have to create best practices and policies that consider existing concerns and strategically prioritize future risks in ways that balance the need for proactive solutions while mitigating misinformation and hype. BCIs will likely augment and complicate many existing technologies that are currently on the market, and privacy professionals will have to stay abreast of recent developments to protect this quickly growing space.
Join FPF For XR Week: April 19th-23rd, 2021
Adoption of augmented and virtual reality hardware and software technologies – collectively known as extended reality or “XR” – is taking hold among businesses and individuals. If you’d like to engage in the discussion about the ethical and privacy considerations of XR tech, join our XR Week activities April 19th to 23rd!
After decades of development, demonstrations, and improvements to hardware and software, immersive technologies are increasingly being implemented in education and training, gaming, multimedia, navigation, and communication. Emerging use cases will let individuals explore complicated moral dilemmas or experience a shared digital overlay of the physical world in real time. But XR technologies typically cannot function without collecting sensitive personal information – data that can create privacy risks.
FPF’s XR Week will explore key privacy and ethical questions surrounding augmented reality (AR), virtual reality (VR), and related immersive technologies. The week will feature several events, including a roundtable discussion with expert participants and several conversations hosted in virtual reality.
April 19th, 1:00 – 1:20PM EDT: Reel Virtuality
To kick off XR Week, FPF Policy Counsel and lead on XR technology Jeremy Greenberg and FPF Vice President of Policy John Verdi will discuss a report, Augmented Reality + Virtual Reality: Privacy & Autonomy Considerations in Emerging, Immersive Digital Worlds, to be released on the same day. Greenberg and Verdi will discuss the differences between various immersive technologies, primary use cases, and key privacy and ethical questions. The conversation, originally recorded in Real VR Fishing, can be viewed in 2-D on LinkedIn Live – register for the event on LinkedIn to receive a notification when it begins.
April 21st, 2:00 – 3:30PM EDT: AR + VR: Privacy & Autonomy Considerations for Immersive Digital Worlds
Our featured XR Week event, AR + VR: Privacy & Autonomy Considerations for Immersive Digital Worlds, will include a conversation between FPF Policy Counsel and lead on XR technology Jeremy Greenberg and Facebook Reality Labs Director of Policy James Hairston. A panel, moderated by Greenberg, will discuss the recorded conversation. Panelists will include:
Ana Lang, Senior Vice President, General Counsel, Magic Leap.
Joe Jerome, Director of Platform Accountability & State Advocacy, Common Sense Media.
Jessica Outlaw, Behavioral Scientist, The Extended Mind.
April 22nd, 1:00 – 1:10PM EDT: Sculpting XR Compliance
On the Thursday of XR Week, Greenberg and BakerHostetler Data Protection Attorney Carolina Alonso will discuss the legal compliance challenges associated with XR technologies. The conversation, originally recorded in SculptrVR, can be viewed in 2-D on LinkedIn Live – register for the event on LinkedIn.