BCI Commercial and Government Use: Gaming, Education, Employment, and More
This post is the third in a four-part series on Brain-Computer Interfaces (BCIs), providing an overview of the technology, use cases, privacy risks, and proposed recommendations for promoting privacy and mitigating risks associated with BCIs.
Click here for FPF and IBM’s full report: Privacy and the Connected Mind. In case you missed them, read the first and second blog posts in this series. The first post unpacks BCI technology, while the second analyzes BCI applications in healthcare and wellness, the risks associated with these applications, and the implicated legal regimes. Additionally, FPF-curated resources, including policy & regulatory documents, academic papers, thought pieces, and technical analyses regarding brain-computer interfaces are here.
I. Introduction: What are BCIs?
BCIs are computer-based systems that directly record, process, or analyze brain-specific neurodata and translate these data into outputs. Those outputs can be used as visualizations or aggregates for interpretation and reporting, and/or as commands to control external interfaces, influence behaviors, or modulate neural activity. BCIs can be broadly divided into three categories: 1) those that record brain activity; 2) those that modulate brain activity; and 3) those that do both, also called bi-directional BCIs (BBCIs).
BCIs can be invasive or non-invasive and employ a number of techniques for collecting neurodata and modulating neural signals. Neurodata is data generated by the nervous system, consisting of the electrical activity between neurons or proxies of that activity. This neurodata may be “personal neurodata” if it is reasonably linkable to an individual.
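To make the three categories above concrete, the sketch below models the record, decode, and modulate steps as a toy pipeline. It is purely illustrative and not drawn from the report; every class, function, and threshold in it is a hypothetical assumption.

```python
# Illustrative sketch only: a simplified view of the record -> decode -> modulate
# loop shared by the three BCI categories. All names and values are hypothetical.
from dataclasses import dataclass
from typing import List


@dataclass
class NeuralSample:
    timestamp: float
    channel_values: List[float]  # raw electrical activity, or a proxy of it


def decode(samples: List[NeuralSample]) -> str:
    """Translate recorded neurodata into an output command (toy rule)."""
    mean_activity = sum(sum(s.channel_values) for s in samples) / max(len(samples), 1)
    return "MOVE_CURSOR" if mean_activity > 0.5 else "NO_OP"


def run_recording_bci(samples: List[NeuralSample]) -> str:
    # Category 1: record brain activity and translate it into an output.
    return decode(samples)


def run_modulating_bci(command: str) -> None:
    # Category 2: modulate neural activity (e.g., select a stimulation pattern).
    print(f"stimulation pattern selected for: {command}")


def run_bidirectional_bci(samples: List[NeuralSample]) -> None:
    # Category 3 (BBCI): record, decode, then modulate in a closed loop.
    run_modulating_bci(run_recording_bci(samples))


if __name__ == "__main__":
    readings = [NeuralSample(float(t), [0.6, 0.7]) for t in range(10)]  # hypothetical readings
    run_bidirectional_bci(readings)
```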
II. BCIs are Entering the Commercial and Enterprise Market in the Fields of Gaming, Employment, Education, and Other Future-Facing Areas.
Gaming: BCIs could augment existing gaming platforms and offer players new ways to play using devices that record and interpret their neural signals. Current examples of BCI gaming combine neurotechnology with existing gaming devices or platforms, attempting to record and interpret the player’s brain signals during play. While most gaming BCIs are single-player, researchers are exploring whether BCIs can provide multiplayer experiences using multi-person, non-invasive brain-to-brain interfaces (BBIs). One example of a multiplayer BCI is BrainNet, in which three participants exchange neural signals to play a Tetris-like game. BCIs can also be applied to augment games on extended reality (XR) devices.
Today’s BCI games are not fully immersive experiences. Players can use neurotechnology to perform only discrete actions. Future BCI games may offer greater immersion by combining neurodata with other biometric and psychological information, which could allow players to control in-game actions using their conscious thoughts.
Employment: BCIs can monitor worker engagement to improve safety, alert workers or supervisors of dangerous situations, and help make operational or employment decisions. Life and AttentivU are examples of BCIs that track and promote worker attentiveness during tasks. These BCIs can also provide notifications when an employee exhibits fatigue or drowsiness. Other employment BCIs measure neurodata to determine a worker’s emotional state. Management could choose to use this neurodata to gauge efficiency, manage workloads, determine worker happiness levels, or make hiring, firing, or promotion decisions.
Employment BCIs can also be used to modulate workers’ brain activity in order to improve performance. Transcranial direct current stimulation (tDCS), for example, could be used to promote multitasking ability. Invasive BCIs, such as Elon Musk’s Neuralink, are also being evaluated for their potential to increase efficiency during high-pressure and time-sensitive tasks.
Education: BCI technology could be implemented in learning environments to gather student neurodata. This neurodata could reveal whether a student finds an assignment challenging, creating opportunities to adjust the amount and level of work, or it could help teachers and parents assess and improve classroom engagement.
Future-Facing Fields: Smart Cities, Connected Vehicles, and Neuromarketing: BCIs could be applied to augment activities in other contexts. Researchers are exploring the possibility of integrating BCIs into smart cities and communities to enhance public safety, city and transportation efficiency, and energy monitoring. BCIs could also provide new methods for controlling connected vehicles and determining driver attention.
In neuromarketing, researchers have used neurotechnology to record physiological and neural signals with varying degrees of accuracy. Recorded neurodata can reveal a consumer’s mood, motivations, and preferences when buying and using a product or service. Product makers and advertisers can use this data to better understand consumer choices.
III. Privacy and Other Risks Associated With BCIs in Gaming, Employment, Education, and Future-Facing Fields: From Profiling to Neurodata-based Decision Making.
BCI applications in these spaces present common and area-specific risks and considerations.
Powered-up Profiling: Gaming and neuromarketing BCIs involve neurodata collection, including user reactions to content in a virtual world. AI and machine learning models can be trained on this neurodata, in combination with other biological responses to content, to associate user-specific changes in neural signals with certain physiological states. Neurodata could therefore facilitate the creation of granular profiles of individuals. Because neurodata can capture an individual’s reactions to sensitive content, these profiles may offer intimate portraits of the user’s health, sexual preferences, and even vices.
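As a rough illustration of the profiling step described above, the sketch below trains an off-the-shelf classifier to map neural-signal features to inferred user states. It is not from the report: the features (EEG band powers plus heart rate), the state labels, and the synthetic data are all hypothetical assumptions.

```python
# Illustrative sketch only: shows how, in principle, a classifier could map
# pre-extracted neural features to inferred user states for profiling.
# The features, labels, and data here are hypothetical.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical dataset: each row is one gameplay segment, with EEG band-power
# features plus heart rate standing in for "other biological responses to content."
X = rng.normal(size=(500, 4))        # [alpha_power, beta_power, theta_power, heart_rate]
y = rng.integers(0, 3, size=500)     # inferred state: 0 = calm, 1 = engaged, 2 = stressed

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

# Once trained, the same pipeline could run continuously on a player's live
# neurodata, yielding a fine-grained profile of inferred states over time.
print("held-out accuracy:", model.score(X_test, y_test))
```

The point of the sketch is that nothing about the modeling is exotic; once neurodata is collected and labeled, standard tooling is enough to turn it into an ongoing profile of inferred states.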
Organizations could use these profiles to make inferences and decisions. Recognizing this neurodata’s value, organizations that collect and retain neurodata across sectors may also be incentivized to share it with or sell it to advertisers. Advertisers could use this information to deliver more targeted behavioral ads, which could encourage unhealthy habits.
Lack of Transparency and Control Over Disclosure: Unlike some other sources of personal information, users cannot control the electrical impulses that create neurodata. Whether participating in BCI games or acting online more generally, users are therefore often unaware of neurodata tracking. This means users have less control over personal neurodata flows, which increases the likelihood that this data will be used for purposes unrelated to those for which it was collected. Even when a person has control over neurodata monitoring, for example through opt-in consent requirements, the individual may feel compelled to share neurodata with someone (e.g., an employer) to avoid retaliation or disparate treatment.
Neurodata-Based Decision Making and BCI Accuracy: The volume and sensitivity of neurodata generated in entertainment, employment, education, and neuromarketing make it a tempting input for important decisions. These decisions could affect many aspects of a person’s life, from the content a user receives in virtual game worlds to whether an employee is promoted or discharged. Concerns about neurodata informing decisions are exacerbated when BCIs collect inaccurate data. Decisions informed by inaccurate neurodata may contribute to diverse harms, including the perpetuation of feedback loops that fuel societal division.
Chilling Speech and Creating Distrust in Institutions: BCI-enabled monitoring may chill speech and reduce trust in institutions among employees, students, and the general public. Employees who know that they are constantly monitored may place less trust in their employer, lose morale, or refrain from certain behavior. Monitoring may cause students, especially those from communities that have been historically targeted by surveillance or who have learning differences, to refrain from certain speech and thoughts in order to avoid retaliation or stigmatization. BCIs incorporated into smart city infrastructure could likewise generate new sources of personal data and enable more invasive surveillance.
IV. Regulations that Might Cover BCIs and Neurodata Include Comprehensive Privacy Laws, Sectoral Privacy Laws, and Self-Regulatory Frameworks.
Comprehensive Privacy Laws and Agency Authority: Both US and foreign comprehensive privacy laws may regulate BCI use and the processing of neurodata. The EU’s General Data Protection Regulation (GDPR) and the California Privacy Rights Act (CPRA) define biometric information broadly, meaning that neurodata may fall within these laws’ scope. However, both laws are framed in terms of whether the data is, or could be, used to single out an individual. Concepts such as Brittan Heller’s “biometric psychography” (information from the body used to determine interests, not identity) may not be interpreted as covered, because such information is not used, and could not be used, to facilitate identification.
If triggered, the GDPR and CPRA impose obligations on regulated organizations and grant rights to data subjects. Neurodata processing may implicate special rules under these laws. For example, an organization using personal neurodata in marketing could trigger the CPRA’s opt-out right for “cross-context behavioral advertising.” While US law generally gives companies significant discretion when writing privacy policies affecting at-will employees, the GDPR indicates that a worker’s consent generally cannot serve as a lawful basis for processing the employee’s personal data, given the imbalance of power in the employment relationship. US administrative agencies may also have powers enabling them to police certain BCI applications. The Federal Trade Commission (FTC), for example, has authority to investigate and pursue penalties against organizations for unfair and deceptive practices, such as those related to advertising.
Sectoral Privacy Laws: The Children’s Online Privacy Protection Act (COPPA) may apply to game operators if they collect, use, or disclose “personal information” and either direct their games toward children under 13 or have actual knowledge that such children are using the game. Whether gaming BCIs are regulated under COPPA depends in part on the meaning of “personal information.” Neurodata collected by gaming BCIs could be “personal information” under COPPA if it is considered a “persistent identifier” (one category of personal information) or if the FTC amends the definition of “personal information” to cover biometric data. COPPA gives parents and guardians rights over their children’s personal information, including access and deletion rights. The statute also imposes obligations on operators, such as obtaining parental consent before collecting information from a child.
Biometric-specific state laws in the US, such as Illinois’ Biometric Information Privacy Act (BIPA), may impact neurodata processing across sectors. Whether these laws apply, however, depends on the meaning of “biometric identifiers.” The term matters under BIPA because “biometric information” is defined as information based on an individual’s biometric identifier. While other state biometric laws, such as Washington’s statute, contain broad definitions, BIPA defines “biometric identifier” narrowly to include “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” Neurodata-based information used as an identifier will therefore more likely fall outside of BIPA’s scope, since it is not a listed “biometric identifier.”
BCIs used to monitor workers may implicate employment law. The Electronic Communications Privacy Act (ECPA) limits some types of employee monitoring. However, ECPA permits employers to monitor workplace communications, especially when those conversations take place on company devices like company-owned computers and telephones. Anti-discrimination laws, like the Americans with Disabilities Act (ADA), may stop employers from using BCI results in hiring and firing decisions if the results reflect a disability.
Federal, state, and local student data laws may grant rights to students and parents while imposing requirements on schools and neurotech companies with respect to the processing of personal neurodata. BCI use may be affected by the Family Educational Rights and Privacy Act (FERPA), which protects education records, including biometric records, at schools that receive federal funding. A student’s personal neurodata could be part of this record and would therefore receive FERPA protections. These protections include rights for parents and for students aged 18 and older, along with obligations on schools. All 50 states and Washington, DC have introduced student privacy legislation, and some of these laws could impact BCI use in schools. District- and school-level rules may also affect neurodata collection and processing.
Self-regulatory initiatives: Beyond laws and agency enforcement, voluntary self-regulation also shapes the use of BCIs. Neuromarketing is one example: the Neuromarketing Science & Business Association’s (NMSBA’s) Code of Ethics identifies several commitments, including consent and transparency, that organizations should follow when using BCIs for neuromarketing purposes.
V. Conclusion
Commercial and government BCIs could deliver dividends ranging from novel gaming experiences to more efficient workforces. However, such applications also create privacy risks. While the law could affect how these technologies are used, the limited scope of existing rules means that certain BCI applications are not addressed by current regulatory structures.
Read the next blog post in the series: Technical and Policy Recommendations to Mitigate Privacy Risks