Brainwaves and Boundaries: The Privacy Debate with Neurotechnology

March 18, 2025 | Mahsa Gardaneh, JD candidate, Osgoode Hall Law School; M. Imtiaz Karamat, associate, Deeth Williams Wall LLP

Neurotechnology is rapidly advancing into the consumer market, with devices like brain-computer interfaces (BCIs) and electroencephalogram (EEG) headsets becoming more accessible to individuals. While these devices offer groundbreaking possibilities, the technology's rapid growth has triggered alarm bells across the globe over whether adequate privacy protections are in place for consumer data. This article explores the rise of neurotechnology, its privacy implications, and global regulatory efforts to address these challenges.

Neurotechnology is reshaping how humans interact with the world by interfacing directly with the brain to monitor, interpret, and influence neural activity. Once confined to the medical sector, neurotechnology has surged into the consumer space, igniting a new era of innovation and societal transformation.

Neurotechnological tools monitor or modify brain and nervous system activity. These devices may capture neural signals using techniques like EEG, which directly measures the electrical signals of neurons, or functional magnetic resonance imaging (fMRI), which indirectly measures brain activity by tracking blood flow. At the forefront of this space are BCIs, which connect the brain to external devices and facilitate communication between the user and their environment. For example, certain deep brain stimulators help patients with Parkinson's disease regain control of their movement, and, in 2023, a BCI helped a stroke survivor speak again.

While medical applications have long dominated neurotechnology, the field is expanding into the consumer sector. For example, Muse, a wearable EEG headset created by the Canadian company InteraXon Inc., is offered to consumers for guided meditation and deep sleep. The device senses the wearer's brain activity and plays audio feedback to help them achieve their meditation or sleep goal (e.g., playing ocean waves to calm the wearer when distracted during meditation). Another example is the U.S. company Neuralink Corp., which aims to merge human cognition with artificial intelligence (AI). In a recent demonstration, a Neuralink brain implant allowed a paralyzed individual to play chess on a laptop using only their thoughts. Companies such as Meta and Apple are also working on their own consumer-grade neurotechnology, such as earbuds and AR glasses, to monitor brain activity and adapt user experiences in real time. In fact, certain industry reports predict that the worldwide market for neurotechnology products will grow to $17.1 billion by 2026.

Concerns with Responsible Innovation

At the center of this technology lies neural data, a unique category of information relating to an individual's nervous system activity with the potential to reveal sensitive insights about the data subject, including their mental state, emotions, health, and neural processing. Although this type of information may be useful in the consumer context, the sensitive nature of neural data, combined with its rapid proliferation in novel consumer technology, raises public concern about unprecedented privacy risks. Unlike other types of personal information, neural data is uniquely connected to an individual's identity and cannot be easily replaced or updated when compromised. There is therefore added pressure to put proper regulatory and industry protections in place to support responsible innovation.

Over-Collection of Neural Data

One of the primary concerns in this space is the over-collection of neural data by neurotechnology providers. Many consumer neurotechnology companies may take a catch-all approach when collecting neural data from users, retaining much more information than is necessary for their product's function. A recent analysis by the NeuroRights Foundation estimated that consumer neurotechnology companies may use as little as 1/10,000th of the collected information for their product's function, with the surplus information being unnecessarily retained.

Disclosure to Third Parties

Another major issue is the sharing of neural data with third parties. In a review of privacy policies from 30 neurotechnology companies, researchers found that 66% of the companies' policies mentioned sharing collected data with third parties. This includes sharing for purposes beyond the immediate use of the device, such as marketing, research, or undisclosed commercial objectives, and it raises concerns about how third parties might use, repurpose, or monetize the sensitive information they receive from consumer product companies. Although similar issues may arise with other data-collecting consumer products, the problem is especially acute for neural data, which, when combined with contextual information (e.g., location, application usage, or behavioural patterns), can reveal intimate personal details unique to the data subject, such as emotional responses and cognitive preferences.

Lack of Transparency

Privacy risks are further exacerbated by a potential lack of transparency from neurotechnology companies. In a recent review of these companies, the NeuroRights Foundation found that many rely on vague descriptions or simply lack adequate language to inform consumers of their data handling practices. Additionally, a separate review of neurotechnology companies' privacy policies found that few provide clear assurances about how data is secured, who has access, or what safeguards are in place against breaches.

The above privacy gaps highlight the urgent need for stronger legal guardrails for the growing consumer neurotechnology industry.

What is Canada Doing?

Neurotechnology is on Canada's radar, but significant regulatory reform has yet to take place. When asked about the matter by the CBC, the Office of the Privacy Commissioner of Canada (OPC) stated that it considers neural data a form of biometric information, meaning the protections of the Personal Information Protection and Electronic Documents Act extend to neural data.

Canadian regulators are also actively researching and developing guidance for the neurotechnology field. The OPC recently held a public consultation on draft guidance for biometric technologies, which is expected to be released soon and, based on the above comment, should encompass neural data. Additionally, Health Canada is in the process of drafting guidelines for neurotechnology innovation, and the Information and Privacy Commissioner of Ontario anticipates releasing a research report on neurotechnology later in 2025.

Neurotechnology was also recently added to Canada's Sensitive Technology List, which covers technologies the Canadian government considers sensitive and relevant to national security.

Global Action

Outside of Canada, other jurisdictions are making inroads to address concerns for neurotechnology and neural data.

United States of America

On April 17, 2024, Colorado enacted HB 1058, which amended its consumer privacy legislation, the Colorado Privacy Act (CPA), in response to growing neurotechnology privacy risks. Through HB 1058, Colorado recognized the sensitive nature of neural data and the difficulty of obtaining meaningful consent from individuals for the collection, use, and disclosure of neural data in the consumer context. The bill expands the CPA's definition of "sensitive data" to include "biological data" (a category that covers data generated from an individual's neural properties, including neural data), thereby extending the CPA's application to businesses that process neural data.

Following Colorado’s lead, on September 28, 2024, California passed a bill to amend the California Consumer Privacy Act (CCPA) to explicitly include “neural data” under its definition of “sensitive personal information”. While the CCPA already referenced biometric information, this amendment ensures that neural data is expressly covered under its scope.

Among the protections offered under the revised state laws, both acts impose obligations on businesses to prevent the unauthorized sale or sharing of data generated in connection with neurotechnology, and require them to describe the type of neural data they collect, along with its use and disclosure.

Europe

Data protection authorities in Europe are beginning to scrutinize consumer-grade neurotechnology and its potential privacy implications.

Notably, in 2023, the United Kingdom Information Commissioner’s Office (ICO) published a report analyzing neurotechnologies from a regulatory standpoint, including associated use cases and potential privacy risks. More recently, on June 3, 2024, the European Data Protection Supervisor and Spanish data protection authority (the AEPD) released a joint report titled, “TechDispatch on Neurodata”. The report discusses challenges in regulating neural data and raises fundamental questions, such as whether a new category of human rights, called neurorights, should be established. It also highlights that the European Union’s Charter of Fundamental Rights “…already expressly acknowledges the fundamental right to mental integrity (Article 3), as an extension of the fundamental right to human dignity (Article 1), which forms the foundation for privacy rights including protection of personal data (respectively, Article 7 and 8 of the Charter).”

Central and South America

Central and South America, however, are clearly the pioneers in this space. Chile made international headlines with the passing of Law No. 21.383 in October 2021, which amended Article 19.1 of the country's constitution to expressly protect brain activity in the context of technological development. In 2023, Chile's Supreme Court (the Court) applied this constitutional protection in a landmark decision in which the plaintiff, Guido Girardi, alleged that his neural data had been inappropriately collected by the U.S.-based company Emotiv through its EEG-measuring headset. The device was marketed for detecting users' internal conditions (e.g., stress), but the collected neural data was also used for scientific and research purposes, something Girardi claimed he had not consented to. The Court ultimately found in favor of Girardi, ordering Emotiv to erase his data.

Other countries are following Chile's lead. In December 2023, the Brazilian state of Rio Grande do Sul amended its constitution to recognize neurorights. Legislative reform is also underway in Mexico, Costa Rica, Colombia, Argentina, and Uruguay, as these countries move to implement legal protections for this rapidly growing area.

About the Authors

Mahsa Gardaneh is a JD candidate at Osgoode Hall Law School.

Imtiaz Karamat is an associate at Deeth Williams Wall LLP, practicing in the areas of intellectual property, information technology, cybersecurity, technology contracting, privacy, and regulatory law.

Any article or other information or content expressed or made available in this Section is that of the respective author(s) and not of the OBA.