Another frontier in the privacy landscape is emerging as countries like the U.S. address deficiencies in how sensitive medical data is processed by third parties outside the scope of the Health Insurance Portability and Accountability Act (HIPAA) and other legislative protections.

Around the world, leading neuroscientists, neuroethicists, privacy advocates and legal minds are taking greater interest in brain data and its potential.

Opinions vary widely on the long-term trajectory of technology designed to measure brain activity and its impact on society, as new products trickle out of clinical settings and gain traction for commercial applications.

Some say alarm bells should already be sounding and argue the technology could have corrosive effects on democratic society. Others counter that such claims are hyperbolic, given the uncertainty over whether the technology can even measure certain brain activity in the ways purported.

Today, neurotechnology is primarily confined to medical and research settings, where clinical-grade devices monitor the brain activity of patients who may suffer from mental illness or paralysis, gauge muscle movement and record electroencephalography, the measurement of electrical activity in the brain.

However, neurotechnology is gaining traction in commercial applications as Big Tech companies invest in wearable products they claim can collect a variety of brain data, such as measures of attentiveness and basic emotional states.

The continued development of neurotechnology, and the debates surrounding its ethical use and potential, have spurred a broader conversation about the possible need to protect data gleaned from individuals' brain activity.

Neurotechnology and neurorights 

"I intentionally don’t call this neurorights or brain rights. I call it cognitive liberty," Duke University Law and Philosophy Professor Nita Farahany said during a LinkedIn Live session. "There is promise of this technology, not only for people who are struggling with a loss of speech and loss of motor activity, but for everyday people."

The jumping-off point for the panel was Farahany’s new book, "The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology," which examines the neurotechnology landscape and the potential negative outcomes of leaving it without regulatory oversight.

Farahany was motivated to write the book because she saw a "chasm" between what she thought neurotechnology was capable of and the reality that some companies are working to one day decode people's inner thoughts on some level.

"I believe (the term cognitive liberty) covers a lot more than just neurotechnology picking up activity from your brain," Farahany said. "When I started (seeing) major tech companies beginning to buy and invest in neural sensors that could become the FitBits for our brain … I started to realize a lot of the things I worried about for a long time … could become a reality."

SmartCap, a wearable device company that measures brain activity, claims its LifeBand technology — which has been deployed at more than 500 companies worldwide — can measure worker fatigue levels in heavy industries like construction and mining.

Farahany said attention-monitoring neurotechnology has the potential to be less intrusive on individuals' privacy than other so-called "bossware" monitoring tools, such as surveillance cameras or keystroke monitoring. She said it could prove useful as long as appropriate safeguards ensure the technology extracts only specific data, such as a miner’s fatigue level for safety reasons, without collecting excessive "raw" brain data that could be processed for other purposes.

"The extent to what you can detect from the number of sensors is not incredibly high," she said. "You can detect emotional levels … but the algorithms detecting them have gotten a lot better … so pattern recognition gets better and better all the time at decoding things that are happening in brains."

Farahany said brain data sits "on a spectrum," with how one presents physiologically, such as tired or alert, on the less risky end and "unuttered thoughts" decoded by future neurotechnologies on the more dangerous end.

Using that spectrum, Farahany said any commercial collection of brain data raises questions about workers' ability to refuse to wear neurotechnology devices, what employers can do with the data they collect and whether more sensitive brain data, such as an employee's inner monologue about their displeasure at work, could get swept up in the deployment of some future wearables.

"In the workplace, I’m particularly worried about it given the (surveillance) dynamics: The inability to refuse (monitoring technology) and the fact it is going to become more ubiquitous and widespread," Farahany said. "Also because even in laws in the United States that seem to provide some protection, they mostly just require disclosure and transparency."

The pace at which neurotechnology generates more raw brain data for processing means the ability to decode that data to uncover a person's inner thoughts may one day become a reality, and an actual policy dilemma, instead of a science fiction plot.

Fewer than 40 people in the world today use a surgically implanted neurotechnology device to assist with serious health conditions, such as paralysis. Meanwhile, there are no wearable brain data decoding devices on the consumer market, Farahany said.

"Every reaction and fatigue level and automatic brain-based thought you have isn’t complex thought," she said. "How we think about how we protect (brain data) is going to be more nuanced, because if you are literally decoding unuttered thoughts from my brain — which, by the way, no neurotechnology right now can do — imagine a world where that is possible. I think of that as complex thought for which freedom of thought ought to apply."

Mind-reading potential?

Other stakeholders in neurotechnology believe the current state of emotion-evaluating technology does not produce data meaningful enough to meet its intended purposes, and that there is no linear path by which any kind of mind-reading technology advances to the point of actually extracting one’s inner thoughts.

"Mind reading as it is conceptualized in the popular imagination through science fiction is not possible," Institute of Neuroethics Founder Karen Rommelfanger said in an interview. "Some people will cite studies from mice — that there's something called an implanted memory in the brain — (but that’s) nothing like we've been able to do in humans and may ever be able to do."

Rommelfanger said many of the prognostications about the potential dangers of commercialized neurotechnology stem from future-of-work conversations and the digital gaming world.

"All of these things would rely on … a single type of device, which is something you wear on your head and measures electrical activity,” Rommelfanger said. These devices "don't actually measure brain activity but instead measure muscle activity. Some people may have clinical-grade devices based on (electroencephalography) measurements that might be able to detect more, but they're really limited to what they can detect and some overclaim what they can detect."

Columbia University Biological Sciences and Neuroscience Professor and Neurotechnology Center Director Rafael Yuste is also chairman and co-founder of the Neurorights Foundation. He said his colleagues at Columbia’s Neurotechnology Center have conducted experiments with lab mice using "different types of lasers to imprint patterns of activity in the visual part of the brain."

The nature of the experiments, he said, is to help advance research into therapies for mental illnesses like schizophrenia.

Yuste maintains that, given the trajectory of neurotechnology advancements, it is theoretically possible to alter one’s brain activity. Consumer neurotechnology products developed without proper oversight, he warned, could result in a future where individuals’ brain data is bought and sold like the patient health data users enter into health apps outside the reach of HIPAA.

"Let's say a patient is paralyzed and they cannot talk. There’s now a few cases where, with electrodes implanted into the speech area of the brain, you can actually decode what they're thinking," Yuste said in an interview. “Now the problem is the non-implantable devices, the wearables, because wearables are considered consumer electronics and they're not regulated.”

Yuste compared the neurotechnology landscape, and the push to secure neurorights against future technologies that could potentially manipulate thought, with today's digital privacy field. He said neurorights advocates see their work as proactive, while the privacy field has been reactive to the erosion of privacy brought on by the internet age.

"We think that from the get-go … the brain should be considered as a special organ and have protection at the level of basic human rights," Yuste said. "Brain data, it's not your typical data because it reflects mental activity, and mental activities are essentially who you are. The best way to protect it is not to wait for technology to happen and then try to mend it, like with (EU General Data Protection Regulation), but to do it in reverse and define the rights of people."

Regulatory considerations

Despite some disagreement over the capacity of neurotechnology products to produce negative outcomes if left unchecked, a common thread among those studying these advancements is the need for both hard legislative action and soft law procedures to regulate the advancement of neurotechnology and the sensitive brain data it produces.

Rommelfanger said while creating hard laws to uphold neurorights can be arduous, policymakers in international, national and local jurisdictions can patch potential loopholes where brain data could be left unprotected. For example, in the U.S., data generated through the use of various neurotechnologies may or may not be protected depending on the state.

"You have to think about matching the instrument with the action you're trying to take. A lot of human rights laws take a long time to form, but we have technology that needs attention," Rommelfanger said.

"What we need to do actually is look for gaps in existing regulation and see if we can fill some of those. We’re not as worried about the brain data itself being protected because there are a lot of biometric data laws, but the bigger concern in neurotechnology gaps are the inferences derived from the brain data. These things are not protected. And what that means is there's a potential for discrimination," she continued.

The Future of Privacy Forum CEO Jules Polonetsky, CIPP/US, said any regulations protecting brain data should clearly delineate what constitutes personal or biometric data. He said a hypothetical government-deployed artificial intelligence system that could draw inferences from individuals’ inner political thoughts, and serve as the basis to exclude them from a democratic process, would be a bridge too far.

“Data protection is one method of gaining some control… but it does lever on the notion of whether or not the data originates in a way that is personally identifiable,” Polonetsky said during the LinkedIn Live with Duke's Farahany. “On one hand, I want to look to data protection to be the first place to build on the fact that it probably already covers some substantial part of (neurotechnology) … but I do think this is an area where we need to pull the prism back and actually focus on the rights and freedoms we want to protect, whether or not they originate from personal protection-covered data.”

Neurorights in practice

While conversations around neurotechnology, rights and ethics continue to evolve, one country has put upholding individuals’ neurorights at the forefront of its domestic agenda.

In October 2021, lawmakers in both chambers of Chile's Congress unanimously passed a constitutional amendment mandating “scientific and technological development will be at the service of people and will be carried out with respect for life and physical and mental integrity. The law will regulate the requirements, conditions and restrictions for its use by people, and must especially protect brain activity, as well as the information from it.”

Additionally, Chile passed a stand-alone law in September 2021 “establishing the rights to personal identity, free will and mental privacy” in an effort “to legislate on neurotechnology that can manipulate one's mind,” according to the Neurorights Foundation. The foundation helped Chilean lawmakers craft the country’s “neuroprotection agenda” along with the Pontificia Universidad Catolica in Santiago.

Yuste, the Neurorights Foundation’s co-founder, said it “wasn’t a complete coincidence” Chile was first to embark on an effort to codify protections for brain data, in part because of the country's repression under the 17-year dictatorship of Augusto Pinochet, which ended in 1990.

"Chileans, because of their tragic history, are very sensitize about human rights and they want to see themselves as the forefront of human rights protection in the world," Yuste said.

"They're very concerned also about the impact of medicine and technology on society. The fact that they have an entire committee of the Senate in Chile, called the Committee of the Future, and took (neurorights protections) under their wing by sponsoring this constitutional amendment and the new protection bill demonstrates their commitment."

However, much of what makes Chile a flashpoint in the neurorights movement also presents the larger neuroscience and privacy communities with more questions than answers in defining the democratic, ethical and medical boundaries of brain data within the wider consumer privacy debate around the world.

Rommelfanger said Chile’s experience enshrining neurorights protections in law created unintended consequences that could complicate efforts to use cutting-edge neurotechnology to treat individuals who stand to benefit from its advances.

"We learned, surprisingly, digital rights advocacy groups were not in support of this new neurorights bill, and within the constitutional changes, we found out clinicians said, 'With this broad brush that you've used to clearly articulate what neural rights are in a legal document, you might interfere with our ability to actually treat patients,'" Rommelfanger said.

"Actually, what we do need to do is further operationalize existing rights and tailor them to thinking about specific considerations for neurotechnology," she added.