In George Orwell's dystopian novel "1984," Big Brother watches everyone all the time, no matter what they do or where they are, and nothing is "your own except the few cubic centimeters inside your skull."

With the advent of neurotechnology in recent years, many have started wondering whether we may lose control even over those few cubic centimeters.

UNESCO's global online consultation on the first draft of the Recommendation on the Ethics of Neurotechnology allows anyone to contribute their perspective to the creation of a comprehensive international framework that addresses the challenges of neurotechnology while maximizing its benefits. The deadline for submitting feedback is 12 July.

Neurotechnology's ability to record and intervene in neural activity promises substantial scientific and clinical benefits, but it also raises important ethical, legal and societal risks, including potential impacts on freedom of thought, human agency, identity and autonomy. Information about a person's nervous system activity may allow inferences about their motor activities, perceptions, and cognitive and affective processes.

Protecting the privacy of data collected about the nervous system is therefore a prerequisite for addressing the risks associated with the use of neurotechnology.

A recent NeuroRights Foundation study examined privacy practices in the currently small but fast-growing consumer neurotechnology market. The study found that companies show significant gaps in compliance with international standards across access to information, data collection and storage, data sharing, user rights, and data safety and security.

For example, 60% of the companies surveyed do not provide consumers with information on how their neural data is managed or what rights they have concerning it, and more than 50% have policies with explicit provisions permitting data sharing.

As artificial intelligence algorithms and data-aggregation tools decode and analyze ever-larger collections of highly sensitive neurodata, compromises of personal neuroprivacy become more likely. Unregulated access to neurodata may enable a series of seriously harmful uses:

  • Unfair discrimination may take place when information about neurological characteristics and/or diseases is inferred from neurodata and inappropriately used by, for example, employers or insurers. While the collection and use of neurodata is often protected in medical contexts, regulation in the context of consumer goods, such as video games, education or self-monitoring gadgets, is still rare.
  • Infringement of mental privacy may occur through access to individuals' mental functions, for example memory, reasoning processes or emotions. A particular characteristic of neurotechnology is its potential to access these functions even if an individual has never expressed them externally and may never do so.
  • Mental integrity and autonomy, meaning one's ability to control one's own mental states and functions, may be violated. While psychoactive substances have long been used to alter mental states, neurotechnology (for example, optogenetics, transcranial direct current stimulation and transcranial magnetic stimulation) may, in the future, achieve the same with more precision and interfere with mental states with or without the individual's knowledge.

Recent research has reported the ability to reconstruct both seen images (with 90.7% accuracy) and imagined images (with 75.6% accuracy) from brain imaging. A variety of other tools have also used observations of neural signals to infer communicative acts. It is easy to imagine how these technologies could leave individuals completely vulnerable to accidental or deliberate misuse or misinterpretation of their alleged mental states.

Gaps in current regulatory frameworks allow for the unrestricted decoding and commercialization of neurodata. So far, only a handful of nations have approved, or are in the process of approving, laws protecting neural data.

In 2021, Chile amended its constitution to protect brain activity and data. Mexico and Brazil have also advanced processes for similar constitutional amendments, while other Latin American countries, as well as Spain, are pursuing a variety of neurorights bills. The state of Colorado recently expanded its definition of sensitive data to include neural data, and California and Minnesota are looking into enacting similar protections.

UNESCO's first draft of the recommendation on the ethics of neurotechnology, which is now out for comments, was developed in April 2024 by an ad hoc group of 24 experts and is set for potential adoption in 2025. It defines neurotechnology as "devices and procedures used to understand and/or influence, access, monitor, assess, emulate or modulate the structure and function of the nervous systems of human beings and other animals," although the recommendation itself applies only to humans.

The recommendation defines a set of values, principles and policy actions aimed at guiding "the development and use of neurotechnology in ways that are reliable and for the good of humanity, individuals, communities, societies (including the environment and ecosystems)." Policy actions are expected to support governments in the operationalization and implementation of the recommendation.

The adoption of a previous UNESCO recommendation, on the ethics of AI, by 193 member states has provided a solid basis for international collaboration and the future development of coherent legislation. This new recommendation aims to achieve the same level of support and reach for the ethical use of neurotechnology.

Claudia Roda is professor of computer science, director of the Master of Science in Human Rights and Data Science program and former dean at the American University of Paris. She is the UNESCO Chair in AI and Human Rights and serves on the Research Advisory Board of the International Association of Privacy Professionals.