Next Wild West: Monetizing Mental Data

Some brain–computer interfaces (BCIs) can record not only conscious thoughts but also preconscious impulses. Most BCIs are connected to the brain’s motor cortex, the part of the brain that initiates and controls voluntary movements by sending signals to the body’s muscles. But some people have volunteered to have an extra interface implanted in their posterior parietal cortex, a brain region associated with reasoning, attention and planning… The ability of these devices to access aspects of a person’s innermost life, including preconscious thought, raises the stakes on concerns about how to keep neural data private. It also poses ethical questions about how neurotechnologies might shape people’s thoughts and actions — especially when paired with artificial intelligence…

Consumer neurotech products capture less-sophisticated data than implanted BCIs do. Unlike implanted BCIs, which rely on the firings of specific collections of neurons, most consumer products rely on electroencephalography (EEG). This measures ripples of electrical activity that arise from the averaged firing of huge neuronal populations and are detectable on the scalp. Rather than being created to capture the best recording possible, consumer devices are designed to be stylish (such as in sleek headbands) or unobtrusive (with electrodes hidden inside headphones or headsets for augmented or virtual reality).

Still, EEG can reveal overall brain states, such as alertness, focus, tiredness and anxiety levels. Companies already offer headsets and software that give customers real-time scores relating to these states, with the intention of helping them to improve their sports performance, meditate more effectively or become more productive, for example. AI has helped to turn noisy signals from suboptimal recording systems into reliable data, explains Ramses Alcaide, chief executive of Neurable, a neurotech company in Boston, Massachusetts, that specializes in EEG signal processing and sells a headphone-based headset for this purpose…

With regard to EEG, “There’s a wild west when it comes to the regulatory standards”… A 2024 analysis of the data policies of 30 consumer neurotech companies by the Neurorights Foundation, a non-profit organization in New York City, showed that nearly all had complete control over the data users provided. That means most firms can use the information as they please, including selling it.

The government of Chile and the legislators of four US states have passed laws that give direct recordings of any form of nerve activity protected status. But ethicists fear that such laws are insufficient because they focus on the raw data and not on the inferences that companies can make by combining neural information with parallel streams of digital data. Inferences about a person’s mental health, say, or their political allegiances could still be sold to third parties and used to discriminate against or manipulate a person.

“The data economy, in my view, is already quite privacy-violating and cognitive-liberty-violating,” Ienca says. Adding neural data, he says, “is like giving steroids to the existing data economy”.

Excerpt from Liam Drew, Mind-reading devices can now predict preconscious thoughts: is it time to worry?, Nature, Nov. 19, 2025
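
To make the EEG-based “brain state” scoring described in the excerpt above a little more concrete, here is a minimal sketch of how a consumer-style focus score might be computed from a single scalp channel. It is an illustration only, not Neurable’s (or any vendor’s) actual method: the sampling rate, the alpha and beta band edges and the beta-to-alpha ratio heuristic are all assumptions, and real products add heavy artefact removal and machine-learning classification on top of anything this simple.

# Illustrative sketch only: a band-power "focus" score from one EEG channel.
# The sampling rate, band edges and the beta/(alpha+beta) heuristic are
# assumptions for demonstration, not any company's actual pipeline.
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz

def band_power(freqs, psd, lo, hi):
    """Integrate the power spectral density between lo and hi Hz."""
    mask = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[mask], freqs[mask])

def focus_score(eeg_channel):
    """Crude focus proxy: beta-band power relative to alpha-band power."""
    freqs, psd = welch(eeg_channel, fs=FS, nperseg=FS * 2)
    alpha = band_power(freqs, psd, 8, 13)   # alpha: relaxed wakefulness
    beta = band_power(freqs, psd, 13, 30)   # beta: active concentration
    return beta / (alpha + beta)            # 0..1, higher = "more focused"

# Synthetic noise standing in for a 10-second recording.
rng = np.random.default_rng(0)
fake_eeg = rng.normal(size=FS * 10)
print(f"focus score: {focus_score(fake_eeg):.2f}")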

The Right to Mental Privacy: How AI Can Read You Like a Book

A technique called ‘mind captioning’, described in a scientific paper published on November 5, 2025, generates descriptive sentences of what a person is seeing or picturing in their mind using scans of their brain activity. It is based on 1) artificial intelligence models trained on the text captions of thousands of videos, and 2) brain scans of people watching those videos. The technique could help those with language difficulties to communicate better… But it raises concerns about mental privacy…

Excerpt from Max Kozlov, ‘Mind-captioning’ AI decodes brain activity to turn thoughts into text, Nature, Nov. 5, 2025
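
The published ‘mind captioning’ method is far more elaborate, but the basic retrieval idea behind decoding text from brain activity can be caricatured in a few lines: learn a mapping from brain-activity features to a text-embedding space, then return the stored caption whose embedding best matches the prediction. Everything in the sketch below (the feature dimensions, the ridge-regression mapping, the cosine-similarity lookup and the synthetic stand-in data) is an illustrative assumption, not the paper’s model.

# Conceptual caricature of caption retrieval from brain-activity features.
# Dimensions, the ridge-regression mapping and the cosine-similarity lookup
# are illustrative assumptions; the published work is far richer.
import numpy as np

rng = np.random.default_rng(1)
n_train, brain_dim, text_dim = 500, 1000, 128

# Stand-ins for (brain-scan features, caption embeddings) from training videos.
X_train = rng.normal(size=(n_train, brain_dim))
Y_train = rng.normal(size=(n_train, text_dim))
captions = [f"caption #{i}" for i in range(n_train)]

# Ridge regression: W maps brain features into the caption-embedding space.
lam = 10.0
W = np.linalg.solve(X_train.T @ X_train + lam * np.eye(brain_dim),
                    X_train.T @ Y_train)

def describe(brain_features):
    """Predict a text embedding, then return the nearest known caption."""
    pred = brain_features @ W
    sims = (Y_train @ pred) / (np.linalg.norm(Y_train, axis=1)
                               * np.linalg.norm(pred) + 1e-9)
    return captions[int(np.argmax(sims))]

print(describe(rng.normal(size=brain_dim)))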

Password Prevents Spilling Out Private Thoughts

A brain–computer interface (BCI) can decipher the imagined sentences of people who have conditions that interfere with speech — and it comes with password protection to avoid revealing private thoughts. The system begins decoding users’ internal speech only after they think of a specific keyword. This internally spoken “keyword” can enable a user to “lock” and “unlock” the BCI to prevent the broadcasting of their private thoughts or spontaneous ‘self-talk’.

Excerpt from Gemma Conroy, A mind-reading brain implant that comes with password protection, Nature, Aug. 14, 2025
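
The gating behaviour described in the excerpt can be pictured as a tiny state machine: decoded words are discarded until the keyword is “thought”, and broadcasting stops again when it is thought a second time. In the toy sketch below the keyword, the string interface and the class name are hypothetical stand-ins; the real implant gates on a classifier over neural activity, not on text.

# Toy state machine for a keyword-gated inner-speech decoder. The keyword,
# the string interface and the class name are hypothetical; a real implant
# gates on a classifier over neural activity, not on text.
from dataclasses import dataclass, field

@dataclass
class GatedDecoder:
    keyword: str = "open-sesame"  # hypothetical unlock phrase
    unlocked: bool = False
    transcript: list[str] = field(default_factory=list)

    def on_decoded_word(self, word: str) -> None:
        """Handle one word emitted by the inner-speech decoder."""
        if word == self.keyword:
            self.unlocked = not self.unlocked  # toggle; broadcast nothing
            return
        if self.unlocked:
            self.transcript.append(word)       # only unlocked speech is output
        # While locked, decoded words (private self-talk) are dropped.

bci = GatedDecoder()
for w in ["dont", "say", "this", "open-sesame", "hello", "world",
          "open-sesame", "secret"]:
    bci.on_decoded_word(w)
print(bci.transcript)  # ['hello', 'world']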

Mass-Market Brain Manipulation and Human Rights

Scientific advances are rapidly making science-fiction concepts such as mind-reading a reality — and raising thorny questions for ethicists, who are considering how to regulate brain-reading techniques to protect human rights such as privacy.

On 13 July 2023, neuroscientists, ethicists and government ministers discussed the topic at a Paris meeting organized by UNESCO, the United Nations scientific and cultural agency. Delegates plotted the next steps in governing such ‘neurotechnologies’ — techniques and devices that directly interact with the brain to monitor or change its activity. The technologies often use electrical or imaging techniques, and run the gamut from medically approved devices, such as brain implants for treating Parkinson’s disease, to commercial products such as wearables used in virtual reality (VR) to gather brain data or to allow users to control software… Neurotechnology is now a US$33 billion industry.

One area in need of regulation is the potential for neurotechnologies to be used to profile individuals and, in the Orwellian extreme, to manipulate people’s thoughts and behaviour. Mass-market brain-monitoring devices would be a powerful addition to a digital world in which corporate and political actors already use personal data for political or commercial gain.

Commercial devices are of more pressing concern to ethicists. Companies from start-ups to tech giants are developing wearable devices for widespread use that include headsets, earbuds and wristbands that record different forms of neural activity — and will give manufacturers access to that information.

The privacy of this data is a key issue. Rafael Yuste, a neuroscientist at Columbia University in New York City, told the meeting that an unpublished analysis by the Neurorights Foundation, which he co-founded, found that 18 companies offering consumer neurotechnologies have terms and conditions that require users to give the company ownership of their brain data. All but one of those firms reserve the right to share that data with third parties. “I would describe this as predatory,” Yuste says. “It reflects the lack of regulation.”…Another theme at the meeting was how the ability to record and manipulate neural activity challenges existing human rights. Some speakers argued that existing human rights — such as the right to privacy — cover this innovation, whereas others think changes are needed.

Yuste and his colleagues propose five main neurorights: the right to mental privacy; protection against personality-changing manipulations; protected free will and decision-making; fair access to mental augmentation; and protection from biases in the algorithms that are central to neurotechnology.

Excerpt from Liam Drew, Mind-reading machines are coming — how can we keep them in check?, Nature, July 24, 2023