'New human rights' proposed to fend off thought theft and brain control. Hello, 1984!
Thankfully, the era when skulls were opened without anesthesia is long gone, and we now use far less physically invasive tech to learn about human behaviour, traits and brain rhythms. Details about neuronal activity are collected (semi-)directly through technologies that read and monitor brainwaves, and complex algorithms then produce an in-depth analysis of you. All that 'chatter' you hear about services like Facebook and Twitter, or devices such as some of Bose's headphones, allegedly recording copious amounts of personal data, is not just chatter, but a multi-billion dollar business. Nowadays, you barely need a brain scanner attached to your skull for someone to extract cues about what you are thinking or how you behave. While some of this evolution in research methods is very beneficial in medical and clinical contexts, it also carries immense privacy risks and human rights concerns.
The researcher duo Marcello Ienca and Roberto Andorno, a neuroethicist and a human rights lawyer, respectively, have just published a research paper on the “new human rights in the age of neuroscience and neurotechnology”. The work appears in the journal Life Sciences, Society and Policy and goes into extensive detail about past and contemporary mechanisms and dangers of brainwave monitoring.
A naive guess would hold that governments will put neuro-monitoring tech to virtuous and humane use; an educated guess holds something much different. Ienca and Andorno give various examples of past misuse of brain-altering substances and techniques, such as LSD and hypnosis, by armies and medical institutions. Basic freedoms, such as liberty of thought and individual privacy, would need to be protected by a shift in the legal and ethical framework on human rights.
Medical and military research has historically involved abuse of brain-active equipment and substances. There are currently very few legal rules on the unconsented gathering of, or access to, highly private information. Ienca and Andorno warn that “the indiscriminate leakage of brain data across the infosphere” is only set to expand with the evolution of ‘brain-enabled’ devices and tech.
New human rights for ‘mind autonomy’
In their work, the two researchers put forward a comprehensive proposal for a new legal framework to protect the autonomy of brain data.
The first proposition relates to ‘cognitive liberty’ and holds that everyone should be allowed “to alter one’s mental states with the help of neurotools, as well as to refuse to do so”. The right demands that the individual be the only person who generates and participates in their own thinking process, free from unconsented external influence. The researchers see it as the “neurocognitive substrate of all other liberties”, and there is currently no specific legal basis for protecting such a right.
The second right, to ‘mental privacy’, would shield brain data from being accessed or shared without the person’s consent. ‘Mental integrity’ is the third human right proposed in the research paper. For Harry Potter fans, this would mean protection against the Imperius Curse, by which someone else takes full control of your actions and experiences. Not wands, but chips that would allow remote control of a person’s actions, without the ‘implantee’s’ suspicion, have been in the making since WWII.
So…what do we do?
Nothing, really: we wait and see what happens. Passing such a transformative piece of legislation through any parliament or congress would take years. And even though brain-enabled tech has made huge leaps in the last 50 years, especially in the last 10, it is still far from mainstream production. It is largely confined to controlled research and medical environments, but the time when all our thoughts are linkable to a cloud, or readable through a device, seems to be approaching, steadily and surely. Be aware, though, that everything you say through WhatsApp, for example, can also potentially be used to tap into your mind, without the need for a neuro scanner. The buyers of cosmic amounts of Facebook or Twitter user-generated data aren’t collecting it just to enrich their lexicons, you know…