'New human rights' proposed to fend off thought theft and brain control. Hello, 1984!
Corporate giants already possess user-generated data that can be transformed into an in-depth analysis of you
During the last century, science made gigantic leaps in understanding how the human brain functions. Unfortunately, many of the most significant experiments and studies were carried out on the Nazis’ captives, within the dark halls of concentration camps. Cutting into people’s brains and exposing them to highly inhumane stimuli taught us a lot about how we think, memorize and perceive.
Thankfully, the era when skulls were opened without anesthesia is long gone, and we now use far less physically invasive tech to learn about human behaviour, traits and brain rhythms. Details about neuronal activity are now collected (semi-)directly through technologies that can read and monitor brainwaves, and in-depth analysis is then done by complex algorithms. All that 'chatter' you hear about services like Facebook and Twitter, or devices such as some of Bose's headphones, allegedly recording copious amounts of personal data is not just chatter, but a multi-billion dollar business. Nowadays, you barely need a brain scanner attached to your skull for someone to extract cues about what you are thinking or how you behave. While some of this evolution in research methods is very beneficial in medical and clinical contexts, it carries immeasurable privacy risks and human rights concerns.
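To make that a bit more concrete, here is a toy Python sketch of the kind of signal analysis consumer brainwave gadgets rely on: estimating how strong the relaxed 'alpha' rhythm is in a short EEG recording. It is purely illustrative – the fake signal, sampling rate and band edges are our own assumptions, not anything taken from the devices or the paper discussed below.

import numpy as np

def band_power(signal, fs, low_hz, high_hz):
    """Return the share of the signal's power that falls in a frequency band."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2        # power spectrum via the FFT
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)   # matching frequency axis in Hz
    band = (freqs >= low_hz) & (freqs <= high_hz)      # pick out the band of interest
    return spectrum[band].sum() / spectrum.sum()       # relative band power

# Hypothetical one-second recording: a 10 Hz 'alpha' wave buried in noise.
fs = 256                                               # assumed samples per second
t = np.arange(fs) / fs
eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(fs)

alpha = band_power(eeg, fs, 8, 12)                     # alpha band is roughly 8-12 Hz
print(f"Relative alpha power: {alpha:.2f}")            # higher values suggest a relaxed, eyes-closed state

Real products chain many such features into the 'complex algorithms' mentioned above, but the principle – raw brainwaves in, behavioural inferences out – is the same.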
Researcher duo Marcello Ienca and Roberto Andorno, a neuroethicist and a human rights lawyer respectively, have just published a research paper on “new human rights in the age of neuroscience and neurotechnology”. The work appears in the Life Sciences, Society and Policy journal and goes into extensive detail about past and contemporary mechanisms and dangers of brainwave monitoring.
Dangers
While a reasonable guess would hold that governments will put neuro-monitoring tech to virtuous and humane uses, an educated guess would hold something much different. Ienca and Andorno give various examples of past misuse of brain stimulants and brain-altering techniques, such as LSD and hypnosis, by armies and medical institutions. Basic freedoms, such as freedom of thought and individual privacy, would need to be protected by a shift in the legal and ethical framework on human rights.
In-depth data on your behaviour in seemingly innocent and common apps, such as Facebook or Spotify, is not just zeroes and ones on a server. Information about every click and movement on the web translates into a profile of what type of person you are, and can then be further augmented into a 'skeleton' of your brain's neural networks. Thanks, Zuckerberg.
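As a purely hypothetical illustration – not how Facebook, Spotify or anyone else actually does it – here is how a few lines of Python could turn a raw click log into a crude behavioural profile. Every event, category and timestamp below is made up for the sake of the example.

from collections import Counter

# Hypothetical click log: (timestamp, page the user interacted with)
click_log = [
    ("2017-04-26 09:01", "news/politics"),
    ("2017-04-26 09:03", "news/politics"),
    ("2017-04-26 09:10", "music/metal"),
    ("2017-04-26 23:40", "news/politics"),
]

# Count interactions per topic to build a simple interest profile.
topics = Counter(page.split("/")[1] for _, page in click_log)
total = sum(topics.values())
profile = {topic: count / total for topic, count in topics.items()}

# Even the timing of clicks is a behavioural signal (fixed-width HH:MM strings compare correctly).
night_owl = any(ts.split()[1] >= "22:00" for ts, _ in click_log)

print(profile)    # {'politics': 0.75, 'metal': 0.25} – a rough sketch of what type of person you are
print(night_owl)  # True – this user is active late at night

The point is not the code itself, but how little of it is needed before patterns start looking like a personality.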
New human rights for ‘mind autonomy’
In their work, the two researchers put forward a comprehensive proposal for a new legal framework to protect the autonomy of brain data.
The first proposition relates to ‘cognitive liberty’ and holds that everyone should be allowed “to alter one’s mental states with the help of neurotools, as well as to refuse to do so”. The right demands that the individual be the only person who generates and takes part in their own thinking process, free of unconsented external influence. The researchers see it as the “neurocognitive substrate of all other liberties”, and at present there is no specific legal basis for protecting such a right.
The research duo’s second suggestion refers to a right to ‘mental privacy’. While your browsing privacy is already somewhat tampered with through cookies on websites, try imagining what ‘brain cookies’ would do to basic personal data. If you think this rule is too far out there, let us just remind you that both Android and iOS devices have already been caught up in various personal data controversies.
‘Mental integrity’ is the third human right proposed in the paper. For Harry Potter fans, this would mean protection against the Imperius Curse, where someone else takes full control of your actions and experiences. It is not wands but chips that have been in the making since WWII, promising remote control of a person’s actions without the ‘implantee’ ever suspecting a thing.
The last right suggested by Ienca and Andorno is more conceptual than concrete, and concerns ‘mental continuity’. With technology able to change how your brain perceives and reacts to things, the researchers are keen that the essence of the individual be protected as something stable. How we experience ourselves can already shift under simple deep brain stimulation during depression treatment, and changes in personality may become even more likely as neuro-sensitive technology advances. We wonder whether Facebook reshaping how we all communicate and feel about ourselves could fall within this category.
Medical and military research has historically involved abuse of brain-active equipment and substances, and there are currently very few legal rules on the unconsented gathering of, or access to, such highly private information. Ienca and Andorno allege that “the indiscriminate leakage of brain data across the infosphere” is only set to expand with the evolution of ‘brain-enabled’ devices and tech.
Back in 2015, the BBC even unveiled a mind-controlled TV interface that lets you pick what to watch using only your thoughts.
So…what do we do?
Nothing, we just wait and see what happens. Passing such a transformative piece of legislation through any parliament or congress would take years. And even though brain-enabled tech has made huge leaps in the last 50 years, especially in the last 10, it is still far from mainstream production: it remains largely confined to controlled research and medical environments. Still, the time when all our thoughts are linkable to a cloud, or readable through a device, seems to be approaching – steadily and surely. Be aware, too, that everything you say through WhatsApp, for example, can potentially be used to tap into your brain without the need for a neuro scanner. The buyers of cosmic amounts of Facebook and Twitter user-generated data don’t need it for their in-depth lexicons, you know…
Source: Life Sciences, Society and Policy journal, via Gizmodo