Mind-reading machines are now real, prising open yet another Pandora’s box for ethicists. As usual, there are promises of benefit and warnings of grave peril.
The bright side was front and centre at the Society for Neuroscience annual meeting in Washington DC in November 2017, in a research presentation led by Omid Sani of the University of Southern California.
Sani and his colleagues studied six people with epilepsy who had electrodes inserted into their brains to measure detailed electrical patterns. It is a common technique to help neurosurgeons find where seizures start.
The study asked the patients, who can remain alert during the procedure, to report their mood while the recordings were made. That allowed the researchers to link each patient's mood with their brainwave readings. Using sophisticated algorithms, the team claimed to predict patients' feelings from their brainwaves alone.
That, the researchers say, could drive a big shift in the treatment of mental illness.
Deep brain stimulation (DBS), where electrodes implanted in the brain give circuits a regular zap, has been successful in Parkinson’s disease. It is also being trialled in depression; but the results, according to a 2017 report in Lancet Psychiatry, are patchy.
Sani and colleagues suggest their discovery could bump up that success rate.
With standard DBS, the patient checks back with the neurologist, sometimes after weeks or months, to tweak the settings as needed. Sani and colleagues' findings could eliminate that feedback delay: the device would sense mood states as they change and automatically zap the brain to bring off-kilter feelings back to baseline.
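The closed-loop idea can be sketched in a few lines. This is a toy illustration only: the mood score, baseline, threshold and gain are all hypothetical placeholders, not details of the USC team's actual system.

```python
# Toy closed-loop stimulation controller. Each cycle: decode a mood
# score from brain recordings (not modelled here), compare it with a
# baseline, and stimulate only when the deviation exceeds a tolerance.

BASELINE = 0.0    # target decoded-mood score (hypothetical units)
TOLERANCE = 0.5   # drift allowed before the device intervenes
GAIN = 0.3        # fraction of the error corrected per cycle

def control_step(decoded_mood: float) -> float:
    """Return the stimulation adjustment for one sense-and-zap cycle.

    Zero means the decoded mood is within tolerance; a nonzero value
    is a corrective stimulation proportional to the deviation.
    """
    error = BASELINE - decoded_mood
    if abs(error) <= TOLERANCE:
        return 0.0
    return GAIN * error

# A mood decoded well below baseline triggers a corrective output,
# while a small drift is left alone.
print(control_step(-2.0))  # 0.6
print(control_step(0.2))   # 0.0
```

The point of the feedback loop is that corrections happen continuously, on the device, rather than at clinic visits weeks apart.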
There is, however, a dark side.
First up, the technique could put mental privacy on the line. The ability to decode mood states raises the spectre of an unscrupulous third party accessing your feelings. DBS data would be priceless for marketers wanting to mould your attitudes with news and advertising.
Then there is the possibility of doing away with negative feelings altogether. Sometimes it is healthy to feel sad – for example, when a loved one dies. Could DBS wipe out ‘good’ sadness as well?
It is worth noting that Sani’s research is funded by the Defense Advanced Research Projects Agency, the research arm of the US military. That is in part because DBS is a potential treatment for post-traumatic stress disorder. A worthy goal, but could it be abused to erase all of a soldier’s unease?
“We have a moral obligation to experience the atrocities of war,” says Adrian Carter, a neuroethicist at Monash University, Melbourne. It follows, says Carter, that we should question the ethics of damping combat distress in its entirety.
There are also deep empirical concerns around the mood-decoding project. Walter Glannon, a philosopher at the University of Calgary in Alberta, Canada, argues that brain recordings can never tell the whole story of how we feel.
“Mental activity is not reducible to brain activity,” he says.
While DBS might change the brain's electrical readout immediately, it can take time to alter feelings and thoughts. That delay, Glannon contends, muddies the link between a brain recording and a given mood.
Other researchers are nonetheless upping the decoding stakes.