On the Cusp of Brain-to-Brain Communication

January 3, 2016

You already know that we can run machines with our brainwaves. That’s been old news for almost a decade, ever since the first monkey fed himself using a robot arm and the power of positive thinking. Nowadays, even reports of human neuroprostheses barely raise an eyebrow.

Brain-computer interfaces have become commonplace in everything from prosthetic vision to video games (a lot of video games; Emotiv and NeuroSky are perhaps the best-known purveyors of Mind Control to the gaming crowd) to novelty cat ears that perk up on your head when you get horny.

But we’ve moved beyond merely thinking orders at machinery. Now we’re using that machinery to wire living brains together. Last year, a team of European neuroscientists headed by Carles Grau of the University of Barcelona reported a kind of – let’s call it mail-order telepathy – in which the recorded brainwaves of someone thinking a salutation in India were emailed, decoded and implanted into the brains of recipients in Spain and France (where they were perceived as flashes of light).
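The pipeline behind that experiment is conceptually simple: the message is reduced to a bit string, each bit is read out from the sender's EEG, and each bit is delivered to the receiver as a phosphene (flash = 1) or its absence (0). Here's a minimal sketch of that encode/deliver/decode loop in Python; the 5-bit letter code is an illustrative stand-in, not the study's actual cipher, and the EEG and TMS stages are of course stubbed out.

```python
# Sketch of the bit-level pipeline described above (assumptions:
# a toy 5-bit code for a-z; real EEG acquisition and TMS delivery
# are replaced by plain function boundaries).

CODE = {ch: format(i, "05b") for i, ch in enumerate("abcdefghijklmnopqrstuvwxyz")}
DECODE = {bits: ch for ch, bits in CODE.items()}

def encode_word(word):
    """Sender side: word -> bit string (one motor-imagery trial per bit)."""
    return "".join(CODE[ch] for ch in word.lower())

def deliver(bits):
    """Receiver side: each bit arrives as a phosphene event or nothing."""
    return ["flash" if b == "1" else "no flash" for b in bits]

def decode_bits(bits):
    """Reassemble the received 5-bit chunks into letters."""
    return "".join(DECODE[bits[i:i + 5]] for i in range(0, len(bits), 5))

bits = encode_word("hola")          # 20 bits for a 4-letter salutation
events = deliver(bits)              # what the recipient "sees"
assert decode_bits(bits) == "hola"  # round-trip recovers the word
```

The striking part isn't the coding scheme, which is trivial; it's that the channel on both ends is a living cortex.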

You might also remember breathless reports of a hive mind emerging from the depths of Duke University in North Carolina during the winter of 2013. Miguel Pais-Vieira and his colleagues had wired together the brains of two rats. Present a stimulus to one, and the other would press a lever. The headlines evoked images of one mind reaching into another, commandeering its motor systems in a fit of Alien Paw Syndrome.

Of course, the press goes overboard sometimes. Once you look past those headlines you notice that Reaction Rat had been pre-trained to press his lever whenever he felt a particular itch in his motor cortex (in exactly the same way you’d train him to respond to a flashing light, for example). There was no fused consciousness. It was a step forward, but you don’t get to claim membership in the Borg Collective just because a stimulus happens to tickle you from the inside.

And yet, more recently, Rajesh Rao (of the University of Washington’s Center for Sensorimotor Neural Engineering) reported what appears to be a real Alien Hand Network – and going Pais-Vieira one better, he built it out of people. Someone thinks a command; downstream, someone else responds by pushing a button without conscious intent. Now we’re getting somewhere.

There’s a machine in a lab in Berkeley, California, that can read the voxels right off your visual cortex and figure out what you’re looking at based solely on brain activity. One of its creators, Kendrick Kay, suggested back in 2008 that we’d eventually be able to read dreams (also, that we might want to take a closer look at certain privacy issues before that happened).

His best guess was that this might happen a few decades down the road – but it took only four years for a computer in a Japanese lab to predict the content of hypnagogic hallucinations (essentially, dreams without REM) at 60 per cent accuracy, based entirely on fMRI data.
