How Advanced Is Mind-Reading Technology?

I don’t know how this topic hasn’t come up more often in my conversations. I mean, we all know about fMRI and EEG for studying brain activity, but do we realize these are the first steps toward mind-reading? And do we realize that scientists are actively developing mind-reading technology for a good cause: to help people who can’t speak communicate, and people with paralysis move prosthetic limbs? Such advanced technology sounds far-fetched, but we’re starting to learn that it’s within the realm of possibility. And suddenly, the one thing you believed would always be private, your mind, becomes unfathomably precious and fragile.

Stephen Hawking has A.L.S. He currently communicates using facial-recognition technology: a twitch of his cheek or eyebrow stops a cursor moving across an on-screen keyboard so he can select letters and spell words (the system also has a word-prediction algorithm, so he doesn’t have to painstakingly choose every letter). But he sometimes wears a headband containing an in-development computer chip called the iBrain, which reads his brain waves and learns which signals correspond to particular letters, words, or actions. Its developers at NeuroVigil hope that one day it will be able to read the minds of Hawking and others, letting them speak efficiently and expressively.

BrainGate is another research team endeavoring to decipher brain signals. They’re working on intracortical brain-computer interfaces, which aim to permit brain control of, among other things, a cursor on a computer screen. If perfected, this could replace Stephen Hawking’s current method of communication. But they have another interesting technology in development, one they hope will someday allow people to control prosthetic limbs as naturally as real ones, through a direct link to the motor-control region of the brain. BrainGate researchers state on their website: “Using a baby aspirin-sized array of electrodes implanted into the brain, early research from the BrainGate team has shown that the neural signals associated with the intent to move a limb can be ‘decoded’ by a computer in real-time and used to operate external devices.”

In this endeavor, the researchers have already enjoyed remarkable success: two stroke victims have controlled a robotic arm using only their brains. Participant Cathy, paralyzed for 15 years before the trial, used the arm to raise a bottle of coffee to her lips and drink. But John Donoghue, the leader of the BrainGate2 clinical trial, has emphasized that the technology is far from ready for everyday use: “Movements right now are too slow and inaccurate — we need to improve decoding algorithms.”

It seems that a company called Battelle, in collaboration with researchers at Ohio State University, has gotten even closer. A quadriplegic named Ian Burkhart is the first person to use Neurobridge, a device that reconnects the brain to the muscles without going through the spinal cord. This happened in April of this year, guys. This is the future. When I first read about it, it sounded like science fiction. We’re here already?? Science has done it??? We’re curing paralysis???? It’s real, but don’t be misled: the device doesn’t communicate with the muscles internally. An article posted on the Ohio State University Wexner Medical Center website describes it accurately: “The tiny chip interprets brain signals and sends them to a computer, which recodes and sends them to the high-definition electrode stimulation sleeve that stimulates the proper muscles to execute [Ian’s] desired movements.” Maybe one day this can be made to work internally. But plain and simple, Neurobridge’s developers have restored hand movement, and hope, to a guy who had been paralyzed for four years by a diving accident. That’s no small deal.
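For the programmers in the audience, the pipeline that quote describes (read a brain signal, decode the intended movement, translate it into a stimulation pattern for the sleeve) can be caricatured in a few lines of Python. To be clear, everything below is invented for illustration — the intent names, the muscle mapping, and the pick-the-strongest-channel “decoder” are my own toy stand-ins, not Battelle’s actual algorithms, which use trained machine-learning decoders over many electrode channels.

```python
# Toy sketch of a Neurobridge-style pipeline. All names and mappings here
# are hypothetical illustrations, not the real system.

# Hypothetical mapping from a decoded intent to the forearm muscles
# the stimulation sleeve would activate.
INTENT_TO_MUSCLES = {
    "open_hand": ["extensor_digitorum"],
    "close_hand": ["flexor_digitorum"],
    "rotate_wrist": ["pronator_teres", "supinator"],
}

def decode_intent(signal):
    """Toy decoder: pick the intent whose channel amplitude is strongest.

    `signal` maps a channel name to an amplitude. Real decoders work on
    raw multi-electrode recordings with trained models, not labeled channels.
    """
    return max(signal, key=signal.get)

def stimulation_pattern(intent):
    """Translate a decoded intent into the muscles the sleeve should stimulate."""
    return INTENT_TO_MUSCLES.get(intent, [])

# Simulated reading: the "close hand" channel dominates.
signal = {"open_hand": 0.2, "close_hand": 0.9, "rotate_wrist": 0.1}
intent = decode_intent(signal)
print(intent, stimulation_pattern(intent))  # close_hand ['flexor_digitorum']
```

The point of the sketch is just the shape of the system: the hard part (and the part Donoghue says still needs work) is the decoding step in the middle.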

As you can see, this technology has amazing potential to give many people their lives back. But can you think of some possible negative effects as well? Next time, we’ll discuss the scary implications. If you want to read ahead, here’s a paper by Yale researchers who’ve reconstructed imperfect but impressively recognizable facial images from brain scans of people viewing photographs: Neural portraits of perception: Reconstructing face images from evoked brain activity.

Discussion Topic: These are some pros; brainstorm the cons. Are you excited, scared, or both?

