A Science Nation story published today describes a public-private partnership, funded in part by the National Science Foundation (NSF), that is attempting to link mind and machine, ultimately to improve the lives of people with “locked-in syndrome,” a condition in which people with normal cognitive function suffer severe paralysis, often following an injury or an illness such as Lou Gehrig’s disease.
From the Science Nation article (see a video after the jump!):
“Locked-in people are unable to move at all except possibly their eyes, and so they’re left with no means of communication but they are fully conscious,” says Boston University neuroscientist Frank Guenther.
Guenther works with the NSF’s Center of Excellence for Learning in Education, Science and Technology (CELEST)… Its purpose is to synthesize experimental, modeling, and technological approaches to research in order to understand how the brain learns as a whole system…
His team demonstrated two experiments… In one experiment, run by assistant research professor Jonathan Brumberg, a volunteer shows how she uses a speech synthesizer to make vowel sounds just by thinking about moving a hand or foot. She never moves her body or says anything.
“We use an EEG cap to read the brain signals coming from her brain through her scalp,” explains Brumberg, who tracks the brainwaves with a computer. “Depending on what body part she imagines moving, the cursor moves in different directions on the screen.” Brumberg explains that he is able to “translate those brain activities into audio signals that can be used to drive a voice synthesizer. We’ve mapped the ‘uw’ sound to a left hand movement, the ‘aa’ sound to right hand movement, and the ‘iy’ sound to a foot movement…”
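What Brumberg describes is a motor-imagery interface: a classifier decides which movement the user is imagining, and each decision triggers a vowel. Here is a minimal Python sketch of that last step. The movement-to-vowel table follows the quote above, but the formant frequencies and the crude two-sine synthesis are illustrative assumptions, not the actual Guenther/Brumberg synthesizer.

```python
import numpy as np

# Movement-to-vowel table from the quote above. The formant pairs (F1, F2)
# are rough textbook values for these vowels, NOT taken from the real system.
VOWEL_FORMANTS = {
    "left_hand":  ("uw", (300, 870)),    # "uw" as in "boot"
    "right_hand": ("aa", (730, 1090)),   # "aa" as in "father"
    "foot":       ("iy", (270, 2290)),   # "iy" as in "beet"
}

def synthesize_vowel(imagined_movement, duration=0.5, rate=16000):
    """Return (vowel_label, audio) for a decoded imagined movement.

    The audio is a crude two-formant approximation: sinusoids at the vowel's
    formant frequencies, amplitude-modulated by a 120 Hz "voicing" envelope.
    A real speech synthesizer would use a proper source-filter model.
    """
    vowel, (f1, f2) = VOWEL_FORMANTS[imagined_movement]
    t = np.arange(int(duration * rate)) / rate
    voicing = 0.5 * (1 + np.sin(2 * np.pi * 120 * t))  # stand-in glottal pulse
    audio = voicing * (np.sin(2 * np.pi * f1 * t)
                       + 0.5 * np.sin(2 * np.pi * f2 * t))
    return vowel, audio / np.max(np.abs(audio))

# Upstream, an EEG classifier would supply the imagined movement; it is
# hard-coded here for illustration.
vowel, audio = synthesize_vowel("left_hand")
print(vowel)  # -> "uw"
```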
Guenther says this technology holds great promise, and not just for locked-in patients. “We hope these technologies would be applied to people that have other communication disorders that cause them to be unable to speak,” he says. “This sort of thing would allow them to produce synthetic speech, which could be used to talk to the people around them and express their needs.”
In another experiment, graduate student Sean Lorenz takes a robot out for a spin using only brainwaves. The checkerboards on the sides of the screen flash at slightly different frequencies. To the naked eye, the differences are subtle. “But the neurons in his visual cortex start firing in synchrony with the checkerboard he’s looking at, and so we can pick up the frequency and, from that, determine which choice he was trying to make: left, right, forward or backward, for example,” explains Guenther.
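The flashing-checkerboard technique Guenther describes is what the BCI literature calls a steady-state visually evoked potential (SSVEP) interface: the flicker frequency of the attended checkerboard shows up as a peak in the EEG spectrum over visual cortex. A minimal sketch of the frequency-detection step follows; the flicker frequencies and sampling rate below are invented for illustration, since the article gives neither.

```python
import numpy as np

# Hypothetical flicker frequencies for the four on-screen checkerboards;
# the article does not report the actual values used in the demo.
COMMAND_FREQS = {"left": 8.0, "right": 10.0, "forward": 12.0, "backward": 15.0}

def decode_ssvep(eeg, rate=250.0):
    """Pick the command whose flicker frequency dominates the EEG spectrum.

    eeg: 1-D array of samples from an occipital channel. Because visual-cortex
    neurons fire in synchrony with the attended checkerboard, its flicker
    frequency appears as a spectral peak.
    """
    freqs = np.fft.rfftfreq(len(eeg), d=1.0 / rate)
    power = np.abs(np.fft.rfft(eeg)) ** 2
    # Score each command by the power at the bin nearest its frequency.
    scores = {cmd: power[np.argmin(np.abs(freqs - f))]
              for cmd, f in COMMAND_FREQS.items()}
    return max(scores, key=scores.get)

# Toy check: a noisy 12 Hz oscillation should decode as "forward".
rate = 250.0
t = np.arange(0, 4, 1 / rate)
eeg = np.sin(2 * np.pi * 12.0 * t) + 0.5 * np.random.randn(t.size)
print(decode_ssvep(eeg, rate))  # -> "forward" (with high probability)
```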
Check out a video of the mind-reading computer system below…
…and read all about it here.
And by the way, “advances in restorative brain-computer interfaces that are giving paralyzed individuals more effective ways to communicate, move, and interact with their environment” are the subject of a compelling article in this month’s Communications of the ACM (registration required).
(Contributed by Erwin Gianchandani, CCC Director)