New research holds promise for a noninvasive brain-computer interface that allows mental control over computers and prosthetics
By Katie Moisse
Past efforts to build a brain-computer interface (BCI) that could animate an artificial limb relied on electrodes inserted directly into the brain. The surgery required to implant the probes, and the chance that the implants would not stay in place, made the approach risky.
The alternative—recording neural signals from outside the brain—has its own set of challenges. "It has been thought for quite some time that it wasn't possible to extract information about human movement using electroencephalography," or EEG, says neuroscientist and electrical engineer Jose Contreras-Vidal. In trying to record the brain's electrical activity off the scalp, he adds, "people assumed that the signal-to-noise ratio and the information content of these signals were limited."
Evidently, that is not the case. In the March issue of The Journal of Neuroscience, Contreras-Vidal and his team from the bioengineering and kinesiology departments at the University of Maryland, College Park, show that the noisy brain waves recorded using noninvasive EEG can be mathematically decoded into meaningful information about complex human movements. "This means we can use a noninvasive method to develop the next generation of brain–computer interface machines," Contreras-Vidal says. "It can expand considerably the range of clinical and rehabilitative applications."
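The team's actual analysis is more elaborate, but the core idea — fitting a linear model that maps a short window of low-frequency scalp signals onto hand velocity — can be sketched with synthetic data. Everything below (channel counts, lag window, signal sizes) is invented for illustration, not taken from the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the experiment: smoothed, low-frequency "EEG" from
# a handful of scalp electrodes, and a hand-velocity trace that is (by
# construction) a linear mixture of those channels plus noise.
n_samples, n_channels, n_lags = 2000, 8, 3
eeg = rng.standard_normal((n_samples, n_channels))
true_weights = rng.standard_normal(n_channels * n_lags + 1)

def lagged_design(eeg, n_lags):
    """Stack the current sample and the previous n_lags - 1 samples of every
    channel into one feature row, plus a constant column (the intercept)."""
    n = eeg.shape[0] - n_lags + 1
    lagged = np.hstack([eeg[i:i + n] for i in range(n_lags)])
    return np.hstack([np.ones((n, 1)), lagged])

X = lagged_design(eeg, n_lags)
velocity = X @ true_weights + 0.1 * rng.standard_normal(X.shape[0])

# Fit the decoder on the first half of the recording, test on the second half.
split = X.shape[0] // 2
w, *_ = np.linalg.lstsq(X[:split], velocity[:split], rcond=None)
predicted = X[split:] @ w
r = np.corrcoef(predicted, velocity[split:])[0, 1]
print(f"decoding accuracy (correlation): {r:.2f}")
```

The correlation between predicted and actual velocity on held-out data is the kind of figure of merit such studies report; here it is high only because the toy signal was built to be linearly decodable.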
Instead of undergoing brain surgery, users would wear an electrode-covered head cap that records the electrical impulses from neurons—the only mess involved is the clear gel applied to the head to enhance conduction. Some patients have already used the caps to communicate via word processors. (When the system recognizes the brain's response to a letter flashing on the screen, it selects that letter.) The next step is to put the decoded movement information to work. "We hope to show that a person with a stroke or an amputee would be able to control an assistive device," Contreras-Vidal says. He already has healthy volunteers testing two different setups: One has them moving a computer cursor on a screen; the other has them controlling an artificial hand.
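The flash-to-select scheme works because the brain produces a slightly larger evoked response to the flash of the letter the user is attending to; averaging over repeated flashes beats down the noise until that letter stands out. A minimal sketch, with entirely made-up amplitudes and noise levels:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical speller round: each letter flashes repeatedly, and the flash of
# the attended letter evokes an extra deflection on top of a noisy baseline.
letters = list("ABCDEF")
target = "D"          # the letter the user is attending to
n_flashes = 25

def evoked_amplitude(letter):
    """Simulated single-flash response: noisy baseline, plus an extra
    deflection when the flashed letter is the attended one."""
    bump = 2.0 if letter == target else 0.0
    return bump + rng.normal(0.0, 0.8)

# Average the responses over repeated flashes, then select the letter with
# the largest mean evoked response.
scores = {
    letter: np.mean([evoked_amplitude(letter) for _ in range(n_flashes)])
    for letter in letters
}
chosen = max(scores, key=scores.get)
print(f"selected letter: {chosen}")
```

Real spellers must detect a much subtler response against far noisier recordings, which is why communicating this way is slow — and why decoding continuous movement, as in the Maryland study, is a harder problem still.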
Contreras-Vidal also hopes to integrate sensory feedback into the system to optimize the user's control over the device. "In all the studies so far people have used visual feedback to close the loop between the user and the machine," he says. "We think it's important to use other types of feedback, too, because vision is a slow signal" compared with the sensory signal a person would get from an intact limb.
Whether such a system would work for patients with longstanding nerve damage is unknown. Such patients haven't activated their movement-generating neurons or received the related sensory feedback for many years and could generate abnormal brain wave–based movement information. "We're starting to look at patient populations to answer that question," Contreras-Vidal says, naming stroke patients and below-elbow amputees as the first test subjects. "We know the brain is highly redundant, so we think that even if there's a deletion in the brain, we might be able to decode from another place. One advantage to using electroencephalography is access to the whole brain, not just a specific area."