Most of us don’t think twice when we extend our arms to hug a friend or push a shopping cart—our limbs work together seamlessly to follow our mental commands. For researchers designing brain-controlled prosthetic limbs for people, however, this coordinated arm movement is a daunting technical challenge. A new study showing that monkeys can move two virtual limbs with only their brain activity is a major step toward achieving that goal, scientists say.
The brain controls movement by sending electrical signals to our muscles through nerve cells. When limb-connecting nerve cells are damaged or a limb is amputated, the brain is still able to produce those motion-inducing signals, but the limb can't receive them or simply doesn’t exist. In recent years, scientists have worked to create devices called brain-machine interfaces (BMIs) that can pick up these interrupted electrical signals and control the movements of a computer cursor or a real or virtual prosthetic.
So far, the success of BMIs in humans has been largely limited to moving single body parts, such as a hand or an arm. Last year, for example, a woman paralyzed from the neck down for 10 years commanded a robotic arm to pick up and lift a piece of chocolate to her mouth just by thinking about it. But “no device will ever work for people unless it restores bimanual behaviors,” says neuroscientist Miguel Nicolelis at Duke University in Durham, North Carolina, senior author of the paper. “You need to use both arms and hands for the simplest tasks.”
In 2011, Nicolelis made waves by announcing on The Daily Show that he is developing a robotic, thought-controlled “exoskeleton” that will allow paralyzed people to walk again. Further raising the stakes, he pledged that the robotic body suit will enable a paralyzed person to kick a soccer ball during the opening ceremony of the 2014 Brazil World Cup. (Nicolelis is Brazilian, and his research is partly funded by the nation’s government.)
That feat will require decoding the complex neural signals that coordinate two legs as they walk together and keep a person upright. Now, by successfully training two monkeys, a male and a female, to control virtual arms using only their minds, Nicolelis’s team has moved closer to that goal. Before the experiment began, each monkey had electrodes implanted into its right and left brain hemispheres, which recorded the activity of up to 500 neurons acting together—the highest number of neurons yet used in such an experiment, Nicolelis says. Each animal’s task was to control the movement of two avatar arms on a computer monitor: To get a fruit juice reward, it had to place both hands over two circles and hold them there for 100 milliseconds. A computer algorithm processed the monkey’s brain activity, homing in on patterns of neuronal firing as the animal learned to do the task.
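The core idea of such a decoder, mapping the firing of a neuronal population to coordinates for two arms at once, can be sketched with a generic least-squares linear decoder on synthetic data. This is an illustration only, not the study's actual algorithm; the array sizes, the Poisson spike-count model, and the least-squares fit are all assumptions chosen for simplicity:

```python
import numpy as np

# Illustrative sketch: a linear decoder is a common baseline in BMI work,
# but the study's real decoding pipeline is not reproduced here.
rng = np.random.default_rng(0)

n_neurons = 500   # the study recorded up to ~500 neurons simultaneously
n_samples = 2000  # hypothetical number of training time bins
n_outputs = 4     # x/y position for the left arm plus the right arm

# A hidden "true" mapping from firing rates to arm positions (synthetic).
true_weights = rng.normal(size=(n_neurons, n_outputs))

# Synthetic binned spike counts and the noisy arm positions they encode.
rates = rng.poisson(lam=5.0, size=(n_samples, n_neurons)).astype(float)
positions = rates @ true_weights + rng.normal(
    scale=0.1, size=(n_samples, n_outputs)
)

# Fit the decoder: least-squares weights mapping rates -> positions.
weights, *_ = np.linalg.lstsq(rates, positions, rcond=None)

# Decode a new bin of activity into coordinates for both virtual arms.
new_rates = rng.poisson(lam=5.0, size=(1, n_neurons)).astype(float)
decoded = new_rates @ weights
left_xy, right_xy = decoded[0, :2], decoded[0, 2:]
```

A key finding of the study, noted below, is that bimanual control is not just two one-arm decoders run side by side: the neurons behave differently when coordinating both limbs, so a real decoder must be trained on bimanual activity.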
The female monkey, called monkey C, first learned how to get the juice by moving joysticks with her real arms and hands—as she manipulated the joysticks, the right and left avatar arms did what she wished. After practicing this during regular 20- to 40-minute sessions over the course of a year, she was strapped into a padded chair so that she couldn’t move her own arms or hands, and trained to control the avatar arms just by thinking. After weeks of practice, she was able to complete the task more than 75% of the time, the scientists report today in Science Translational Medicine.
Because a paralyzed person or amputee can't necessarily practice a task using joysticks, the next step was to determine whether observation alone could teach the BMI. Monkey M, a male, wasn't allowed to use the joysticks or move his arms at any point in the experiment—he simply observed the task being performed. It took longer for him to learn, but monkey M also learned to control the virtual arms using only his thoughts. Both animals’ performances improved over time, and the researchers noticed that their neuronal firing patterns changed as this happened, suggesting that their brains were adapting to the BMI devices. This could be because the monkeys came to consider the virtual arms as part of their own bodies, Nicolelis suggests. “The animals literally incorporate the avatar as if the avatar was them.”
The basic technology that Nicolelis and colleagues used to extract instructions for movement from the mishmash of monkey brain signals isn't new, says Jose Contreras-Vidal, a biomedical engineer at the University of Houston in Texas. The real advance of the study, he says, is that the team was able to figure out which neurons they needed to record to control two arms working together. Although one might assume that it would be possible to simply combine neural activity from two arms acting independently, the study shows that cells act differently when they are coordinating the movements of two limbs than they do when separately instructing one limb or the other, he says. This is the first study to extract and use that complex information to coordinate arm movements in real time, Contreras-Vidal says.
Although he agrees that the new study is strong, Andrew Schwartz, a neurobiologist at the University of Pittsburgh in Pennsylvania, thinks scientists can do better. Even though the monkeys’ task was quite simple, they had only about a 45% success rate overall, he notes. “I’m looking forward to higher performance and success rates and more realistic natural movements.”
Nicolelis may have shown that the monkeys can learn to use these avatar arms to complete one simple task, but it's not clear that the same type of training will work for the more complex activities that humans need to perform, Contreras-Vidal cautions. "This is a first step."
ScienceNOW, the daily online news service of the journal Science