By placing small electrodes on or inside the brain, scientists have developed a method whereby patients are able to interact with computers or control robotic limbs simply by thinking about how to execute those actions.

The technology is designed to improve daily life for people who are paralyzed or who have lost the ability to speak after a stroke or neurodegenerative disease. University of Washington researchers recently demonstrated that when humans use the technology, the brain behaves much as it does when completing simple motor tasks, such as kicking a ball or waving. For this reason, they hypothesize that controlling a robotic arm or prosthetic limb could become second nature for the people using it.

“What we’re seeing is that practice makes perfect with these tasks,” Rajesh Rao, a professor of computer science and engineering and a senior researcher involved in the study, said in a press release. “There’s a lot of engagement of the brain’s cognitive resources at the very beginning, but as you get better at the task, those resources aren’t needed anymore and the brain is freed up.”

Along with colleagues Jeffrey Ojemann, a professor of neurological surgery, and Jeremiah Wander, a doctoral student in bioengineering, Rao studied seven people with severe epilepsy who had been hospitalized for a monitoring procedure intended to identify the origin of their seizures.

During this procedure, physicians cut through the scalp, drill through the skull and place a thin sheet of electrodes directly on top of the brain.

The researchers were able to piggyback on these medical procedures: while the physicians watched for signs of a seizure, the researchers asked the patients to move a mouse cursor on a computer screen using only their thoughts.

Electrodes on the patients' brains then picked up the signals directing the cursor to move and sent them to an amplifier and then a laptop to be analyzed. Sure enough, within 40 milliseconds the computer decoded the intention carried by the signals and translated it into movement of the cursor on the screen.
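The article does not describe the decoding algorithm itself. As a rough, hypothetical sketch of what such a pipeline can look like, the Python snippet below band-pass filters a short window of simulated electrode data and maps the power difference between two channels to a one-dimensional cursor velocity. The sampling rate, frequency band, window length, gain and channel layout are illustrative assumptions, not the UW team's actual parameters.

```python
# Hypothetical sketch of a band-power cursor decoder. All parameters here
# (sampling rate, band, window, gain) are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000                 # assumed ECoG sampling rate (Hz)
WINDOW = int(0.040 * FS)  # ~40 ms analysis window, matching the reported latency

def band_power(samples, low=70.0, high=150.0, fs=FS):
    """Band-pass filter one electrode's window and return its mean power."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, samples)
    return float(np.mean(filtered ** 2))

def decode_cursor_velocity(window_left, window_right, gain=1.0):
    """Map the power difference between two electrodes to a 1-D cursor velocity.

    Real decoders are calibrated per patient; this toy version just compares
    activity under two hypothetical electrodes.
    """
    return gain * (band_power(window_right) - band_power(window_left))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Simulated 40 ms of noise standing in for two ECoG channels.
    left = rng.standard_normal(WINDOW)
    right = rng.standard_normal(WINDOW) * 1.5  # the "more active" channel
    print("cursor velocity:", decode_cursor_velocity(left, right))
```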

Through the experiment, the researchers found that when patients started the task, a lot of brain activity was centered in the prefrontal cortex, an area associated with learning a new skill. However, after as little as 10 minutes, frontal brain activity lessened and the brain signals transitioned to patterns resembling those seen during automatic actions.
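The article does not say how that shift was quantified. As a purely illustrative sketch, the snippet below compares average activity over early versus late trials for electrodes grouped into two hypothetical regions, using simulated numbers that simply mimic the reported pattern (prefrontal activity falling off while motor-area activity holds steady).

```python
# Illustrative comparison of early vs. late trial activity by region.
# Region labels, trial counts and data are invented for demonstration only.
import numpy as np

rng = np.random.default_rng(1)
n_trials = 60

# Simulated per-trial activity: prefrontal activity decays as the task
# becomes automatic; motor-area activity stays roughly constant.
activity = {
    "prefrontal": 1.0 + np.exp(-np.arange(n_trials) / 10)
                  + 0.1 * rng.standard_normal(n_trials),
    "motor":      1.5 + 0.1 * rng.standard_normal(n_trials),
}

def early_vs_late(values, n=10):
    """Mean activity over the first and last n trials."""
    return float(np.mean(values[:n])), float(np.mean(values[-n:]))

for region, values in activity.items():
    early, late = early_vs_late(values)
    print(f"{region:11s} early={early:.2f}  late={late:.2f}")
```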

“Now we have a brain marker that shows a patient has actually learned a task,” Ojemann said. “Once the signal has turned off, you can assume the person has learned it.”

While researchers have previously had success using brain-computer interfaces in monkeys and humans, this study is the first to clearly map the neurological signals throughout the brain. In doing so, the scientists were surprised at how many parts of the brain were involved.

“We now have a larger-scale view of what’s happening in the brain of a subject as he or she is learning a task,” Rao said. “The surprising result is that even though only a very localized population of cells is used in the brain-computer interface, the brain recruits many other areas that aren’t directly involved to get the job done.”

At this point, several types of brain-computer interfaces are under development and testing, the least invasive of which consists of placing a device on a person’s head that then detects weak electrical signatures of brain activity. And while basic commercial gaming products using this system are on the market, the technology is not very reliable yet due to interference from blinking and muscle movement.

On the other end of the spectrum, a more invasive alternative involves surgically placing electrodes inside the brain tissue itself and recording the activity of individual neurons. This method was demonstrated by researchers at Brown University and the University of Pittsburgh in patients who learned to control robotic arms using signals taken directly from their brains.

The UW team, however, tested electrodes placed on the surface of the brain, underneath the skull, allowing them to record brain signals at higher frequencies and with less interference than signals taken from the scalp.

Looking down the road, the scientists believe a wireless device could be built to remain inside a person’s head for a longer time and control computer cursors or robotic limbs at home.

“This is one push as to how we can improve the devices and make them more useful to people,” Wander said. “If we have an understanding of how someone learns to use these devices, we can build them to respond accordingly.”

The results of the study were published online in the Proceedings of the National Academy of Sciences. The work was funded by the National Institutes of Health, the National Science Foundation, the Army Research Office and the Keck Foundation.