The ReFIT algorithm that decodes these signals represents a departure from earlier models. In most neural prosthetics research, scientists have recorded brain activity while the subject moves or imagines moving an arm, analyzing the data after the fact. "Quite a bit of the work in neural prosthetics has focused on this sort of offline reconstruction," said Gilja, the first author of the paper.
The Stanford team wanted to understand how the system worked "online," under closed-loop control conditions in which the computer analyzes and implements visual feedback gathered in real time as the monkey neurally steers the cursor toward an onscreen target.
The system makes adjustments on the fly while guiding the cursor to a target, just as hand and eye work in tandem to move a mouse cursor onto an icon on a computer desktop. If the cursor strays too far to the left, for instance, the user adjusts the imagined movements to redirect the cursor to the right. The team designed the system to learn from these corrective movements, allowing the cursor to move more precisely than it could in earlier prosthetics.
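The feedback loop described above can be illustrated with a toy simulation. This is only a sketch of the general closed-loop idea, not the actual ReFIT decoder: at each step the simulated "user" sees the cursor's remaining error and issues a corrective command toward the target, and the `gain` and hold radius values are arbitrary assumptions for illustration.

```python
import math

def closed_loop_cursor(start, target, gain=0.3, steps=60):
    """Toy closed-loop cursor control (illustrative; not ReFIT itself).

    Each step mimics real-time visual feedback: the user observes the
    cursor's current error relative to the target and issues a velocity
    command along that error vector, correcting drift as it happens.
    """
    x, y = start
    tx, ty = target
    path = [(x, y)]
    for _ in range(steps):
        ex, ey = tx - x, ty - y        # visual feedback: remaining error
        if math.hypot(ex, ey) < 0.05:  # inside the hold radius: done
            break
        # corrective command: step toward the target along the error
        x += gain * ex
        y += gain * ey
        path.append((x, y))
    return path
```

Because each command is recomputed from the current error, the simulated cursor converges on the target even though every individual step is imperfect, which is the intuition behind closed-loop control outperforming offline, open-loop reconstruction.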
To test the new system, the team gave monkeys the task of mentally directing a cursor to a target, an onscreen dot, and holding the cursor there for half a second. ReFIT performed vastly better than previous technology in both speed and accuracy. The cursor's path from the starting point to the target was straighter, and it reached the target twice as quickly as in earlier systems, achieving 75 to 85 percent of the speed of real arms.
"This paper reports very exciting innovations in closed-loop decoding for brain-machine interfaces. These innovations should lead to a s
Contact: Andrew Myers
Stanford School of Engineering