A group of researchers has developed a new kind of human-machine interface that detects the signals of individual neurons and uses them to control artificial limbs. The group’s findings, published on February 6th in the journal Nature, were obtained using a virtual prosthetic rather than a physical device. But the authors say their neuro-electric interface could greatly improve on existing prosthetic control methods, and it gave six subjects who had lost arms “intuitive control of multiple degrees of freedom.”
Current prosthetics mostly rely on signals from a user’s muscles, sometimes after surgically reconnecting those muscles to nerves from the missing limb. But, according to the new paper, that method yields relatively imprecise data.
In the new study, the researchers still used muscles as a sort of amplifier, reading their activity with implanted electromyographic sensors. But they filtered out the ‘noise’ of overall muscle activation, yielding more direct readings of individual neuron signals. They then mapped the nerve signals that would normally drive particular motions and used those signals as inputs to the virtual prosthetic.
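The paper’s actual decoding pipeline is far more sophisticated, but the basic idea of separating sharp neural spikes from slower muscle activity can be illustrated with a toy sketch. Everything here is an assumption for illustration: the `detect_spikes` function, the moving-average baseline, the threshold rule, and the synthetic signal are invented for this example and are not the study’s method.

```python
import numpy as np

def detect_spikes(emg, window=50, k=5.0):
    """Toy spike detector: subtract a moving-average baseline (the slow
    'muscle' component), then flag samples exceeding k robust standard
    deviations. Illustrative only -- not the study's decoding algorithm."""
    # Pad with edge values so the moving average is defined at the borders
    pad = np.pad(emg, (window // 2, window - 1 - window // 2), mode="edge")
    baseline = np.convolve(pad, np.ones(window) / window, mode="valid")
    residual = emg - baseline                      # crude high-pass filter
    sigma = np.median(np.abs(residual)) / 0.6745   # robust noise estimate
    return np.flatnonzero(residual > k * sigma)    # putative spike indices

# Synthetic trace: slow muscle activity plus three sharp "neural" spikes
rng = np.random.default_rng(0)
t = np.arange(2000)
emg = np.sin(t / 100) + 0.05 * rng.standard_normal(t.size)
spike_times = [400, 900, 1500]
emg[spike_times] += 2.0

print(detect_spikes(emg))  # indices at or near the injected spikes
```

In a real interface, the detected spike train for each motor unit would then be mapped to a control signal, such as the velocity of one joint of the virtual prosthetic.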
This gave test subjects much more accurate and precise control. By one metric, subjects achieved an average of 97% signal-decoding accuracy, compared with 70% to 85% for existing methods. Beyond improved motion accuracy, the decoded neural information also measured the force of subjects’ intended movements more accurately.
While there is advanced work on myoelectric sensors that can control prosthetics without surgical implantation, the new method could offer benefits that outweigh the inconvenience of surgery. The next step will be to demonstrate its effectiveness with a physical prosthesis.