It has already been successfully tested by three amputees and seven able-bodied people. The designers merged two fields to make it work. Neuroengineering allowed them to decipher intended finger movement from the muscle activity of the amputee's stump, giving individual finger control of the prosthetic hand for the first time. Robotics then allows the hand to pick up objects and hold them firmly.
It was developed at the EPFL research institute and university in Lausanne, Switzerland.
Professor Aude Billard, who led the team, said: “When you hold an object in your hand and it starts to slip, you only have a couple of milliseconds to react.
“The robotic hand has the ability to react within 400 milliseconds. With pressure sensors all along the fingers, it can react and stabilise the object before the brain can actually perceive that the object is slipping.”
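The reflex Prof Billard describes can be pictured as a simple control loop: watch the pressure readings along the fingers, and tighten the grip the moment pressure falls sharply, well inside the 400-millisecond window. The sketch below is a hypothetical illustration of that idea; the threshold, the fixed grip increment, and the function names are assumptions, not EPFL's actual control code.

```python
# Hypothetical slip-stabilisation loop. A sharp drop in finger pressure
# between consecutive sensor samples is treated as the object slipping,
# and the grip force is increased before the object can fall.
SLIP_THRESHOLD = -0.05  # pressure drop per sample that counts as a slip

def stabilise(pressure_readings, grip_force):
    """Tighten the grip whenever the pressure falls faster than the threshold."""
    for prev, curr in zip(pressure_readings, pressure_readings[1:]):
        if curr - prev < SLIP_THRESHOLD:  # object starting to slip
            grip_force += 2.0             # illustrative fixed increment
    return grip_force
```

In a real hand this loop would run at a high, fixed sensor rate so that detection and response both fit inside the 400 ms budget the team quotes.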
She said the hand uses an algorithm that learns to decode the user's intention and translate it into finger movements.
The amputee first performs a series of hand movements to train the program.
Sensors placed on the stump detect muscular activity, and the algorithm learns which hand movements correspond to which muscular activity.
Once the user’s intended finger movements are understood, the information can be used to control individual fingers of the prosthetic hand.
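The calibrate-then-decode procedure described above can be sketched very simply: each labelled hand movement yields muscle-sensor feature vectors, the system averages them per movement, and a new reading is decoded by finding the nearest average. This nearest-centroid classifier is an illustrative stand-in, not the EPFL team's actual algorithm, and the feature format is assumed.

```python
# Hypothetical sketch: train per-movement centroids from labelled EMG
# features, then decode a new reading as the closest centroid.

def train(samples):
    """samples: list of (emg_feature_vector, movement_label) pairs."""
    sums, counts = {}, {}
    for features, label in samples:
        acc = sums.setdefault(label, [0.0] * len(features))
        for i, value in enumerate(features):
            acc[i] += value
        counts[label] = counts.get(label, 0) + 1
    # Average the accumulated features to get one centroid per movement.
    return {label: [v / counts[label] for v in acc]
            for label, acc in sums.items()}

def decode(centroids, features):
    """Return the movement whose centroid is closest to the new reading."""
    def sq_dist(centroid):
        return sum((a - b) ** 2 for a, b in zip(centroid, features))
    return min(centroids, key=lambda label: sq_dist(centroids[label]))
```

For example, after training on readings labelled "index_flex" and "thumb_flex", a fresh reading resembling the first pattern decodes to "index_flex", and that label can then drive the corresponding prosthetic finger.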
Prof Billard’s colleague Dr Katie Zhuang said: “Because muscle signals can be ‘noisy’, we need a machine-learning algorithm that extracts meaningful activity from those muscles and interprets them.”
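One common way to extract usable activity from a noisy EMG signal, as Dr Zhuang describes, is to rectify the raw samples and smooth them with a moving-average window, producing a clean "envelope" of muscle activity. The snippet below illustrates that standard technique; it is not necessarily the method the EPFL team used.

```python
# Illustrative EMG envelope extraction: rectify (take absolute values),
# then smooth each sample with a trailing moving-average window.

def emg_envelope(signal, window=4):
    """Return the rectified, moving-average-smoothed envelope of the signal."""
    rectified = [abs(x) for x in signal]
    envelope = []
    for i in range(len(rectified)):
        chunk = rectified[max(0, i - window + 1): i + 1]
        envelope.append(sum(chunk) / len(chunk))
    return envelope
```

The smoothed envelope rises when the muscle contracts and falls when it relaxes, which is far easier for a downstream classifier to interpret than the raw, rapidly oscillating signal.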
Researchers say the algorithm needs further work before it can be used in a widely available prosthetic.
Prof Silvestro Micera, of EPFL, said: “Our approach…could be used in several neuroprosthetic applications such as bionic hand prostheses and brain-to-machine interfaces.”
The findings were published in the journal Nature Machine Intelligence.