Prosthetic limbs are getting better year by year, but their strength and precision don't always translate into ease of use or real capability; many people fitted with them can perform only rudimentary movements.
A promising approach investigated by Swiss researchers has AI take over the parts of a movement that can't be controlled manually.
As a concrete example of the problem, consider a person whose arm has been amputated controlling a smart prosthetic hand. Using signals from sensors attached to the remaining muscles, the user can raise the arm fairly easily, guide it to a position, and reach for an object on the table.
But what happens next? Few of the muscles and tendons that once controlled the fingers remain, and the user has no way to feel the prosthetic's artificial fingers well enough to bend and flex them at will. If all the user can issue are generic "hold" and "release" commands, much of what a hand actually does becomes nearly impossible.
That's where researchers from the Swiss Federal Institute of Technology Lausanne (École polytechnique fédérale de Lausanne, EPFL) come in. If the prosthesis can work out the best way to grip on its own, the user need only issue the high-level "hold" and "release" commands and nothing in between. EPFL's robotics researchers have studied how to find grasps automatically for many years, making them well suited to this shortcoming of current prosthetic hands.
The prosthesis user attempts various movements and grips as well as they can without a real hand, and a machine learning model is trained on the muscle signals recorded during those attempts. With that baseline, the robotic hand can recognize which type of grasp the user is currently attempting, and by monitoring and maximizing its contact area with the object, it produces the best grip in real time, on the spot. A slip-prevention mechanism is also included: once an object starts to slide, the grip is adjusted within half a second.
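The division of labor described above can be sketched in code. This is a minimal illustration, not the EPFL implementation: the grasp names, feature values, and force constants below are all invented for the example. The user's muscle (EMG) signal selects the grasp type via a nearest-template classifier, while a separate automatic loop tightens the grip whenever slip is detected.

```python
import math

# Hypothetical training data: mean EMG feature vectors recorded while the
# user attempted each grasp type (names and numbers are invented).
GRASP_TEMPLATES = {
    "pinch":   [0.8, 0.1, 0.2],
    "power":   [0.2, 0.9, 0.7],
    "lateral": [0.5, 0.4, 0.9],
}

def classify_grasp(emg_features):
    """User's side of shared control: infer the intended grasp type
    from muscle signals by nearest-template matching."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(GRASP_TEMPLATES, key=lambda g: dist(GRASP_TEMPLATES[g], emg_features))

def control_grip(slip_detected, force, max_force=10.0, step=1.5):
    """AI's side of shared control: one control tick that tightens the
    grip (up to a cap) whenever the object starts sliding."""
    if slip_detected:
        force = min(max_force, force + step)
    return force

# The user's EMG picks the grasp; the automatic loop reacts to slip events.
grasp = classify_grasp([0.75, 0.15, 0.25])    # closest to the "pinch" template
force = 3.0
for slip in [False, True, True, False]:        # simulated slip-sensor readings
    force = control_grip(slip, force)
```

In the real system this loop would run fast enough to correct a slide within the half-second window mentioned above; here the slip readings are just a hard-coded sequence.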
As a result, the object is held firmly yet gently, while from the user's perspective they are simply holding it at will. When the user has finished drinking their coffee, or has moved a piece of fruit from bowl to plate, they "release" the object; the system detects this change in the muscle signals and carries out the actual release.
Related article: SmartArm's AI-powered prosthesis takes the prize at Microsoft's Imagine Cup
It's reminiscent of the student project that won Microsoft's Imagine Cup: a prosthetic arm with a camera in its palm that gave the user feedback about the object and worked out how to grip it correctly.
For now, this is still experimental, using a third-party robotic arm and software that hasn't been specially optimized. But this "shared control" between human and AI shows real potential and may become the foundation for the next generation of smart prosthetics. The team's research paper is published in Nature Machine Intelligence.