Intelligent prostheses thanks to AI-based intention recognition
Lifting a glass, clenching a fist, typing a phone number with the index finger: state-of-the-art robotic hands can already do amazing things with the help of biomedical technology. But what works in the lab has its limits in everyday life, because people's intentions, their surroundings and the objects in them are too diverse to be programmed once and for all. A team at Friedrich-Alexander-Universität Erlangen-Nürnberg (FAU) is researching how so-called intelligent prostheses can be further improved and made more reliable.
With the help of interactive artificial intelligence, the prosthesis will learn to recognise its wearer's intentions more precisely, register its surroundings and continuously improve itself in the process. The EU is funding the project with 6 million euros, of which FAU receives 467,000 euros.
"We are literally working at the interface of man and machine," explains Claudio Castellini, Professor of Medical Robotics at FAU. "Upper limb prostheses have advanced considerably over the last few decades." With the help of surface electromyography, for example, skin electrodes on the remaining arm stump can record even the faintest muscle activations.
These biosignals can be converted into electrical impulses and transmitted to the prosthesis. "The wearer thus controls the hand prosthesis independently with the arm stump. Using pattern recognition and interactive machine learning, the person can also teach the prosthesis his or her individual way of performing a gesture or movement."
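To give a feel for the pattern-recognition step described above, here is a minimal, illustrative sketch in Python. It is not the project's actual method: it assumes sEMG arrives as short multi-channel windows, reduces each window to a per-channel root-mean-square amplitude (a common sEMG feature), and assigns gestures with a simple nearest-centroid rule. All names and the toy data are invented for illustration.

```python
import numpy as np

def rms_features(window):
    """Root-mean-square amplitude per electrode channel, a standard
    hand-crafted feature for surface EMG pattern recognition."""
    return np.sqrt(np.mean(np.square(window), axis=0))

class NearestCentroidGestureClassifier:
    """Each gesture is represented by the mean feature vector of its
    training windows; a new window is assigned to the closest centroid."""

    def fit(self, windows, labels):
        feats = np.array([rms_features(w) for w in windows])
        labels = np.array(labels)
        self.classes_ = np.unique(labels)
        self.centroids_ = np.array(
            [feats[labels == c].mean(axis=0) for c in self.classes_])
        return self

    def predict(self, window):
        dists = np.linalg.norm(self.centroids_ - rms_features(window), axis=1)
        return self.classes_[np.argmin(dists)]

# Toy data: 200-sample windows from 4 electrodes, with different
# channel activation levels standing in for two gestures.
rng = np.random.default_rng(0)
open_w = [rng.normal(0, [1.0, 0.2, 0.2, 0.2], (200, 4)) for _ in range(10)]
close_w = [rng.normal(0, [0.2, 1.0, 0.2, 0.2], (200, 4)) for _ in range(10)]

clf = NearestCentroidGestureClassifier().fit(
    open_w + close_w, ["open"] * 10 + ["close"] * 10)
```

In a real prosthesis this classification would run continuously on streaming windows, with its output mapped to motor commands for the hand.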
AI instead of cosmetics
But even advanced robotic prostheses still fall short in comfort, function and control, which is why people with missing limbs often prefer functionless, purely cosmetic prostheses. The new EU Horizon project "AI-Powered Manipulation System for Advanced Robotic Service, Manufacturing and Prosthetics" (IntelliMan) is therefore investigating how robotic prostheses can interact even more effectively and purposefully with their environment.
The Erlangen researchers in particular are investigating how real as well as virtual upper-limb prostheses can be controlled more accurately. The focus is on so-called intent detection: inferring from biosignals which movement the wearer wants to perform.
To this end, Castellini and his team are refining the recording and analysis of human biosignals and designing novel machine learning algorithms that identify a person's individual movement patterns. They validate their results in user studies with participants with and without physical limitations.
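The "interactive" aspect of such learning can be sketched as a model that is updated incrementally whenever the wearer supplies a labelled example, for instance after correcting a misrecognised movement. The following toy Python snippet is an assumption-laden illustration of that idea, not the team's algorithm: it keeps a running mean feature vector per gesture and refines it with each correction.

```python
import numpy as np

class IncrementalIntentModel:
    """Illustrative interactive learner: starts empty and updates
    per-gesture centroids from user-provided labelled examples."""

    def __init__(self):
        self.sums = {}    # label -> running sum of feature vectors
        self.counts = {}  # label -> number of examples seen

    def update(self, features, label):
        # Running mean per gesture: cheap enough to run on-device,
        # so the model keeps adapting during everyday use.
        f = np.asarray(features, dtype=float)
        self.sums[label] = self.sums.get(label, np.zeros_like(f)) + f
        self.counts[label] = self.counts.get(label, 0) + 1

    def predict(self, features):
        f = np.asarray(features, dtype=float)
        centroids = {c: self.sums[c] / self.counts[c] for c in self.sums}
        return min(centroids, key=lambda c: np.linalg.norm(centroids[c] - f))

model = IncrementalIntentModel()
model.update([0.9, 0.1], "open")   # wearer demonstrates "open hand"
model.update([0.1, 0.9], "close")  # wearer demonstrates "close hand"
guess = model.predict([0.8, 0.2])
# The wearer corrects a wrong guess; the model folds the example in:
model.update([0.5, 0.5], "close")
```

The design choice here is that adaptation never stops: every correction shifts the gesture centroids, so the model tracks changes in the wearer's signals over time.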
Between man and machine
The "Assistive Intelligent Robotics" lab (AIROB) at FAU works on the control of assistive robotics for the upper and lower limbs as well as on functional electrical stimulation. "We use the possibilities of intention recognition to control assistive and rehabilitative robotics," explains Castellini. "This includes body-worn robots, such as prostheses and exoskeletons, but also robotic arms and virtual reality simulations."