Despite decades of research, intuitive and robust control of polyarticulated prosthetic hands by amputees remains an unsolved problem, largely due to (a) inadequate sensorization of the hand and of the human-machine interface, and (b) inadequate machine learning methods for detecting the patient's intent. These problems cannot be solved trivially, since prosthetic hands are subject to severe constraints on weight, price, size, cosmetics, and power consumption: they cannot be equipped with standard robotic sensors, and at the same time a practical, reliable intent detection method is simply not yet available.
In this project we will employ and evaluate a new generation of tactile sensors coupled with a realistic machine learning scheme to overcome both problems. Firstly, we will build a lightweight, wearable human-machine interface based on high-resolution tactile sensors to augment or replace traditional surface electromyography. Secondly, we will design and apply a novel, fast, and realistic machine learning method that fully exploits the tactile interface and provides stable intent detection. Thirdly, similar sensor technology will be used to build a tactile dataglove that sensorizes a commercially available, state-of-the-art hand prosthesis. Through this integration of novel tactile technology and novel machine learning methods, we will significantly advance the state of the art in prosthetic hand control, enabling better grasping and manipulation with higher stability and reliability.
Throughout the project, a pool of upper-limb amputees will be continuously monitored and evaluated while using the developed prosthesis, in order to assess its effectiveness and practical usability. Three institutions with proven track records in prosthetics and rehabilitation robotics (DLR), advanced sensors and force/impedance control (CITEC), and machine learning (Idiap) will cooperate tightly towards this aim.