Controlling a robotic rehabilitation artefact such as a hand prosthesis remains a largely open problem. In particular, the choice of a human-machine interface (HMI) enabling natural control is still debated. The traditional choice, i.e. surface electromyography (sEMG), suffers from a number of problems (electrode displacement, sweat, fatigue) which cannot be easily solved. One of its main drawbacks is its inherently low spatial resolution, at least in standard settings. To overcome this limitation, several novel HMIs have been proposed to substitute or augment sEMG; among them, pressure and tactile sensing, and ultrasound imaging (US). In this paper we propose an advancement towards the usage of US as an HMI for hand prosthetics; namely, we compare traditional US image features with Histograms of Oriented Gradients used as input to three classifiers, and show that a high number of hand configurations and grasping force levels can be classified well above chance level by choosing the right combination of features and classifier. In an experiment involving three intact human subjects, a classification accuracy of 80% was obtained; when classifying three different levels of grip force for four grasps, the accuracy drops to 60%. These results confirm the usability of US imaging as an HMI for hand prosthetics, and pave the way for its practical usage as a means of natural prosthetic control.
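The abstract's pipeline (Histograms of Oriented Gradients computed on ultrasound frames, fed to a classifier) can be illustrated with a minimal sketch. The code below is not the paper's implementation: it uses a simplified HOG (single-cell histograms, no block normalisation), toy synthetic "frames" in place of real US images, and a nearest-centroid rule standing in for the three classifiers compared in the paper. All function names (`hog_features`, `make_image`, `classify`) are hypothetical.

```python
import numpy as np

def hog_features(img, n_bins=9, cell=8):
    """Simplified Histogram of Oriented Gradients (no block normalisation)."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, 1:-1] = img[:, 2:] - img[:, :-2]        # horizontal gradient
    gy[1:-1, :] = img[2:, :] - img[:-2, :]        # vertical gradient
    mag = np.hypot(gx, gy)
    ang = np.rad2deg(np.arctan2(gy, gx)) % 180.0  # unsigned orientations
    h, w = img.shape
    hists = []
    for i in range(0, h - cell + 1, cell):
        for j in range(0, w - cell + 1, cell):
            m = mag[i:i + cell, j:j + cell].ravel()
            a = ang[i:i + cell, j:j + cell].ravel()
            hist, _ = np.histogram(a, bins=n_bins, range=(0.0, 180.0), weights=m)
            hists.append(hist)
    v = np.concatenate(hists)
    return v / (np.linalg.norm(v) + 1e-12)        # L2-normalised descriptor

def make_image(orientation, rng, size=32):
    """Toy stand-in for a US frame: noisy stripes of a given orientation."""
    base = np.sin(np.arange(size) * 0.8)
    img = (np.tile(base, (size, 1)) if orientation == "vertical"
           else np.tile(base[:, None], (1, size)))
    return img + 0.1 * rng.standard_normal((size, size))

# "Train": average HOG descriptor per class (one centroid per hand configuration).
rng = np.random.default_rng(0)
centroids = {c: np.mean([hog_features(make_image(c, rng)) for _ in range(10)], axis=0)
             for c in ("vertical", "horizontal")}

def classify(img):
    """Nearest-centroid classification in HOG feature space."""
    f = hog_features(img)
    return max(centroids, key=lambda c: f @ centroids[c])

print(classify(make_image("vertical", rng)))   # recovers the stripe orientation
```

In the actual study, each class would be a hand configuration (or grip-force level), the frames would come from an ultrasound probe on the forearm, and the nearest-centroid rule would be replaced by one of the three classifiers under comparison.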
Title: Ultrasound imaging for hand prosthesis control: A comparative study of features and classification methods
Publication date: 2015
Appears in categories: 4.1 Contribution in conference proceedings / Articles in extenso