Ultrasound imaging as a human-machine interface in a realistic scenario
| Main authors: | , |
|---|---|
| Format: | Conference proceedings |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
| Abstract: | Medical ultrasound imaging is a widespread, high-resolution (both spatial and temporal) method of gathering live images of the interior of the human body. Its potential as a human-machine interface for the disabled, amputees in particular, is being explored in the rehabilitation robotics community. Following up on the recent discovery that first-order spatial features of ultrasound images of the human forearm are linearly related to the hand configuration, we hereby push the approach to a realistic scenario. We show that an extremely simple calibration procedure can be used to obtain a linear regression system which effectively predicts the forces required of a human subject at the fingertips, using live ultrasound images of the forearm. In particular, the system can be trained on minimum and maximum forces only, thereby dramatically shortening the calibration phase, and it generalises to intermediate force values. This phenomenon is uniform across the 5 intact subjects we examined in a controlled experiment. Moreover, no force sensor is necessary, as learning by imitation, namely using a visual stimulus, yields similar results. This result is particularly useful for amputees, who normally cannot perform graded-force tasks, as proprioception may have been lost for decades. Applications of this system include advanced prosthetics, phantom-pain therapy, and smart teleoperation, among others. |
| ISSN: | 2153-0858, 2153-0866 |
| DOI: | 10.1109/IROS.2013.6696545 |
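
The abstract's key technical claim is that a linear regression, calibrated only on minimum- and maximum-force frames, generalises to intermediate force values because the image features are linearly related to the exerted force. The following is a minimal sketch of that calibration idea, assuming a synthetic linear feature model in place of the authors' actual first-order ultrasound image features; the feature dimension, noise level, and ridge regularisation are illustrative assumptions, not details from the paper.

```python
# Minimal sketch: linear regression from image features to a single
# fingertip-force channel, calibrated on min/max forces only.
# The synthetic feature model below is a stand-in assumption, not the
# authors' first-order ultrasound feature extraction.
import numpy as np

rng = np.random.default_rng(0)

n_features = 32              # hypothetical feature dimension per frame
n_calib, n_test = 40, 200

# Assumed ground truth: features vary linearly with force (the paper's
# empirical finding), plus measurement noise.
slope = rng.normal(size=n_features)
offset = rng.normal(size=n_features)

def frame_features(force):
    """Placeholder for the spatial features of one ultrasound frame."""
    return slope * force + offset + 0.05 * rng.normal(size=n_features)

# Calibration set: frames recorded at minimum (0) and maximum (1) force.
f_calib = np.repeat([0.0, 1.0], n_calib // 2)
X_calib = np.stack([frame_features(f) for f in f_calib])

# Fit force ~ X @ w + c by ridge-regularised least squares.
A = np.hstack([X_calib, np.ones((n_calib, 1))])   # append bias column
lam = 1e-3
w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ f_calib)

# Evaluate on intermediate force levels never seen during calibration.
f_test = rng.uniform(0.2, 0.8, size=n_test)
X_test = np.stack([frame_features(f) for f in f_test])
pred = np.hstack([X_test, np.ones((n_test, 1))]) @ w
print(f"RMSE on intermediate forces: {np.sqrt(np.mean((pred - f_test)**2)):.3f}")
```

Because the features are (approximately) linear in force, the two calibration extremes suffice to pin down the regression line, which is why predictions at intermediate, unseen force levels still track the true values; this is the property that lets the calibration phase be shortened to the two extremes.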