Towards Online Estimation of Human Joint Muscular Torque with a Lower Limb Exoskeleton Robot


Bibliographic Details
Published in: Applied Sciences 2018-09, Vol. 8 (9), p. 1610
Main authors: Li, Mantian; Deng, Jing; Zha, Fusheng; Qiu, Shiyin; Wang, Xin; Chen, Fei
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: Exoskeleton robots demonstrate promise in assisting or enhancing human physical capacity. Joint muscular torques (JMT) reflect human effort, which can be applied to an exoskeleton robot to realize an active power-assist function. Estimating human JMT with a wearable exoskeleton is challenging. This paper proposes a novel human lower limb JMT estimation method based on the inverse dynamics of the human body. The method has two main parts: the inverse dynamic approach (IDA) and the sensing system. We solve the inverse dynamics of each human leg separately to shorten the serial chain and reduce computational complexity, and divide the JMT into a mass-induced component and a foot-contact-force (FCF)-induced component to avoid switching the dynamic equation between different contact states of the feet. An exoskeleton-embedded sensing system is designed to obtain the user's motion data and the FCF required by the IDA by mapping motion information from the exoskeleton to the human body. Compared with the popular electromyography (EMG)- and wearable-sensor-based solutions, electrodes, sensors, and complex wiring on the human body are eliminated, improving wearing convenience. A comparison experiment shows that this method produces output close to that of a motion analysis system across different subjects and different motions.
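The decomposition described in the abstract, splitting the JMT into a mass-induced term and an FCF-induced term so that one equation covers both stance and swing, can be sketched for a simplified planar two-link leg. This is a minimal illustrative sketch, not the paper's actual model: the link lengths, masses, mid-link centres of mass, and the static (gravity-only) treatment of the mass term are all assumptions introduced here.

```python
import math

def leg_jacobian(q1, q2, l1=0.4, l2=0.4):
    """Foot-position Jacobian of a planar two-link leg.
    q1, q2: hip and knee angles (rad, measured from horizontal);
    l1, l2: assumed thigh/shank lengths in metres."""
    s1, s12 = math.sin(q1), math.sin(q1 + q2)
    c1, c12 = math.cos(q1), math.cos(q1 + q2)
    return [[-l1 * s1 - l2 * s12, -l2 * s12],
            [ l1 * c1 + l2 * c12,  l2 * c12]]

def joint_muscular_torque(q1, q2, fcf, m1=7.0, m2=3.5,
                          l1=0.4, l2=0.4, g=9.81):
    """Static JMT estimate: mass-induced (gravity) term plus the
    FCF-induced term J(q)^T * F. In swing the contact force is zero,
    so the FCF term simply vanishes and the same expression applies:
    no switching of the dynamic equation between contact states."""
    # Mass-induced torques (link centres of mass assumed at mid-link).
    tau_mass = [
        g * (m1 * (l1 / 2) * math.cos(q1)
             + m2 * (l1 * math.cos(q1) + (l2 / 2) * math.cos(q1 + q2))),
        g * m2 * (l2 / 2) * math.cos(q1 + q2),
    ]
    # FCF-induced torques: tau_fcf = J^T * F, with F = (Fx, Fy).
    J = leg_jacobian(q1, q2, l1, l2)
    tau_fcf = [J[0][0] * fcf[0] + J[1][0] * fcf[1],
               J[0][1] * fcf[0] + J[1][1] * fcf[1]]
    return [tau_mass[0] + tau_fcf[0], tau_mass[1] + tau_fcf[1]]
```

With `fcf = (0.0, 0.0)` (swing phase) only the mass-induced term remains; supplying a measured ground-reaction force adds the FCF-induced term through the Jacobian transpose, which is what lets a single formula serve both contact states.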
ISSN: 2076-3417
DOI: 10.3390/app8091610