Motion Forecasting Network (MoFCNet): IMU-Based Human Motion Forecasting for Hip Assistive Exoskeleton
Saved in:
Published in: | IEEE Robotics and Automation Letters 2023-09, Vol.8 (9), p.1-8 |
---|---|
Main authors: | , , , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Order full text |
Summary: | Accurate recognition of human pose and prediction of human motion intention are essential for exoskeleton robots to provide effective assistance. In recent years, inertial measurement units (IMUs) have been widely used to estimate human pose due to their stability and economy. However, physical sensors are inherently subject to time delay relative to human movement, which makes it challenging to meet the demand for real-time control of power-assisted exoskeletons. To address this issue, we propose a data-driven approach called the motion forecasting network (MoFCNet), which leverages historical IMU readings to forecast the 3D human pose in future frames. Our approach consists of two stages: state-variable forecasting and pose regression. In stage I, we forecast the IMU data, joint positions, and velocities. In stage II, we regress the future human poses from the forecasting results of stage I. In addition, we design a basic stack that can be applied to both stages. The mean joint rotation errors on the DIP-IMU and TotalCapture datasets are 10.34 and 16.11 degrees, respectively. When focusing solely on the first frame of the prediction results, these errors decrease to 9.13 and 14.50 degrees. Experimental results indicate that our method not only outperforms the forecasting baseline methods but also achieves comparable performance to sparse IMU-based human pose estimation algorithms. |
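The abstract describes a two-stage pipeline: stage I forecasts future state variables (IMU readings, joint positions, velocities) from a history window, and stage II regresses a pose from each forecast state. The sketch below illustrates only that data flow with tiny randomly initialized MLPs standing in for the paper's "basic stack"; all dimensions (history length, state size, 24-joint axis-angle pose) are hypothetical assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def mlp(x, w1, b1, w2, b2):
    """Tiny two-layer perceptron standing in for the paper's 'basic stack'."""
    h = np.maximum(x @ w1 + b1, 0.0)  # ReLU hidden layer
    return h @ w2 + b2

# Hypothetical sizes (not from the paper): 20-frame history, 5 forecast
# frames, a 54-dim state vector, and a 72-dim pose (24 joints x axis-angle).
HIST, FUT, STATE_DIM, POSE_DIM = 20, 5, 54, 72

# Stage I weights: map the flattened history to FUT future state vectors.
w1 = rng.normal(size=(HIST * STATE_DIM, 128)) * 0.01
b1 = np.zeros(128)
w2 = rng.normal(size=(128, FUT * STATE_DIM)) * 0.01
b2 = np.zeros(FUT * STATE_DIM)

# Stage II weights: regress one pose from each forecast state vector.
w3 = rng.normal(size=(STATE_DIM, 128)) * 0.01
b3 = np.zeros(128)
w4 = rng.normal(size=(128, POSE_DIM)) * 0.01
b4 = np.zeros(POSE_DIM)

history = rng.normal(size=(HIST, STATE_DIM))  # past IMU-derived states
# Stage I: forecast future state variables from the history window.
future_states = mlp(history.reshape(-1), w1, b1, w2, b2).reshape(FUT, STATE_DIM)
# Stage II: regress a pose for every forecast frame (batched over FUT).
future_poses = mlp(future_states, w3, b3, w4, b4)
print(future_poses.shape)  # (5, 72)
```

The "first frame" metric quoted in the abstract would correspond to evaluating only `future_poses[0]`; in practice both stages would be trained networks rather than random weights.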
---|---|
ISSN: | 2377-3766 |
DOI: | 10.1109/LRA.2023.3300279 |