Patient-Specific Pose Estimation in Clinical Environments

Bibliographic Details
Published in: IEEE Journal of Translational Engineering in Health and Medicine, 2018-01, Vol. 6, pp. 1-11
Authors: Chen, Kenny; Gabriel, Paolo; Alasfour, Abdulwahab; Gong, Chenghao; Doyle, Werner K.; Devinsky, Orrin; Friedman, Daniel; Dugan, Patricia; Melloni, Lucia; Thesen, Thomas; Gonda, David; Sattar, Shifteh; Wang, Sonya; Gilja, Vikash
Format: Article
Language: English
Online access: Full text
Abstract
Reliable posture labels in hospital environments can augment research studies on neural correlates of natural behaviors and clinical applications that monitor patient activity. However, many existing pose estimation frameworks are not calibrated for these unpredictable settings. In this paper, we propose a semi-automated approach for improving upper-body pose estimation in noisy clinical environments, whereby we adapt and build around an existing joint tracking framework to improve its robustness to environmental uncertainties. The proposed framework uses subject-specific convolutional neural network models trained on a subset of a patient's RGB video recording, chosen to maximize the feature variance of each joint. Furthermore, by compensating for scene lighting changes and by refining the predicted joint trajectories through a Kalman filter with fitted noise parameters, the extended system yields more consistent and accurate posture annotations than two state-of-the-art generalized pose tracking algorithms for three hospital patients recorded in two research clinics.
ISSN: 2168-2372
DOI: 10.1109/JTEHM.2018.2875464
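
As a rough illustration of the trajectory-refinement step mentioned in the abstract, the sketch below smooths a single joint's per-frame (x, y) detections with a constant-velocity Kalman filter. This is a minimal sketch under stated assumptions, not the authors' implementation: the paper fits its noise parameters to each patient's data, whereas the q and r scales and the function name kalman_smooth_joint here are illustrative placeholders.

import numpy as np

def kalman_smooth_joint(measurements, q=1e-2, r=1.0):
    """Smooth an (N, 2) array of per-frame (x, y) joint detections.

    q and r are illustrative process/measurement noise scales; the
    paper fits these parameters per patient instead of fixing them.
    """
    dt = 1.0  # one video frame between successive measurements
    # State vector [x, y, vx, vy] with a constant-velocity model.
    F = np.array([[1.0, 0.0, dt,  0.0],
                  [0.0, 1.0, 0.0, dt ],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    # We observe position only, not velocity.
    H = np.array([[1.0, 0.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0, 0.0]])
    Q = q * np.eye(4)  # process noise covariance
    R = r * np.eye(2)  # measurement noise covariance

    x = np.array([measurements[0][0], measurements[0][1], 0.0, 0.0])
    P = np.eye(4)
    smoothed = []
    for z in measurements:
        z = np.asarray(z, dtype=float)
        # Predict the next state from the motion model.
        x = F @ x
        P = F @ P @ F.T + Q
        # Correct with the detector's (x, y) estimate for this frame.
        innovation = z - H @ x
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ innovation
        P = (np.eye(4) - K @ H) @ P
        smoothed.append(x[:2].copy())
    return np.asarray(smoothed)

# Example: noisy detections of one joint across five frames; the
# spurious jump in frame 4 is pulled back toward the track.
track = np.array([[100.0, 200.0], [103.0, 198.0], [101.0, 203.0],
                  [150.0, 240.0], [108.0, 206.0]])
print(kalman_smooth_joint(track))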