A new head pose tracking method based on stereo visual SLAM



Bibliographic Details
Published in: Journal of visual communication and image representation, 2022-01, Vol. 82, p. 103402, Article 103402
Main authors: Huang, Suibin; Yang, Kun; Xiao, Hua; Han, Peng; Qiu, Jian; Peng, Li; Liu, Dongmei; Luo, Kaiqing
Format: Article
Language: English
Summary:
Highlights:
• We propose a head pose tracking method based on stereo visual SLAM for the first time.
• We apply the SLAM algorithm to a scene in which the camera is stationary and the head moves.
• We adopt a region detection method and skip the initialization phase to reduce the run time.

Real-time and reliable head pose tracking is the basis of human-computer interaction and face analysis applications. To address the accuracy and real-time shortcomings of current tracking methods, a new head pose tracking method based on stereo visual SLAM is proposed in this paper. A sparse head map is constructed from ORB feature point extraction and stereo matching; the 3D-2D correspondences between 3D map points and 2D feature points are then obtained by projection matching. Finally, the camera pose solved by Bundle Adjustment is converted to the head pose, which realizes head pose tracking. Experimental results show that the method obtains highly precise head poses: the mean errors of all three Euler angles are less than 1°. The proposed method can therefore track and estimate the head pose precisely in real time against a smooth background.
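The final step summarized above, converting the camera pose recovered by Bundle Adjustment into a head pose and reading off Euler angles, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: it assumes the SLAM map frame is fixed to the head (camera stationary, head moving) and a ZYX (yaw-pitch-roll) Euler convention; the function names are ours.

```python
import numpy as np

def euler_zyx_from_rotation(R):
    """Yaw, pitch, roll (degrees) from a rotation matrix, ZYX convention."""
    pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    yaw = np.arctan2(R[1, 0], R[0, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return np.degrees(np.array([yaw, pitch, roll]))

def head_pose_from_camera_pose(T_cw):
    """Convert a SLAM camera pose into a head pose.

    SLAM solves the camera pose T_cw relative to the map (world) frame.
    Because the map here is built on the head itself, the head pose in the
    camera frame is simply the inverse transform T_wc = inv(T_cw).
    """
    T_wc = np.linalg.inv(T_cw)
    return T_wc[:3, :3], T_wc[:3, 3]  # rotation, translation of the head
```

For the preceding 3D-2D projection-matching step, a comparable sketch would match ORB descriptors against the map points and estimate an initial pose with a PnP solver (e.g. OpenCV's `cv2.solvePnPRansac`) before Bundle Adjustment refinement.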
ISSN:1047-3203
1095-9076
DOI:10.1016/j.jvcir.2021.103402