Robust head and hands tracking with occlusion handling for human machine interaction



Bibliographic Details
Main Authors: Bor-Jeng Chen, Cheng-Ming Huang, Ting-En Tseng, Li-Chen Fu
Format: Conference proceedings
Language: English
Description
Summary: This paper presents a head and hands tracking method with a monocular camera for human machine interaction (HMI). The targets are tracked independently when they are far from each other, but they are merged with dependent likelihood measurements in a higher-dimensional space when they are likely to interfere with each other. When tracking one target in the independent situation, the other targets are masked to reduce the disturbance their skin color causes on the tracked one. Multiple cues, including the combination of the locally discriminative color-weighted image and the back-projection image of the reference color model, the motion history image, and the gradient orientation feature, are employed to verify the hypotheses generated by the particle filter. On the other hand, when the head and hands are approaching or even overlapping, the multiple importance sampling (MIS) particle filter generates tracking hypotheses for the merged targets from the skin-blob mask and the depth-order estimation. These merged hypotheses are then evaluated with the visual cues of the occluded face template, hand shape orientation, and motion continuity. The experimental results demonstrate real-time efficiency and robustness in comparison with the recently released OpenNI tracker for the Kinect sensor.
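The method summarized above builds on particle filtering with image-based likelihoods. As a rough illustration only (not the authors' implementation), the sketch below shows a minimal bootstrap particle filter for a 2D target position in NumPy; the Gaussian `lik` function is a synthetic stand-in for the paper's color back-projection and motion cues, and all names and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter_track(likelihood, n_particles=500, n_steps=20,
                          init=(0.0, 0.0), motion_std=2.0):
    """Bootstrap particle filter for a 2D target position.

    likelihood(step, positions) maps an (N, 2) array of position
    hypotheses to nonnegative weights.
    """
    particles = np.full((n_particles, 2), init, dtype=float)
    estimates = []
    for t in range(n_steps):
        # Predict: propagate hypotheses with a random-walk motion model.
        particles += rng.normal(0.0, motion_std, particles.shape)
        # Update: weight each hypothesis by the observation likelihood.
        w = np.clip(likelihood(t, particles), 1e-12, None)
        w /= w.sum()
        # Estimate: weighted mean of the particle cloud.
        estimates.append(w @ particles)
        # Resample (systematic) to concentrate particles on likely regions.
        u = (rng.random() + np.arange(n_particles)) / n_particles
        idx = np.minimum(np.searchsorted(np.cumsum(w), u), n_particles - 1)
        particles = particles[idx]
    return np.array(estimates)

# Synthetic target moving along x at 2 units/step; the likelihood mimics
# a skin-color back-projection response peaked at the true position.
def lik(t, pos):
    true = np.array([2.0 * t, 0.0])
    return np.exp(-np.sum((pos - true) ** 2, axis=1) / (2 * 3.0 ** 2))

est = particle_filter_track(lik)
print(np.round(est[-1], 1))  # estimate near the final true position (38, 0)
```

In the paper's merged-target regime, the MIS variant would instead draw hypotheses from several proposal distributions (e.g., per-target and joint ones) and combine their weights, rather than the single proposal used here.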
ISSN: 2153-0858, 2153-0866
DOI: 10.1109/IROS.2012.6386113