Context-Aware Three-Dimensional Mean-Shift With Occlusion Handling for Robust Object Tracking in RGB-D Videos

Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2019-03, Vol. 21 (3), pp. 664-677
Authors: Liu, Ye; Jing, Xiao-Yuan; Nie, Jianhui; Gao, Hao; Liu, Jun; Jiang, Guo-Ping
Format: Article
Language: English
Abstract
Depth cameras have recently become popular, and many vision problems can be better solved with depth information. However, how to integrate depth information into a visual tracker to overcome challenges such as occlusion and background distraction remains underinvestigated in the visual tracking literature. In this paper, we investigate a 3-D extension of the classical mean-shift tracker, whose greedy gradient ascent strategy is generally considered unreliable in conventional 2-D tracking. Through careful study of the physical properties of 3-D point clouds, we reveal that objects which may appear adjacent in a 2-D image form distinctive modes in the 3-D probability distribution approximated by kernel density estimation, so that seeking the nearest mode with 3-D mean-shift remains reliable for tracking. Based on this understanding of 3-D mean-shift, we propose two important mechanisms to further boost the tracker's robustness: one makes the tracker aware of potential distractors and adjusts the appearance model accordingly; the other lets the tracker detect and recover from tracking failures caused by total occlusion. The proposed method is both effective and computationally efficient: on a conventional personal computer, it runs at more than 60 FPS without graphics processing unit acceleration.
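
To make the mode-seeking idea concrete, the following is a minimal sketch of generic mean-shift mode seeking on a 3-D point cloud with a Gaussian kernel; it is not taken from the paper, and the function name, bandwidth, and convergence threshold are illustrative assumptions. The toy example only demonstrates the abstract's observation that two clusters overlapping in image space but separated in depth form distinct modes, and that climbing to the nearest mode stays on the correct object.

import numpy as np

def mean_shift_3d(points, start, bandwidth=0.05, max_iter=50, tol=1e-5):
    # Climb the kernel density estimate of `points` (N x 3) from `start` (3,)
    # to the nearest local mode and return that mode.
    y = np.asarray(start, dtype=float)
    for _ in range(max_iter):
        diff = points - y                                    # (N, 3) offsets to current estimate
        w = np.exp(-0.5 * np.sum(diff**2, axis=1) / bandwidth**2)  # Gaussian kernel weights
        if w.sum() < 1e-12:                                  # no density support nearby
            break
        y_new = (w[:, None] * points).sum(axis=0) / w.sum()  # kernel-weighted mean (mean-shift step)
        if np.linalg.norm(y_new - y) < tol:                  # converged to a local mode
            return y_new
        y = y_new
    return y

# Toy check: two clusters at the same image location but different depths form
# two separate modes; starting near the front cluster converges to its mode.
rng = np.random.default_rng(0)
front = rng.normal([0.0, 0.0, 1.0], 0.02, (200, 3))
back = rng.normal([0.0, 0.0, 1.5], 0.02, (200, 3))
cloud = np.vstack([front, back])
print(mean_shift_3d(cloud, start=[0.01, 0.0, 1.02]))         # converges near [0, 0, 1.0]
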
ISSN: 1520-9210
eISSN: 1941-0077
DOI: 10.1109/TMM.2018.2863604