Robust particle tracking via spatio-temporal context learning and multi-task joint local sparse representation

Bibliographic details
Published in: Multimedia Tools and Applications 2019-08, Vol. 78 (15), p. 21187-21204
Authors: Xue, Xizhe; Li, Ying
Format: Article
Language: English
Online access: Full text
Abstract: Particle filters have proven very successful for non-linear and non-Gaussian estimation problems and are extensively used in object tracking. However, high computational costs and the particle degeneracy problem limit their practical application. In this paper, we present a robust particle tracking approach based on spatio-temporal context learning and multi-task joint local sparse representation. The proposed tracker samples particles according to a confidence map constructed from the spatio-temporal context information of the target. This sampling strategy can ameliorate the problems of sample impoverishment and particle degeneracy and better approximate the target state distribution, yielding robust tracking performance. To locate the target more accurately and be less sensitive to occlusion, a local sparse appearance model is adopted to capture the local and structural information of the target. Finally, multi-task learning, in which the representations of particles are learned jointly, is employed to further improve tracking performance and reduce overall computational complexity. Both qualitative and quantitative evaluations on challenging benchmark image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.
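The confidence-map-guided sampling described in the abstract can be illustrated with a minimal sketch. This is an assumption-level illustration, not the paper's exact algorithm: the confidence map here is a synthetic Gaussian bump around an assumed previous target center, whereas the paper builds it via spatio-temporal context learning; the function name `sample_particles` and all parameter values are hypothetical.

```python
import numpy as np

def sample_particles(conf_map, n_particles, rng):
    """Draw particle positions with probability proportional to confidence."""
    p = conf_map.ravel()
    p = p / p.sum()                       # normalize map to a probability distribution
    idx = rng.choice(p.size, size=n_particles, p=p)
    rows, cols = np.unravel_index(idx, conf_map.shape)
    return np.stack([rows, cols], axis=1)  # (n_particles, 2) array of (row, col)

rng = np.random.default_rng(0)
h, w = 64, 64
yy, xx = np.mgrid[0:h, 0:w]
cy, cx = 40, 20                            # assumed previous target center
# Synthetic confidence map: high near the previous target location.
conf = np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 5.0 ** 2))

particles = sample_particles(conf, 200, rng)
print(particles.mean(axis=0))              # concentrates near (cy, cx)
```

Because particles are drawn where confidence is high rather than blindly propagated, fewer particles are wasted in low-likelihood regions, which is how such a strategy counters sample impoverishment.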
ISSN: 1380-7501, 1573-7721
DOI: 10.1007/s11042-019-7246-8