Occlusion-Robust Human Tracking with Integrated Multi-View Depth Imagery

Bibliographic Details
Published in: IEICE Transactions on Information and Systems, 2015-01, Vol. E97.D (12), pp. 3181-3191
Main authors: Fukushi, Kenichiro; Kumazawa, Itsuo
Format: Article
Language: Japanese
Description
Abstract: In this paper, we present a computer-vision-based human tracking system that uses multiple stereo cameras. Many widely used methods, such as the KLT tracker, update the tracker "frame-to-frame," so that features extracted from one frame are used to update its current state. In contrast, we propose a novel optimization technique for the "multi-frame" approach, which computes the resulting trajectories directly from video sequences, in order to achieve a high level of robustness against severe occlusion, a well-known challenging problem in computer vision. Instead of using dynamic programming (DP) or an iterative approach, we developed a heuristic optimization technique to estimate human trajectories, which makes our method computationally efficient enough to operate in real time. Six video sequences in which one to six people walk through a narrow laboratory space were processed with our system. The results confirm that our system can track cluttered scenes in which severe occlusion occurs and people are frequently in close proximity to each other. Moreover, only minimal information, rather than full camera images, needs to be communicated over the network, so commonly used network devices are sufficient for constructing our tracking system.
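The contrast the abstract draws between "frame-to-frame" updating and "multi-frame" trajectory estimation can be illustrated with a toy 1-D example. The following is a minimal sketch in Python/NumPy under stated assumptions: the polynomial least-squares model, the function names, and the synthetic occlusion gap are illustrative stand-ins, not the paper's heuristic optimization technique.

```python
import numpy as np

# Illustrative-only sketch (not the paper's algorithm): contrast a
# frame-to-frame tracker, which updates its state one frame at a time,
# with a multi-frame fit that estimates the whole trajectory from the
# full window of observations, bridging occlusion gaps directly.

def frame_to_frame(obs):
    """Per-frame update: None marks an occluded frame; the estimate
    simply freezes there, so long occlusions derail the track."""
    est, last = [], None
    for z in obs:
        if z is not None:
            last = z          # naive per-frame state update
        est.append(last)
    return est

def multi_frame(obs, degree=2):
    """Fit one smooth trajectory (here a toy polynomial least-squares
    model) to all visible observations in the window at once, then
    evaluate it at every frame, including occluded ones."""
    t = np.array([i for i, z in enumerate(obs) if z is not None])
    z = np.array([z for z in obs if z is not None])
    coeffs = np.polyfit(t, z, degree)     # whole-window estimate
    return np.polyval(coeffs, np.arange(len(obs)))

if __name__ == "__main__":
    # Synthetic 1-D positions with an occlusion gap at frames 3..6.
    obs = [0.0, 1.1, 2.0, None, None, None, None, 7.1, 8.0, 8.9]
    print("frame-to-frame:", frame_to_frame(obs))
    print("multi-frame:   ", np.round(multi_frame(obs, degree=1), 2))
```

During the occlusion gap the frame-to-frame estimate stays frozen at the last visible position, while the whole-window fit interpolates a plausible trajectory through it; this is the kind of robustness the multi-frame formulation targets, albeit achieved here by a trivial model rather than the paper's real-time heuristic optimization.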
ISSN: 0916-8532 (print); 1745-1361 (online)
DOI: 10.1587/transinf.2014EDP7081