Cooperative object tracking using dual-pan–tilt–zoom cameras based on planar ground assumption
Saved in:
Published in: | IET computer vision 2015-02, Vol.9 (1), p.149-161 |
Main authors: | , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Summary: | Pan–tilt–zoom (PTZ) cameras play an important role in visual surveillance systems, and the dual-PTZ camera system is the simplest and most typical configuration. Its advantage is that it can simultaneously obtain both wide-view information and high-resolution local-view information about the tracked object. One method to achieve this is a master–slave configuration: one camera (master) tracks moving objects at low resolution and provides positional information to the other camera (slave), which can then point towards the object at high resolution and track it dynamically. In this paper, we propose a novel framework exploiting a planar ground assumption to achieve cooperative tracking. The approach differs from conventional methods in that we exploit a planar geometric constraint to solve the camera collaboration problem. Compared with existing approaches, the proposed framework can be used in the case of a wide baseline and allows depth changes of the tracked object. The proposed method can also adapt to dynamic changes of the surveillance scene. In addition, we describe a self-calibration method for the homography matrix induced by the ground plane between the two cameras. We demonstrate the effectiveness of the proposed method by testing it with a tracking system for surveillance applications. |
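The camera collaboration step described in the summary relies on a ground-plane homography between the two views: the foot point of a tracked object in the master camera's image maps through that homography to the corresponding point in the slave camera's image. A minimal sketch of such a point transfer, with a purely illustrative matrix (the values are assumptions, not taken from the paper):

```python
import numpy as np

# Illustrative ground-plane homography H mapping master-view pixel
# coordinates to slave-view pixel coordinates (values are hypothetical;
# in the paper H would come from the described self-calibration).
H = np.array([[1.2,  0.05,  30.0],
              [0.01, 1.1,  -15.0],
              [1e-4, 2e-4,   1.0]])

def transfer_point(H, x, y):
    """Map pixel (x, y) through homography H using homogeneous coordinates."""
    p = H @ np.array([x, y, 1.0])
    return p[0] / p[2], p[1] / p[2]

# Ground-contact (foot) point of the tracked object in the master view
u, v = transfer_point(H, 320.0, 240.0)
```

The slave camera's pan and tilt would then be commanded so that its optical axis points at the transferred pixel (u, v), keeping the object centered at high resolution.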
ISSN: | 1751-9632 1751-9640 |
DOI: | 10.1049/iet-cvi.2013.0246 |