Self-Calibrating View-Invariant Gait Biometrics

Bibliographic Details
Published in: IEEE Transactions on Cybernetics, 2010-08, Vol. 40 (4), pp. 997-1008
Authors: Goffredo, Michela; Bouchrika, Imed; Carter, John N; Nixon, Mark S
Format: Article
Language: English
Abstract: We present a new method for viewpoint-independent gait biometrics. The system relies on a single camera, does not require camera calibration, and works with a wide range of camera views. This is achieved by a formulation in which the gait is self-calibrating. These properties make the proposed method particularly suitable for identification by gait, where the advantages of complete unobtrusiveness, remoteness, and covertness of the biometric system preclude the availability of camera information and of specific walking directions. The approach has been assessed for feature extraction and recognition capability on the SOTON gait database and then evaluated on a multiview database to establish recognition capability with respect to view invariance. Moreover, tests have been performed on the multiview CASIA-B database, which comprises more than 2270 video sequences of 65 different subjects walking freely along different directions. The results show that human identification by gait can be achieved without any knowledge of internal or external camera parameters, with a mean correct classification rate of 73.6% across all views using purely dynamic gait features. This performance is particularly encouraging for application in surveillance scenarios.
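
As a rough illustration of the metric quoted in the abstract, the Python sketch below shows one way a mean correct classification rate (CCR) across camera views could be computed. It is not taken from the paper; the function name, label arrays, and view angles are hypothetical placeholders.

import numpy as np

def mean_ccr_across_views(true_labels, predicted_labels, view_angles):
    # Compute the correct classification rate separately for each camera view,
    # then average the per-view rates (unweighted), as one simple way of
    # reporting a "mean CCR across all views". All inputs are assumptions,
    # not data from the paper.
    true_labels = np.asarray(true_labels)
    predicted_labels = np.asarray(predicted_labels)
    view_angles = np.asarray(view_angles)
    per_view = {}
    for view in np.unique(view_angles):
        mask = view_angles == view
        per_view[int(view)] = float(np.mean(true_labels[mask] == predicted_labels[mask]))
    return per_view, float(np.mean(list(per_view.values())))

# Hypothetical usage: nine sequences spread over three view angles.
true = [1, 2, 3, 1, 2, 3, 1, 2, 3]
pred = [1, 2, 1, 1, 2, 3, 3, 2, 3]
views = [0, 0, 0, 45, 45, 45, 90, 90, 90]
per_view_ccr, mean_ccr = mean_ccr_across_views(true, pred, views)
print(per_view_ccr, mean_ccr)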
ISSN: 1083-4419, 2168-2267, 1941-0492, 2168-2275
DOI: 10.1109/TSMCB.2009.2031091