Extending Appearance Based Gait Recognition with Depth Data
Published in: Applied Sciences 2019-12, Vol. 9 (24), p. 5529
Main authors: , , ,
Format: Article
Language: eng
Subjects:
Online access: Full text
Summary: Each individual exhibits unique patterns during their gait cycle. This information can be extracted from a live video stream and used for subject identification. In appearance-based recognition methods, this is done by tracking silhouettes of persons across gait cycles. In recent years, there has been a profusion of sensors that, in addition to RGB video images, also provide depth data in real time. When such sensors are used for gait recognition, existing RGB appearance-based methods can be extended to obtain a substantial gain in recognition accuracy. In this paper, this is accomplished using information fusion techniques that combine features from extracted silhouettes, used in traditional appearance-based methods, with the height feature that can now be estimated from depth data. The latter is estimated during the silhouette extraction step at minimal additional computational cost. Two approaches are proposed that can be easily implemented as extensions to existing appearance-based methods. An extensive experimental evaluation was performed to provide insight into how much the recognition accuracy can be improved. The results are presented and discussed for different types of subjects and for populations with different height distributions.

A minimal code sketch of the silhouette-plus-height fusion idea follows the record below.
ISSN: 2076-3417
DOI: 10.3390/app9245529
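
The fusion idea summarized above can be illustrated with a short, hedged sketch. The Python code below is not the authors' implementation; it is a minimal approximation under stated assumptions: the subject's height is taken as the vertical metric extent of the depth pixels inside the silhouette mask, back-projected with the depth camera's intrinsics, and feature-level fusion is shown as appending the scaled height to an appearance descriptor such as a flattened Gait Energy Image. All function and parameter names (`estimate_height_from_depth`, `fuse_silhouette_and_height`, `fy`, `cy`, `height_weight`) are illustrative and do not come from the paper.

```python
import numpy as np


def estimate_height_from_depth(depth_map, silhouette_mask, fy, cy):
    """Estimate a subject's height (in metres) from one depth frame.

    Back-projects each silhouette pixel's row coordinate to metric Y using
    the pinhole model (Y = (v - cy) * Z / fy) and takes the head-to-foot
    extent. Assumes an upright subject and a roughly level camera; fy and
    cy are the depth camera's vertical focal length and principal-point
    row, taken from the sensor's calibration.
    """
    rows, cols = np.nonzero(silhouette_mask)
    if rows.size == 0:
        return None
    depths = depth_map[rows, cols].astype(np.float64)
    valid = depths > 0.0                # depth sensors report 0 for dropouts
    if not np.any(valid):
        return None
    y_metric = (rows[valid] - cy) * depths[valid] / fy
    return float(y_metric.max() - y_metric.min())


def fuse_silhouette_and_height(silhouette_features, height_m, height_weight=0.1):
    """Feature-level fusion: append a scaled height value to an appearance
    feature vector (e.g. a flattened Gait Energy Image) before matching.
    The weight would be tuned on a validation set.
    """
    return np.concatenate([np.ravel(silhouette_features),
                           [height_weight * height_m]])


if __name__ == "__main__":
    # Synthetic example: a 240x320 depth frame with a fake upright subject.
    depth = np.zeros((240, 320), dtype=np.float32)
    mask = np.zeros((240, 320), dtype=bool)
    mask[40:220, 150:170] = True        # crude "person" region
    depth[mask] = 3.0                   # subject standing about 3 m away
    h = estimate_height_from_depth(depth, mask, fy=365.0, cy=120.0)
    gei = np.random.rand(64 * 44)       # stand-in for a real GEI descriptor
    fused = fuse_silhouette_and_height(gei, h)
    print(f"estimated height: {h:.2f} m, fused feature length: {len(fused)}")
```

Score-level fusion, in which silhouette and height similarities are computed separately and combined with a weight, would be an equally natural variant; the two approaches actually proposed in the paper are not reproduced here.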