Gait-based Person Re-identification: A Survey


Bibliographic Details
Published in: ACM Computing Surveys, May 2019, Vol. 52(2), pp. 1–34, Article 33
Authors: Nambiar, Athira; Bernardino, Alexandre; Nascimento, Jacinto C.
Format: Article
Language: English
Subjects:
Online access: Full text
Description
Abstract: The way people walk is a strong correlate of their identity. Several studies have shown that both humans and machines can recognize individuals just by their gait, given that proper measurements of the observed motion patterns are available. For surveillance applications, gait is also attractive, because it does not require active collaboration from users and is hard to fake. However, the acquisition of good-quality measures of a person’s motion patterns in unconstrained environments (e.g., in person re-identification applications) has proved very challenging in practice. Existing technology (video cameras) suffers from changes in viewpoint, daylight, clothing, accessories, and other variations in the person’s appearance. Novel three-dimensional sensors are bringing new promise to the field, but many research issues remain open. This article presents a survey of the work done in gait analysis for re-identification in the past decade, looking at the main approaches, datasets, and evaluation methodologies. We identify several relevant dimensions of the problem and provide a taxonomic analysis of the current state of the art. Finally, we discuss the levels of performance achievable with the current technology and give a perspective of the most challenging and promising directions of research for the future.
ISSN: 0360-0300 (print), 1557-7341 (electronic)
DOI: 10.1145/3243043