Gait Attribute Recognition: A New Benchmark for Learning Richer Attributes From Human Gait Patterns

Bibliographic Details
Published in: IEEE Transactions on Information Forensics and Security, 2024, Vol. 19, pp. 1-14
Main authors: Song, Xu; Hou, Saihui; Huang, Yan; Cao, Chunshui; Liu, Xu; Huang, Yongzhen; Shan, Caifeng
Format: Article
Language: English
Description
Abstract: Compared to gait recognition, Gait Attribute Recognition (GAR) is a seldom-investigated problem. However, because GAR provides richer and finer semantic descriptions, it is an indispensable part of building intelligent gait analysis systems. Moreover, the types of attributes considered in existing datasets are very limited. This paper contributes a new benchmark dataset for gait attribute recognition named Multi-Attribute Gait (MA-Gait). MA-Gait contains 95 subjects recorded from 12 camera views, yielding more than 13,000 sequences labeled with 16 attributes, including six attributes that have never been considered in the literature. In addition, we propose a Multi-Scale Motion Encoder (MSME) to extract robust motion features, and an Attribute-Guided Feature Selection Module (AGFSM) that adaptively selects the most discriminative features for each attribute from static appearance features and dynamic motion features. Our method achieves the best GAR accuracy on the new dataset, and comprehensive experiments demonstrate its effectiveness through both quantitative and qualitative evaluations.
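The abstract does not describe the AGFSM's internals. A minimal sketch of one plausible reading, in which each attribute learns a gate that mixes the static appearance stream with the dynamic motion stream, is given below; the class name, feature dimensions, and gating design are all assumptions for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class AttributeGuidedFeatureSelection(nn.Module):
    """Hypothetical sketch of attribute-guided feature selection:
    per-attribute gates mix a static appearance feature with a
    dynamic motion feature, then per-attribute heads classify.
    The actual AGFSM design is not specified in the abstract."""

    def __init__(self, feat_dim: int, num_attributes: int):
        super().__init__()
        # One learned gate per attribute, conditioned on both streams.
        self.gates = nn.ModuleList(
            nn.Sequential(nn.Linear(2 * feat_dim, feat_dim), nn.Sigmoid())
            for _ in range(num_attributes)
        )
        # One head per attribute (binary logits here for brevity).
        self.heads = nn.ModuleList(
            nn.Linear(feat_dim, 1) for _ in range(num_attributes)
        )

    def forward(self, appearance: torch.Tensor, motion: torch.Tensor):
        # appearance, motion: (batch, feat_dim)
        joint = torch.cat([appearance, motion], dim=-1)
        logits = []
        for gate, head in zip(self.gates, self.heads):
            g = gate(joint)                       # (batch, feat_dim), in [0, 1]
            fused = g * appearance + (1 - g) * motion
            logits.append(head(fused))
        return torch.cat(logits, dim=-1)          # (batch, num_attributes)

# Example usage: 16 attributes as in MA-Gait; the 256-dim features are assumed.
model = AttributeGuidedFeatureSelection(feat_dim=256, num_attributes=16)
out = model(torch.randn(8, 256), torch.randn(8, 256))  # shape: (8, 16)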
ISSN: 1556-6013, 1556-6021
DOI: 10.1109/TIFS.2023.3318934