Gait Recognition With Drones: A Benchmark
Published in: IEEE Transactions on Multimedia, 2024, Vol. 26, pp. 3530–3540
Format: Article
Language: English
Online access: Order full text
Abstract: Gait recognition aims to identify people through body shape and walking posture. Existing gait recognition studies focus on low-vertical-view recognition, in which the person and the camera are at roughly the same height. In contrast, this work focuses on gait recognition at high vertical views. To facilitate this research, we propose a new dataset named DroneGait, in which drones are used to collect the gait data. The dataset contains 22,000 sequences of 96 subjects captured at vertical views ranging from about 0° to 80°. Furthermore, we evaluate several state-of-the-art appearance-based and skeleton-based models on our dataset and establish comprehensive baselines. Our results demonstrate that the dataset is challenging and presents significant opportunities to improve existing gait recognition methods. Moreover, we propose a new method called Vertical Distillation, based on feature distillation across different vertical views; it substantially outperforms state-of-the-art models on DroneGait at high vertical views. Cross-vertical-view and cross-domain experiments are also conducted to demonstrate the importance of gait recognition at high vertical views. Finally, we analyze the differences between gait recognition at different vertical views using heatmap visualization. We will make our dataset and code publicly available upon acceptance.
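The abstract only names the Vertical Distillation method; the paper's actual loss, architecture, and training procedure are not given in this record. As a rough illustration of what feature distillation across vertical views could look like, here is a minimal PyTorch sketch. Everything in it is an assumption for illustration: the function name `distill_loss`, the frozen low-view teacher, and the choice of an MSE loss on L2-normalized embeddings are not taken from the paper.

```python
# Hypothetical sketch of cross-view feature distillation, loosely inspired by
# the abstract's description of "Vertical Distillation". Not the paper's
# actual implementation; all names and loss choices below are assumptions.
import torch
import torch.nn.functional as F

def distill_loss(low_view_feats: torch.Tensor,
                 high_view_feats: torch.Tensor) -> torch.Tensor:
    """Pull embeddings of high-vertical-view sequences toward the (frozen)
    embeddings of the same subjects captured at a low vertical view."""
    # Detach the low-view features so they act as a fixed teacher signal,
    # and L2-normalize both sides so the loss compares feature directions.
    t = F.normalize(low_view_feats.detach(), dim=-1)  # teacher: ~0° view
    s = F.normalize(high_view_feats, dim=-1)          # student: high view
    return F.mse_loss(s, t)

# Toy usage: a batch of 8 subjects with 256-dim gait embeddings per view.
teacher = torch.randn(8, 256)                         # low-view embeddings
student = torch.randn(8, 256, requires_grad=True)     # high-view embeddings
loss = distill_loss(teacher, student)
loss.backward()
print(loss.item())
```

In this sketch, gradients flow only into the high-view branch, so training would nudge high-view features toward the better-studied low-view feature space; whether the paper uses a frozen teacher, a joint objective, or a different distance is not stated in the record.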
ISSN: 1520-9210 (print), 1941-0077 (electronic)
DOI: 10.1109/TMM.2023.3312931