Coupled Patch Alignment for Matching Cross-View Gaits


Bibliographic Details
Published in: IEEE Transactions on Image Processing, 2019-06, Vol. 28 (6), p. 3142-3157
Main authors: Ben, Xianye, Gong, Chen, Zhang, Peng, Jia, Xitong, Wu, Qiang, Meng, Weixiao
Format: Article
Language: English
Description
Abstract: Gait recognition has attracted growing attention in recent years, as human gait retains strong discriminative ability even at low resolution and at a distance. Unfortunately, the performance of gait recognition can be largely degraded by view changes. To address this problem, we propose a coupled patch alignment (CPA) algorithm that effectively matches a pair of gaits across different views. To realize CPA, we first build a set of patches, each made up of a sample together with its intra-class and inter-class nearest neighbors. Then, we design an objective function for each patch to balance cross-view intra-class compactness against cross-view inter-class separability. Finally, all the locally independent patches are combined to form a unified objective function. Theoretically, we show that the proposed CPA has a close relationship with canonical correlation analysis. Algorithmically, we extend CPA to "multi-dimensional patch alignment," which can handle an arbitrary number of views. Comprehensive experiments on the CASIA(B), USF, and OU-ISIR gait databases demonstrate the effectiveness of our methods over existing popular methods for cross-view gait recognition.
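The abstract outlines CPA in three steps: build a patch around each sample from its cross-view intra-class and inter-class nearest neighbors, score each patch with an objective that trades off intra-class compactness against inter-class separability, and sum the per-patch objectives into a unified criterion. The minimal Python sketch below illustrates how such a patch-wise objective could be assembled; the Euclidean distance, the neighbor counts k_intra/k_inter, the trade-off weight beta, and the linear coupled projections Wg/Wp are illustrative assumptions rather than the paper's exact formulation.

```python
# Illustrative sketch of a CPA-style patch-wise objective (toy version only).
# Assumptions not taken from the paper: Euclidean neighbor search, fixed
# neighbor counts, a simple trade-off weight beta, and linear projections.
import numpy as np

def build_patch(i, Xg, Xp, labels_g, labels_p, k_intra=3, k_inter=3):
    """For gallery sample i (view 1), collect its nearest intra-class and
    inter-class neighbors from the probe view (view 2)."""
    d = np.linalg.norm(Xp - Xg[i], axis=1)           # cross-view distances
    same = np.where(labels_p == labels_g[i])[0]      # same-identity probe samples
    diff = np.where(labels_p != labels_g[i])[0]      # different-identity probe samples
    intra = same[np.argsort(d[same])[:k_intra]]
    inter = diff[np.argsort(d[diff])[:k_inter]]
    return intra, inter

def patch_objective(i, intra, inter, Wg, Wp, Xg, Xp, beta=1.0):
    """Per-patch score: cross-view intra-class compactness minus a weighted
    cross-view inter-class separability term, in the coupled subspace."""
    zi = Xg[i] @ Wg                                  # project the gallery sample
    compact = sum(np.sum((zi - Xp[j] @ Wp) ** 2) for j in intra)
    separate = sum(np.sum((zi - Xp[j] @ Wp) ** 2) for j in inter)
    return compact - beta * separate

def unified_objective(Wg, Wp, Xg, Xp, labels_g, labels_p, beta=1.0):
    """Combine the independent per-patch objectives into one criterion,
    to be minimized over the coupled projections Wg and Wp."""
    total = 0.0
    for i in range(len(Xg)):
        intra, inter = build_patch(i, Xg, Xp, labels_g, labels_p)
        total += patch_objective(i, intra, inter, Wg, Wp, Xg, Xp, beta)
    return total

# Toy usage: random features stand in for gait descriptors from two views.
rng = np.random.default_rng(0)
Xg, Xp = rng.normal(size=(40, 64)), rng.normal(size=(40, 64))
labels_g = labels_p = np.repeat(np.arange(10), 4)    # 10 subjects, 4 samples each
Wg = Wp = np.eye(64, 16)                             # placeholder projections
print(unified_objective(Wg, Wp, Xg, Xp, labels_g, labels_p))
```

In the method itself, such a criterion would be optimized over the coupled projections; here the projections are held fixed, so the example only evaluates the combined objective for given views.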
ISSN: 1057-7149; 1941-0042
DOI: 10.1109/TIP.2019.2894362