Hybrid partial-constrained learning with orthogonality regularization for unsupervised person re-identification
Published in: Engineering Applications of Artificial Intelligence, 2023-08, Vol. 123, p. 106200, Article 106200
Format: Article
Language: English
Abstract: Person re-identification (re-ID) aims to determine whether a specific person appears in image sets or videos using computer vision techniques. State-of-the-art unsupervised re-ID methods extract image features with CNN-based networks and store the extracted features in a memory for identity matching. However, the global features extracted by these methods suffer from information redundancy and ignore the constraints among their internal parts. To overcome these problems, a Hybrid Partial-constrained Learning (HPcL) network with orthogonality regularization is proposed to learn a discriminative visual representation by generating hybrid features. Specifically, the hybrid features are generated by our Dynamic Fusion Module (DFM) to initialize the memory dictionary and match identities; this constrains each part of the features extracted by our Multi-Scale (M-S) module and yields robust visual representations. In addition, a new orthogonal regularization method is introduced to enforce orthogonality of the kernel weights and features, which reduces the correlations among features. Extensive experiments on the Market-1501, DukeMTMC-reID, PersonX, and MSMT17 datasets demonstrate that our method is effective and superior to state-of-the-art methods.
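To make the memory-based identity matching described above concrete, the sketch below shows the cluster-memory pattern that unsupervised re-ID pipelines of this kind commonly build on: memory entries are initialized from features (here, standing in for the hybrid features produced by the DFM), queries are matched against all entries with a contrastive loss over pseudo-labels, and matched entries are refreshed with a momentum update. The class name, momentum, and temperature values are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a memory dictionary for unsupervised re-ID.
# Hypothetical names/hyperparameters; not the paper's exact DFM design.
import torch
import torch.nn.functional as F

class FeatureMemory:
    def __init__(self, num_clusters: int, feat_dim: int,
                 momentum: float = 0.2, temperature: float = 0.05):
        # One L2-normalized entry per pseudo-identity cluster; in practice
        # these would be initialized from the generated hybrid features.
        self.bank = F.normalize(torch.randn(num_clusters, feat_dim), dim=1)
        self.momentum = momentum
        self.temperature = temperature

    def loss(self, feats: torch.Tensor, pseudo_labels: torch.Tensor) -> torch.Tensor:
        # Contrastive identity matching: similarity of each query feature
        # to every memory entry, supervised by cluster pseudo-labels.
        feats = F.normalize(feats, dim=1)
        logits = feats @ self.bank.t() / self.temperature
        return F.cross_entropy(logits, pseudo_labels)

    @torch.no_grad()
    def update(self, feats: torch.Tensor, pseudo_labels: torch.Tensor) -> None:
        # Momentum refresh of the matched entries with the new features.
        feats = F.normalize(feats, dim=1)
        for f, y in zip(feats, pseudo_labels):
            self.bank[y] = F.normalize(
                self.momentum * self.bank[y] + (1 - self.momentum) * f, dim=0)
```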
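The orthogonal regularization mentioned in the abstract is often implemented as a soft penalty that pushes the Gram matrix of the flattened kernel weights toward the identity. The paper's exact formulation is not given here, so the version below is a common stand-in (soft orthogonality, ||WW^T - I||_F^2) rather than the authors' method; the loss weight in the usage comment is likewise an assumption.

```python
# Soft orthogonality penalty on a conv kernel; a generic sketch,
# not the paper's specific regularizer.
import torch

def orthogonality_penalty(weight: torch.Tensor) -> torch.Tensor:
    # Flatten (out_ch, in_ch, kH, kW) so each output channel is a row,
    # then penalize deviation of the Gram matrix from the identity,
    # which decorrelates the learned filters.
    w = weight.reshape(weight.shape[0], -1)
    gram = w @ w.t()
    eye = torch.eye(w.shape[0], device=w.device)
    return ((gram - eye) ** 2).sum()

# Typical usage: add the penalty over all conv layers to the task loss,
# with a small (assumed) weight, e.g.:
#   loss = task_loss + 1e-4 * sum(
#       orthogonality_penalty(m.weight)
#       for m in model.modules() if isinstance(m, torch.nn.Conv2d))
```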
ISSN: 0952-1976, 1873-6769
DOI: 10.1016/j.engappai.2023.106200