Multi-Branch Feature Alignment Network for Misaligned and Occluded Person Re-Identification

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, pp. 175445-175457
Main Authors: Lyu, Chunyan, Huang, Hai, Zhang, Lixi, Zhu, Wenting, Wang, Zhengyang, Wang, Kejun, Jiao, Caidong
Format: Article
Language: English
Online Access: Full text
Description
Abstract: As a pivotal computer vision technique, person re-identification (re-ID) plays an important role in public security. When computing feature similarities between person images, misaligned and occluded body parts can impede accurate identity retrieval. To mitigate these challenges, we introduce a Multi-Branch Feature Alignment Network (MBFA) comprising three distinct deep neural network branches. First, a global feature branch extracts comprehensive, image-level features. Second, a pose alignment branch acquires part-level features through a feature-weighted fusion strategy. Third, a semantic alignment branch derives high-order semantic features at the pixel level, enabling precise localization of the visible parts of occluded pedestrians and restricting similarity computation to these regions. The multi-scale features from the three branches complement one another, and the resulting feature alignment improves the robustness and discriminative power of the whole network. Consequently, MBFA effectively mitigates the interference caused by misalignment and occlusion. Experimental results on three prominent re-ID datasets and an occluded re-ID dataset show that the proposed method outperforms existing state-of-the-art methods.
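The abstract describes MBFA only at a high level, so the following PyTorch sketch is purely illustrative of the three-branch idea (a global branch, a part-level branch with weighted fusion, and a pixel-level attention branch whose outputs are concatenated into one descriptor). All module names, dimensions, the toy backbone, and the fusion scheme are assumptions for this sketch, not the authors' architecture.

```python
# Hypothetical three-branch feature network, loosely inspired by the
# abstract's description of MBFA. Names, dimensions, and fusion details
# are illustrative assumptions, NOT the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiBranchSketch(nn.Module):
    def __init__(self, feat_dim=256, num_parts=6):
        super().__init__()
        # Tiny stand-in backbone (a real system would use e.g. ResNet-50).
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, feat_dim, 3, stride=2, padding=1), nn.ReLU(),
        )
        # Branch 1: global feature via global average pooling.
        self.global_pool = nn.AdaptiveAvgPool2d(1)
        # Branch 2: part features from horizontal stripes, fused with
        # learned weights (a stand-in for pose-guided weighted fusion).
        self.num_parts = num_parts
        self.part_weights = nn.Parameter(torch.ones(num_parts))
        # Branch 3: pixel-level attention map (a stand-in for the
        # semantic branch's localization of visible body regions).
        self.attn = nn.Conv2d(feat_dim, 1, kernel_size=1)

    def forward(self, x):
        fmap = self.backbone(x)                       # (B, C, H, W)
        # Global branch.
        g = self.global_pool(fmap).flatten(1)         # (B, C)
        # Part branch: pool each horizontal stripe, then a softmax-weighted
        # sum so more informative stripes contribute more.
        parts = F.adaptive_avg_pool2d(fmap, (self.num_parts, 1))
        parts = parts.squeeze(-1)                     # (B, C, P)
        w = torch.softmax(self.part_weights, dim=0)   # (P,)
        p = (parts * w).sum(dim=-1)                   # (B, C)
        # Semantic branch: attention-weighted pooling over pixels, so
        # occluded regions (low attention) contribute little.
        a = torch.sigmoid(self.attn(fmap))            # (B, 1, H, W)
        s = (fmap * a).sum(dim=(2, 3)) / a.sum(dim=(2, 3)).clamp(min=1e-6)
        # Concatenate the branch features into one descriptor for
        # similarity computation.
        return torch.cat([g, p, s], dim=1)            # (B, 3C)

if __name__ == "__main__":
    net = MultiBranchSketch()
    imgs = torch.randn(2, 3, 256, 128)  # a typical re-ID input size
    print(net(imgs).shape)              # torch.Size([2, 768])
```

A faithful implementation would drive the part weighting and the attention map from pose estimation and human-parsing cues, as the abstract indicates; this sketch only shows how the three branch outputs can be combined into a single feature vector whose distances are then used for retrieval.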
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3492312