Dynamic Feature Pruning and Consolidation for Occluded Person Re-Identification
Format: Article
Language: English
Abstract: Occluded person re-identification (ReID) is a challenging problem due to contamination from occluders. Existing approaches address the issue with prior knowledge cues, such as human body key points and semantic segmentations, which easily fail in the presence of heavy occlusion and other humans as occluders. In this paper, we propose a feature pruning and consolidation (FPC) framework to circumvent explicit human structure parsing. The framework mainly consists of a sparse encoder, a multi-view feature matching module, and a feature consolidation decoder. Specifically, the sparse encoder drops less important image tokens, mostly related to background noise and occluders, solely based on correlation within the class token attention. Subsequently, the matching stage relies on the preserved tokens produced by the sparse encoder to identify k-nearest neighbors in the gallery by measuring the image- and patch-level combined similarity. Finally, we use the feature consolidation module to compensate for pruned features using the identified neighbors, recovering essential information while disregarding disturbance from noise and occlusion. Experimental results demonstrate the effectiveness of our proposed framework on occluded, partial, and holistic ReID datasets. In particular, our method outperforms state-of-the-art results by at least 8.6% mAP and 6.0% Rank-1 accuracy on the challenging Occluded-Duke dataset.
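
The sparse encoder's token pruning can be pictured with a short PyTorch-style sketch: patch tokens are ranked by how much attention the class token pays to them, and only the top-scoring tokens are kept. The abstract does not give concrete details, so the keep ratio, tensor shapes, and averaging over heads are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of class-token-attention-based token pruning (assumed details).
import torch


def prune_tokens(tokens: torch.Tensor,
                 cls_attn: torch.Tensor,
                 keep_ratio: float = 0.7):
    """Drop the least-attended image tokens.

    tokens:   (B, N, D) patch embeddings (class token excluded)
    cls_attn: (B, N) attention weights from the class token to each patch,
              e.g. averaged over heads of the last transformer layer (assumption)
    Returns the kept tokens and their original indices.
    """
    num_keep = max(1, int(tokens.shape[1] * keep_ratio))
    # Indices of the patches the class token attends to most strongly.
    keep_idx = cls_attn.topk(num_keep, dim=1).indices          # (B, num_keep)
    keep_idx, _ = keep_idx.sort(dim=1)                         # preserve spatial order
    batch_idx = torch.arange(tokens.shape[0]).unsqueeze(1)
    return tokens[batch_idx, keep_idx], keep_idx


# Toy usage with random data.
if __name__ == "__main__":
    feats = torch.randn(2, 196, 768)          # 14x14 patches, ViT-Base width
    attn = torch.rand(2, 196).softmax(dim=1)  # stand-in for CLS attention
    kept, idx = prune_tokens(feats, attn, keep_ratio=0.7)
    print(kept.shape)                          # torch.Size([2, 137, 768])
```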
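
In the same spirit, the matching stage can be sketched as scoring a query against the gallery with an image- and patch-level combined similarity and then taking the k nearest neighbors. The equal weighting, cosine similarities, and max-over-patches pooling below are assumptions for illustration; the paper's exact formulation may differ.

```python
# Minimal sketch of k-NN gallery retrieval with a combined similarity (assumed details).
import torch
import torch.nn.functional as F


def combined_similarity(q_img, q_patches, g_img, g_patches, alpha=0.5):
    """Score one query against all gallery entries.

    q_img:     (D,)       query image-level feature (e.g. class token)
    q_patches: (Nq, D)    query patch features kept after pruning
    g_img:     (G, D)     gallery image-level features
    g_patches: (G, Ng, D) gallery patch features
    """
    # Image-level cosine similarity against every gallery image.
    img_sim = F.cosine_similarity(q_img.unsqueeze(0), g_img, dim=1)      # (G,)

    # Patch-level similarity: best-matching gallery patch per query patch,
    # averaged over query patches.
    qp = F.normalize(q_patches, dim=1)                                   # (Nq, D)
    gp = F.normalize(g_patches, dim=2)                                   # (G, Ng, D)
    patch_sim = torch.einsum("qd,gnd->gqn", qp, gp).max(dim=2).values    # (G, Nq)
    patch_sim = patch_sim.mean(dim=1)                                    # (G,)

    return alpha * img_sim + (1 - alpha) * patch_sim


def knn_gallery(scores: torch.Tensor, k: int = 10):
    """Indices of the k most similar gallery images."""
    return scores.topk(k).indices


# Toy usage with random data.
if __name__ == "__main__":
    q_img, q_patches = torch.randn(768), torch.randn(137, 768)
    g_img, g_patches = torch.randn(100, 768), torch.randn(100, 196, 768)
    scores = combined_similarity(q_img, q_patches, g_img, g_patches)
    print(knn_gallery(scores, k=10))   # indices of the 10 closest gallery images
```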
DOI: 10.48550/arxiv.2211.14742