A bi-network architecture for occlusion handling in person re-identification
Published in: Signal, Image and Video Processing, 2022-06, Vol. 16 (4), pp. 1071-1079
Format: Article
Language: English
Online Access: Full text
Abstract: Person re-identification in a multi-camera setup is very important for tracking and monitoring the movement of individuals in public places. It is not always possible to capture human shape accurately using surveillance cameras due to occlusion caused by other individuals and/or objects. Only a few existing approaches consider the challenging problem of occlusion handling in person re-identification. We propose an effective bi-network architecture to carry out re-identification after occlusion reconstruction. Our architecture, termed Occlusion Handling GAN (OHGAN), is based on the popular U-Net architecture and is trained using L2 loss and binary cross-entropy loss. Due to the unavailability of re-identification datasets with occlusion, the gallery set to train the network has been generated by synthetically adding occlusion of varying degrees to existing non-occluded datasets. Qualitative results show that our OHGAN reconstructs occluded frames quite satisfactorily. Next, re-identification on the reconstructed frames has been performed using the Part-based Convolutional Baseline (PCB). We carry out extensive experiments and compare the results of our proposed method with 11 state-of-the-art approaches on four public datasets. Results show that our method outperforms all other existing techniques.
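The record names two training objectives for OHGAN, an L2 reconstruction loss and a binary cross-entropy loss, but not how they are combined. Below is a minimal PyTorch sketch of one common way to pair the two for a U-Net-style de-occlusion generator; `generator`, `discriminator`, and the weighting factor `lam` are assumptions, not the paper's published formulation.

```python
import torch
import torch.nn as nn

l2_loss = nn.MSELoss()             # the L2 reconstruction term
adv_loss = nn.BCEWithLogitsLoss()  # the binary cross-entropy adversarial term

def generator_loss(generator, discriminator, occluded, original, lam=10.0):
    """Hypothetical OHGAN-style generator objective: reconstruct the
    non-occluded frame while fooling the discriminator."""
    reconstructed = generator(occluded)
    # Reconstruction: pull the output towards the non-occluded ground truth.
    rec = l2_loss(reconstructed, original)
    # Adversarial: the generator wants its output labelled real (target = 1).
    logits = discriminator(reconstructed)
    adv = adv_loss(logits, torch.ones_like(logits))
    return lam * rec + adv
```

Weighting the reconstruction term more heavily than the adversarial term (here via `lam`) is a common choice in image-to-image GANs such as pix2pix; the paper may use a different balance.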
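The synthetically occluded gallery described above could be generated along the following lines: paste an occluder patch of a chosen relative size onto each non-occluded frame. The rectangular grey occluder and its random placement are illustrative assumptions, since the record does not specify how the occlusions were synthesized.

```python
import random
from PIL import Image

def add_synthetic_occlusion(image, degree=0.3):
    """Paste a grey rectangle covering roughly `degree` of the frame
    area at a random position (hypothetical helper)."""
    w, h = image.size
    occ_w = int(w * degree ** 0.5)  # side lengths chosen so that
    occ_h = int(h * degree ** 0.5)  # occ_w * occ_h ~ degree * w * h
    x = random.randint(0, w - occ_w)
    y = random.randint(0, h - occ_h)
    occluded = image.copy()
    occluded.paste(Image.new("RGB", (occ_w, occ_h), (127, 127, 127)), (x, y))
    return occluded

# Occlusion of varying degrees, as the abstract describes:
# frame = Image.open("gallery_0001.jpg")
# samples = [add_synthetic_occlusion(frame, d) for d in (0.1, 0.2, 0.3)]
```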
ISSN: 1863-1703, 1863-1711
DOI: 10.1007/s11760-021-02056-4