Bike-Person Re-Identification: A Benchmark and a Comprehensive Evaluation

Bibliographic Details
Published in: IEEE Access, 2018, Vol. 6, pp. 56059-56068
Main Authors: Yuan, Yuan; Zhang, Jian'an; Wang, Qi
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Existing person re-identification (re-id) datasets consist only of pedestrian images, which falls far short of what real surveillance systems capture. By investigating a real camera over a full day, we find that more than 40% of the people observed are riding bikes rather than walking. However, this kind of person re-id (which we name bike-person re-id) has not been studied yet. In this paper, we address bike-person re-id for the first time and propose a large new bike-person re-id dataset, named BPReid, for this novel and practical problem. BPReid differs from existing person re-id datasets in three aspects. First, it is the first bike-person re-id dataset and has the largest number of identities by far. Second, it is sampled from a subset of a real surveillance system, which makes it a realistic benchmark. Third, there is a long distance between two cameras, which makes it a wide-area benchmark. In addition, we propose a new pipeline designed for bike-person re-id that automatically partitions a bike-person image into two parts (bike and person) for feature extraction. Experiments on the proposed BPReid dataset show the effectiveness of the proposed pipeline. Finally, we also provide a comprehensive evaluation of existing re-id algorithms on this dataset, including feature representation methods as well as metric learning methods.
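
The summary above only names the partition-based pipeline; the short Python sketch below illustrates the general idea under stated assumptions. It is not the authors' implementation: their partitioning is automatic, whereas here a hypothetical fixed horizontal split (person region above, bike region below) stands in for it, and a plain per-channel color histogram stands in for the feature extraction. All function names (partition_bike_person, color_histogram, bike_person_descriptor) are illustrative, not from the paper.

import numpy as np

def partition_bike_person(image, split_ratio=0.5):
    # Placeholder for the paper's automatic partitioning: assume the rider
    # occupies the upper part of the detection and the bike the lower part.
    cut = int(image.shape[0] * split_ratio)
    return image[:cut], image[cut:]

def color_histogram(region, bins=16):
    # Simple per-channel color histogram, L1-normalized, as a stand-in feature.
    feats = []
    for c in range(region.shape[2]):
        hist, _ = np.histogram(region[..., c], bins=bins, range=(0, 255))
        feats.append(hist.astype(np.float32))
    feat = np.concatenate(feats)
    return feat / (feat.sum() + 1e-8)

def bike_person_descriptor(image):
    # Extract features from each part separately and concatenate them,
    # mirroring the two-part (bike + person) representation described above.
    person_part, bike_part = partition_bike_person(image)
    return np.concatenate([color_histogram(person_part), color_histogram(bike_part)])

# Usage: compare two (synthetic) detections by the Euclidean distance between
# their descriptors; a real re-id system would rank gallery images by such a
# distance or by a learned metric.
img_a = np.random.randint(0, 256, (256, 128, 3), dtype=np.uint8)
img_b = np.random.randint(0, 256, (256, 128, 3), dtype=np.uint8)
print(np.linalg.norm(bike_person_descriptor(img_a) - bike_person_descriptor(img_b)))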
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2018.2872804