Dual Domain Multi-Task Model for Vehicle Re-Identification

Bibliographic details
Published in: IEEE Transactions on Intelligent Transportation Systems, 2022-04, Vol. 23 (4), p. 2991-2999
Authors: Huang, Yue; Liang, Borong; Xie, Weiping; Liao, Yinghao; Kuang, Zhenyu; Zhuang, Yihong; Ding, Xinghao
Format: Article
Language: English
Description
Abstract: Vehicle re-identification (re-id) is an essential task in intelligent transportation systems (ITS). The main goal of re-id is to find the same vehicle across different scenarios, which remains a challenging task in both ITS and computer vision (CV). Existing vehicle re-identification methods simply combine coarse-grained and fine-grained attributes through multi-task training. However, such a combination may still yield limited performance on vehicles with trivial appearance differences, or with rare models and colors. To solve this problem, we propose a simple yet effective framework, called the dual domain multi-task model (DDM), which divides vehicle images into two domains based on frequency; two parallel branches are then proposed to recover the two domains. Furthermore, a multi-task method is proposed that combines classification losses on color and model with a triplet loss for fine-grained distance measurement. In addition, a progressive strategy is used in the training process. Two public datasets, PKU VehicleID and VeRi, are used to validate the proposed DDM. The experimental results demonstrate that the proposed approach outperforms existing methods on both datasets.
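The frequency-based division mentioned in the abstract can be sketched as follows. The abstract does not specify how DDM performs the split, so the FFT-based circular mask and the `radius` cutoff below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def split_frequency_domains(image, radius=8):
    """Split a grayscale image into low- and high-frequency components.

    A circular mask of the given radius around the FFT center keeps the
    low frequencies; everything outside it is treated as high frequency.
    The two components sum back to the original image.
    """
    f = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    yy, xx = np.ogrid[:h, :w]
    dist = np.sqrt((yy - h / 2) ** 2 + (xx - w / 2) ** 2)
    low_mask = dist <= radius

    low = np.fft.ifft2(np.fft.ifftshift(f * low_mask)).real
    high = np.fft.ifft2(np.fft.ifftshift(f * ~low_mask)).real
    return low, high
```

Because the two masks partition the spectrum, the low- and high-frequency images reconstruct the input exactly, so each parallel branch can be trained to recover its own domain without information loss between them.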
ISSN:1524-9050
1558-0016
DOI:10.1109/TITS.2020.3027578