Progressive Similarity Preservation Learning for Deep Scalable Product Quantization

Bibliographic Details
Published in: IEEE Transactions on Multimedia, 2024, Vol. 26, pp. 3034-3045
Main Authors: Du, Yongchao; Wang, Min; Zhou, Wengang; Li, Houqiang
Format: Article
Language: English
Online Access: Order full text
Description
Summary: Product quantization is an effective strategy for compact feature learning in image retrieval, generating compact quantization codes of different lengths for varying scenarios. However, existing deep quantization methods obtain codes of different lengths by training a separate model for each code length, which incurs a large training-time cost and degrades deployment flexibility. To this end, we propose a new deep scalable Progressive Similarity Preservation Product Quantization (PSPPQ) framework, which trains quantized features of different code lengths simultaneously and imposes no additional cost during inference. By progressively approximating the ground-truth similarity of image pairs, we directly optimize the similarity ranking, which improves retrieval accuracy and generates sequential quantization codes more efficiently. In addition, by combining the advantages of classification loss and hinge loss, we design a semantic ArcFace loss to optimize our network. Experiments on three datasets demonstrate the effectiveness of the proposed method with variable code lengths for scalable image retrieval.
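
As background for the abstract above, the sketch below illustrates plain product quantization: a feature vector is split into M subvectors, each is quantized against its own codebook, and the M codeword indices form the compact code. This is a minimal illustration of standard PQ under toy sizes, not the paper's PSPPQ method; the names pq_encode and pq_decode are illustrative. In a scalable scheme such as the one the abstract describes, a prefix of the code sequence can presumably serve as a shorter code.

import numpy as np

def pq_encode(x, codebooks):
    # x: (D,) feature vector; codebooks: list of M arrays, each (K, D/M),
    # one codebook per subspace. Returns the M codeword indices.
    M = len(codebooks)
    sub_vectors = np.split(x, M)
    codes = []
    for sub, cb in zip(sub_vectors, codebooks):
        # Nearest codeword in this subspace under Euclidean distance.
        codes.append(int(np.argmin(np.linalg.norm(cb - sub, axis=1))))
    return codes

def pq_decode(codes, codebooks):
    # Reconstruct an approximate vector by concatenating the chosen codewords.
    return np.concatenate([cb[c] for c, cb in zip(codes, codebooks)])

# Toy usage: D=8 split into M=4 subspaces with K=256 codewords each,
# so the full code is 4 indices of 8 bits = 32 bits.
rng = np.random.default_rng(0)
codebooks = [rng.normal(size=(256, 2)) for _ in range(4)]
x = rng.normal(size=8)
codes = pq_encode(x, codebooks)
x_hat = pq_decode(codes, codebooks)  # approximate reconstruction of x
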
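The abstract also mentions a semantic ArcFace loss that combines the advantages of classification loss and hinge loss; the paper's exact formulation is not reproduced in this record. For reference, here is a minimal PyTorch sketch of the standard ArcFace component only (an additive angular margin m applied to the target-class cosine logit before scaling by s); the function name and the hyperparameter values are assumptions.

import torch
import torch.nn.functional as F

def arcface_logits(features, weight, labels, s=64.0, m=0.5):
    # Standard ArcFace: features (B, D) and class weights (C, D) are
    # L2-normalized, so their inner products are cosine similarities.
    cos = F.linear(F.normalize(features), F.normalize(weight))  # (B, C)
    theta = torch.acos(cos.clamp(-1 + 1e-7, 1 - 1e-7))
    target = F.one_hot(labels, num_classes=weight.shape[0]).bool()
    # Add the angular margin m only to the target-class logit.
    logits = torch.where(target, torch.cos(theta + m), cos)
    return s * logits  # pass to F.cross_entropy(logits, labels)
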
ISSN: 1520-9210, 1941-0077
DOI: 10.1109/TMM.2023.3306556