HHF: Hashing-guided Hinge Function for Deep Hashing Retrieval
Saved in:

Published in: | IEEE Transactions on Multimedia, 2023-01, Vol. 25, pp. 1-12 |
---|---|
Main authors: | , , , , , , |
Format: | Article |
Language: | English |
Subjects: | |
Online access: | Order full text |
Abstract: | Deep hashing has shown promising performance in large-scale image retrieval. The hashing process uses Deep Neural Networks (DNNs) to embed images into compact continuous latent codes, which are then mapped to binary codes by a hashing function for efficient retrieval. Recent approaches apply a metric loss and a quantization loss to supervise these two procedures, clustering samples of the same category and alleviating the loss of semantic information after binarization in an end-to-end training framework. However, we observe an incompatibility: because the two loss terms pursue different objectives, the optimal cluster positions are not identical to the ideal hash positions, which leads to severe ambiguity and erroneous hashing after binarization. To address this problem, we borrow the theory of minimum-distance bounds for binary linear codes to design an inflection point that depends on the hash bit length and the number of categories, and thereby propose the Hashing-guided Hinge Function (HHF), which explicitly terminates the metric loss to prevent negative pairs from being pushed apart without limit. This modification is proven effective and essential for training, simultaneously yielding proper intra- and inter-cluster distances and better hash positions for accurate image retrieval. Extensive experiments on CIFAR-10, CIFAR-100, ImageNet, and MS-COCO show that HHF consistently outperforms existing techniques and can be flexibly transplanted into other methods. Code is available at https://github.com/JerryXu0129/HHF. |
ISSN: | 1520-9210, 1941-0077 |
DOI: | 10.1109/TMM.2022.3222598 |
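
The abstract's core idea (a hinge that terminates the metric loss for negative pairs once their distance passes an inflection point set by the hash bit length and number of categories) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names are hypothetical, and the inflection point here uses a Plotkin-style minimum-distance bound for binary codes as a stand-in; the paper's exact formula may differ.

```python
def inflection_point(n_bits: int, n_classes: int) -> float:
    """Distance threshold derived from a minimum-distance bound for binary codes.

    Assumption: a Plotkin-style bound d* = n * K / (2 * (K - 1)) for K codewords
    of length n; the paper derives its inflection point from related bounds.
    """
    return n_bits * n_classes / (2.0 * (n_classes - 1))


def hinge_metric_loss(dist: float, is_negative_pair: bool,
                      n_bits: int, n_classes: int) -> float:
    """Pairwise metric loss on a Hamming-like distance in [0, n_bits].

    Positive pairs are pulled together as usual; negative pairs stop
    contributing once their distance exceeds the inflection point, so they
    are not pushed apart without limit.
    """
    if is_negative_pair:
        d_star = inflection_point(n_bits, n_classes)
        return max(0.0, d_star - dist)  # hinge: zero loss beyond d*
    return dist  # positive pair: smaller distance, smaller loss
```

For example, with 64-bit codes and 10 classes the inflection point is 640/18 ≈ 35.6, so a negative pair already 40 bits apart contributes zero loss, while one only 30 bits apart is still pushed toward the threshold.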