Deep Covariance Estimation Hashing


Detailed Description

Saved in:
Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, p. 113223-113234
Main Authors: Wu, Yue; Sun, Qiule; Hou, Yaqing; Zhang, Jianxin; Zhang, Qiang; Wei, Xiaopeng
Format: Article
Language: English
Subjects:
Online Access: Full Text
Description
Summary: Deep hashing, the combination of advanced convolutional neural networks and efficient hashing, has recently achieved impressive performance for image retrieval. However, state-of-the-art deep hashing methods mainly focus on constructing hash functions, loss functions, and training strategies to preserve semantic similarity. For the fundamental image characteristics, they depend heavily on first-order convolutional feature statistics, failing to take their global structure into consideration. To address this problem, we present a deep covariance estimation hashing (DCEH) method with a robust covariance form to improve hash code quality. The core of DCEH involves covariance pooling as the deep hashing representation, performing global pairwise feature interactions. Covariance pooling can capture richer statistical information from deep convolutional features and produce more informative global representations. Because convolutional features are usually high-dimensional with a small sample size, we estimate a robust covariance by shrinking its eigenvalues using power normalization, forming an independent structural layer. This structural layer is then embedded into the deep hashing paradigm in an end-to-end learning manner. Extensive experiments on three benchmarks show that the proposed DCEH outperforms its counterparts and achieves superior performance.
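The covariance pooling with eigenvalue power normalization described in the summary can be sketched as follows. This is a minimal illustrative NumPy implementation of the general technique, not the authors' exact DCEH layer: the feature layout (channels by spatial positions), the power `alpha = 0.5`, and the `eps` floor are all assumptions for the sketch.

```python
import numpy as np

def covariance_pooling(features, alpha=0.5, eps=1e-6):
    """Global covariance pooling with eigenvalue power normalization.

    features : (C, N) array of C-dimensional convolutional features
               at N spatial positions (layout assumed for this sketch).
    alpha    : power applied to the eigenvalues; values < 1 shrink the
               large eigenvalues and lift the small ones, robustifying
               the estimate when N is small relative to C.
    """
    C, N = features.shape
    # Sample covariance of the spatial feature vectors.
    centered = features - features.mean(axis=1, keepdims=True)
    cov = centered @ centered.T / max(N - 1, 1)
    # Eigen-decompose the symmetric covariance and shrink its spectrum
    # via power normalization: Sigma^alpha = U diag(lambda^alpha) U^T.
    eigvals, eigvecs = np.linalg.eigh(cov)
    eigvals = np.maximum(eigvals, 0.0) + eps
    cov_pn = eigvecs @ np.diag(eigvals ** alpha) @ eigvecs.T
    # The normalized covariance is symmetric, so its upper triangle
    # suffices as a global image representation fed to the hash layer.
    iu = np.triu_indices(C)
    return cov_pn[iu]
```

In the paper this computation forms an independent structural layer inside the network, so in practice it would be implemented with differentiable matrix operations and trained end-to-end rather than applied post hoc as here.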
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2934321