Learning Adaptive Hypersphere: Boosting Efficiency on Approximate Nearest Neighbor Search
| Published in: | IEEE signal processing letters, 2024, Vol. 31, pp. 2190-2194 |
|---|---|
| Main authors: | , |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online access: | Order full text |
| Abstract: | Existing approximate nearest neighbor (ANN) search methods, whether exhaustive or non-exhaustive, must rerank candidate vectors after retrieving them in order to return the final search results. The number of candidate vectors therefore has a significant impact on search time. In this letter, a hypersphere-based filtration mechanism is proposed to accelerate ANN search without harming search accuracy. To learn the hypersphere model, a fully connected neural network is employed, and its training data is constructed to be independent of vector dimension, so that the model generalizes to datasets of different vector dimensions without any architectural adjustments. An adaptive hypersphere size is generated for each query vector; only the candidate vectors located inside the hypersphere are kept for reranking, greatly reducing the number of reranked vectors. The proposed hypersphere-based filtration can be conveniently integrated with various quantization-based ANN search methods. Experimental results on two public datasets demonstrate that it boosts search efficiency effectively with no negative impact on search accuracy. |
| ISSN: | 1070-9908, 1558-2361 |
| DOI: | 10.1109/LSP.2024.3427696 |
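The filtration step described in the abstract, keeping only candidate vectors inside a query-centred hypersphere before reranking, can be sketched as follows. This is a minimal illustration under simplifying assumptions, not the paper's implementation: the letter generates an adaptive radius per query from a learned fully connected network, whereas here the radius is simply passed in as a parameter, and candidates are held as a plain NumPy array rather than produced by a quantization-based index.

```python
import numpy as np

def filter_candidates(query, candidates, radius):
    """Keep only the candidate vectors lying inside the hypersphere
    of the given radius centred on the query vector; only these
    survivors would then be passed on to the (exact) reranking stage."""
    dists = np.linalg.norm(candidates - query, axis=1)  # Euclidean distance per candidate
    return candidates[dists <= radius]

# Toy example: five 2-D candidates, hypothetical fixed radius of 1.0
query = np.array([0.0, 0.0])
cands = np.array([[0.1, 0.2], [2.0, 2.0], [0.5, -0.5], [3.0, 0.0], [-0.3, 0.1]])
kept = filter_candidates(query, cands, radius=1.0)
print(len(kept))  # 3 of the 5 candidates fall inside the sphere
```

In the method of the letter, the radius would vary per query (predicted by the network), so the filter adapts to how concentrated each query's true neighbors are; the sketch above only shows the geometric test itself.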