
Accumulative Quantization for Approximate Nearest Neighbor Search
Author(s) - Liefu Ai, Yong Tao, Hongjun Cheng, Yuanzhi Wang, Shaoguo Xie, Deyang Liu, Xin Zheng
Publication year - 2022
Publication title - Computational Intelligence and Neuroscience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.605
H-Index - 52
eISSN - 1687-5273
pISSN - 1687-5265
DOI - 10.1155/2022/4364252
Subject(s) - hypersphere, codebook, vector quantization, quantization (signal processing), computer science, centroid, Linde–Buzo–Gray algorithm, pattern recognition (psychology), learning vector quantization, artificial intelligence, mathematics, algorithm, mathematical optimization
To further improve approximate nearest neighbor (ANN) search performance, an accumulative quantization (AQ) scheme is proposed and applied to efficient ANN search. AQ approximates a vector by the accumulation of several centroids, each selected from a different codebook. To approximate input vectors accurately, an iterative optimization is designed for codebook training to improve the codebooks' approximation power. In addition, another optimization is introduced into the offline vector quantization procedure to minimize the overall quantization error. A hypersphere-based filtration mechanism is designed for AQ-based exhaustive ANN search to reduce the number of candidates entering the sorting stage, thus improving search time efficiency: for each query, a hypersphere centered at the query is constructed, and database vectors lying outside it are filtered out. Experimental results on public datasets demonstrate that the hypersphere-based filtration improves ANN search time efficiency without weakening search accuracy, and that the proposed AQ outperforms the state of the art in ANN search accuracy.
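The two core ideas in the abstract — approximating a vector as a sum of centroids drawn from several codebooks, and filtering candidates with a query-centered hypersphere before sorting — can be sketched as follows. This is an illustrative sketch, not the paper's exact algorithm: the codebooks here are random stand-ins (the paper trains them with an iterative optimization), the encoding is a simple greedy residual pass, and the radius `r` is an assumed parameter.

```python
import numpy as np

rng = np.random.default_rng(0)

D, M, K = 8, 4, 16  # vector dimension, number of codebooks, centroids per codebook
# Random placeholder codebooks; the paper learns these via iterative optimization.
codebooks = rng.normal(size=(M, K, D))

def aq_encode(x, codebooks):
    """Greedy residual encoding: from each codebook in turn, pick the
    centroid closest to the current residual, then subtract it."""
    residual = x.copy()
    codes = []
    for C in codebooks:
        idx = int(np.argmin(np.linalg.norm(C - residual, axis=1)))
        codes.append(idx)
        residual = residual - C[idx]
    return codes

def aq_decode(codes, codebooks):
    """Reconstruct a vector as the accumulation (sum) of the selected centroids."""
    return sum(C[i] for C, i in zip(codebooks, codes))

x = rng.normal(size=D)
codes = aq_encode(x, codebooks)
x_hat = aq_decode(codes, codebooks)

# Hypersphere-based filtration (sketch): construct a hypersphere of radius r
# centered at the query and discard vectors lying outside it, so only the
# surviving candidates are put into the sorting stage.
db = rng.normal(size=(1000, D))
query = rng.normal(size=D)
r = 4.0  # assumed radius; the paper derives its filtration bound differently
dists = np.linalg.norm(db - query, axis=1)
candidates = db[dists <= r]
ranked = candidates[np.argsort(np.linalg.norm(candidates - query, axis=1))]
```

With one code per codebook, a vector is stored as `M` small integers instead of `D` floats, and the filtration step shrinks the candidate set before the (comparatively expensive) distance sort.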