Open Access
Fast Retrieval Method of Image Data Based on Learning to Hash
Author(s) - Shiyuan Fang, Jianfeng Wang, Cheng Yang, Pengpeng Tong
Publication year - 2020
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1631/1/012029
Subject(s) - hash function , computer science , feature hashing , double hashing , pattern recognition (psychology) , image retrieval , hash table , rolling hash , artificial intelligence , code (set theory) , feature (linguistics) , dynamic perfect hashing , image (mathematics) , set (abstract data type) , linguistics , philosophy , computer security , programming language
Hashing is widely used in nearest neighbor retrieval; its goal is to convert high-dimensional image data into low-dimensional representations or a set of ordered binary codes. As one of the more efficient approaches to data storage and retrieval, hashing is commonly applied to nearest neighbor retrieval over large-scale image data. Traditional hashing methods generate hash codes from manually extracted features, so the features and the hash codes are not jointly optimized and the resulting codes are suboptimal. The rapid development of deep learning has made computers highly effective at recognizing visual features in images, and combining deep feature learning with hash function learning yields better performance than traditional hashing. In this paper, a deep hashing method based on a triplet constraint is proposed to extract similar features within the same category and discriminative features between different categories; the hash codes learned on top of these features preserve image similarity. Experiments show that this learning-to-hash method outperforms other methods on CIFAR-10 and NUS-WIDE.
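To make the general idea concrete, the sketch below (not the authors' exact architecture) shows a triplet-constrained deep hashing setup in PyTorch: a small network maps image features to continuous codes in (-1, 1) via tanh, a triplet margin loss pulls same-class codes together and pushes different-class codes apart, and sign() binarizes the codes for Hamming-distance retrieval. The feature dimension, code length, margin, and two-layer head are illustrative assumptions standing in for a full CNN backbone.

```python
import torch
import torch.nn as nn

class DeepHashNet(nn.Module):
    def __init__(self, feat_dim=512, code_bits=48):
        super().__init__()
        # Small head standing in for a full image backbone (assumption).
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 256),
            nn.ReLU(),
            nn.Linear(256, code_bits),
        )

    def forward(self, x):
        # tanh keeps outputs in (-1, 1) so sign() later yields +/-1 bits.
        return torch.tanh(self.net(x))

    def binarize(self, x):
        # Final hash code: one bit per output dimension.
        return torch.sign(self.forward(x))

model = DeepHashNet()
triplet_loss = nn.TripletMarginLoss(margin=1.0)

# Dummy batch: anchor and positive share a class, negative does not.
anchor, positive, negative = (torch.randn(8, 512) for _ in range(3))
loss = triplet_loss(model(anchor), model(positive), model(negative))
loss.backward()
print(loss.item(), model.binarize(anchor).shape)  # scalar loss, codes of shape (8, 48)
```

At retrieval time, database images are stored as binary codes and a query's code is compared against them by Hamming distance, which is what makes the approach efficient at scale.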
