Open Access
A Review on Content Based Image Retrieval System Features derived by Deep Learning Models
Author(s) -
M. Naveen
Publication year - 2021
Publication title -
International Journal for Research in Applied Science and Engineering Technology
Language(s) - English
Resource type - Journals
ISSN - 2321-9653
DOI - 10.22214/ijraset.2021.39172
Subject(s) - computer science , image retrieval , cluster analysis , content based image retrieval , similarity (geometry) , artificial intelligence , pattern recognition (psychology) , deep learning , image (mathematics) , feature (linguistics) , set (abstract data type) , task (project management) , data mining , linguistics , philosophy , management , economics , programming language
In a Content Based Image Retrieval (CBIR) system, the task is to retrieve images similar to a given query image from a large database. The usual procedure is to extract useful features from the query image and retrieve images that have a similar set of features. For this purpose, a suitable similarity measure is chosen, and images with high similarity scores are retrieved. Naturally, the choice of these features plays a very important role in the success of the system, and high-level features are required to reduce the "semantic gap". In this paper, we propose to use features derived from pre-trained deep convolutional network models trained on a large image classification problem. This approach appears to produce vastly superior results for a variety of databases, and it outperforms many contemporary CBIR systems. We analyse the retrieval time of the method, and also propose a pre-clustering of the database based on the above-mentioned features, which yields comparable results in a much shorter time in most cases.

Keywords - Content Based Image Retrieval, Feature Selection, Deep Learning, Pre-trained Network Models, Pre-clustering
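The pipeline the abstract describes — represent each image as a feature vector, rank the database by a similarity measure, and optionally restrict the search to a pre-computed cluster — can be sketched in NumPy. This is an illustrative assumption, not the paper's exact method: cosine similarity and a plain k-means pre-clustering are stand-ins for whatever measure and clustering the authors use, and in practice the feature vectors would come from a pre-trained CNN (e.g. the penultimate layer of an ImageNet classifier) rather than being given directly.

```python
import numpy as np

def l2_normalize(x, axis=-1, eps=1e-12):
    """Scale vectors to unit length so a dot product equals cosine similarity."""
    return x / (np.linalg.norm(x, axis=axis, keepdims=True) + eps)

def retrieve(query_feat, db_feats, k=5):
    """Return indices of the k database images most similar to the query
    under cosine similarity (full linear scan of the database)."""
    sims = l2_normalize(db_feats) @ l2_normalize(query_feat)
    return np.argsort(-sims)[:k]

def kmeans(feats, n_clusters=4, n_iter=20, seed=0):
    """Minimal k-means for pre-clustering the database features."""
    rng = np.random.default_rng(seed)
    centroids = feats[rng.choice(len(feats), n_clusters, replace=False)]
    for _ in range(n_iter):
        dists = np.linalg.norm(feats[:, None] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for c in range(n_clusters):
            if (labels == c).any():
                centroids[c] = feats[labels == c].mean(axis=0)
    # Final assignment so labels are consistent with the returned centroids.
    labels = np.linalg.norm(feats[:, None] - centroids[None], axis=2).argmin(axis=1)
    return centroids, labels

def retrieve_clustered(query_feat, db_feats, centroids, labels, k=5):
    """Search only the cluster whose centroid is nearest the query,
    trading a little accuracy for a much smaller scan."""
    c = np.linalg.norm(centroids - query_feat, axis=1).argmin()
    member_idx = np.flatnonzero(labels == c)
    top = retrieve(query_feat, db_feats[member_idx], k=min(k, len(member_idx)))
    return member_idx[top]
```

With pre-clustering, each query is compared against roughly `N / n_clusters` vectors instead of all `N`, which is the source of the speed-up the abstract mentions; the cost is that relevant images assigned to other clusters can no longer be retrieved.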
