Open Access
New non‐negative sparse feature learning approach for content‐based image retrieval
Author(s) -
Xu Wangming,
Wu Shiqian,
Er Meng Joo,
Zheng Chaobing,
Qiu Yimin
Publication year - 2017
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2016.0726
Subject(s) - pooling, neural coding, pattern recognition (psychology), computer science, artificial intelligence, locality, feature (linguistics), sparse approximation, cluster analysis, coding (social sciences), content-based image retrieval, image retrieval, feature learning, representation (politics), image (mathematics), mathematics, philosophy, statistics, linguistics, politics, law, political science
One key issue in content‐based image retrieval is extracting effective features to represent the visual content of an image. In this study, a new non‐negative sparse feature learning approach is presented that produces a holistic image representation from low‐level local features. Specifically, a modified spectral clustering method is introduced to learn a non‐negative visual dictionary from the local features of training images. A non‐negative sparse feature encoding method termed non‐negative locality‐constrained linear coding (NNLLC) is proposed, improving on the popular locality‐constrained linear coding method to obtain more meaningful and interpretable sparse codes for feature representation. Moreover, a new feature pooling strategy named kMaxSum pooling is proposed to alleviate the information loss of the sum pooling and max pooling strategies; it produces a more effective holistic image representation and can be viewed as a generalisation of both, since summing the single largest response recovers max pooling and summing all responses recovers sum pooling. Retrieval experiments carried out on two public image databases demonstrate the effectiveness of the proposed approach.
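The abstract does not give implementation details, but the kMaxSum idea it describes can be sketched as follows: for each dictionary atom, sum the k largest coefficients that atom receives across all local descriptors of an image. This is a minimal illustration, not the authors' code; the function name and the row-per-descriptor layout of the codes matrix are assumptions.

```python
def kmaxsum_pool(codes, k):
    """Sketch of kMaxSum pooling over non-negative sparse codes.

    codes: list of rows, one per local descriptor, each row holding
           the coefficients for every dictionary atom.
    k:     number of largest responses per atom to sum.
           k = 1 reduces to max pooling; k = len(codes) to sum pooling.
    Returns one pooled value per dictionary atom (the holistic feature).
    """
    if not codes:
        return []
    n_atoms = len(codes[0])
    pooled = []
    for j in range(n_atoms):
        # Collect atom j's responses over all descriptors, largest first.
        column = sorted((row[j] for row in codes), reverse=True)
        pooled.append(sum(column[:k]))
    return pooled
```

For example, with three descriptors and two atoms, `kmaxsum_pool([[5, 0], [3, 2], [1, 7]], 1)` gives the max-pooled vector `[5, 7]`, while `k = 3` gives the sum-pooled vector `[9, 9]`; intermediate k interpolates between the two behaviours.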
