Open Access
Fine-grained classification based on multi-scale pyramid convolution networks
Author(s) -
Gaihua Wang,
Lei Cheng,
Jinheng Lin,
Yingying Dai,
Tianlun Zhang
Publication year - 2021
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0254054
Subject(s) - computer science , convolution (computer science) , pyramid (geometry) , artificial intelligence , pattern recognition , variance , feature , scale , residual network , data mining , machine learning , artificial neural network , algorithm
Large intra-class variance and small inter-class variance are the key factors affecting fine-grained image classification. Recently, some algorithms have become more accurate and efficient. However, these methods ignore multi-scale information in the network, resulting in an insufficient ability to capture subtle changes. To solve this problem, a weakly supervised fine-grained classification network based on a multi-scale pyramid is proposed in this paper. It replaces the ordinary convolution kernels in a residual network with pyramid convolution kernels, which expands the receptive field and exploits complementary information across different scales. Meanwhile, the weakly supervised data augmentation network (WS-DAN) is used to prevent overfitting and improve the performance of the model. In addition, a new attention module, which includes spatial attention and channel attention, is introduced to focus on the object parts in the image. Comprehensive experiments are carried out on three public benchmarks, showing that the proposed method can extract subtle features and perform classification effectively.
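The core idea of pyramid convolution, as described in the abstract, is to apply several kernel sizes in parallel to the same input and combine the resulting feature maps, rather than using a single fixed-size kernel. The sketch below illustrates only that idea in pure Python on a single-channel input; the function names, box-filter kernels, and kernel sizes are illustrative assumptions, not the paper's actual implementation (which operates on residual-network feature maps with learned multi-channel kernels).

```python
# Illustrative sketch of the multi-scale "pyramid convolution" idea:
# several kernel sizes (here 3x3, 5x5, 7x7) are applied in parallel to
# one input, producing complementary feature maps at different receptive
# fields. All names here are hypothetical, not from the paper's code.

def conv2d_same(image, kernel):
    """2-D correlation with zero 'same' padding (single channel)."""
    h, w = len(image), len(image[0])
    k = len(kernel)
    pad = k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            acc = 0.0
            for di in range(k):
                for dj in range(k):
                    y, x = i + di - pad, j + dj - pad
                    if 0 <= y < h and 0 <= x < w:  # zero padding outside
                        acc += image[y][x] * kernel[di][dj]
            out[i][j] = acc
    return out

def pyramid_conv(image, kernel_sizes=(3, 5, 7)):
    """Apply one kernel per scale; return the list of per-scale maps."""
    maps = []
    for k in kernel_sizes:
        # A k-by-k box (averaging) filter stands in for a learned kernel.
        kernel = [[1.0 / (k * k)] * k for _ in range(k)]
        maps.append(conv2d_same(image, kernel))
    return maps
```

In the paper's setting these per-scale maps replace the output of an ordinary convolution inside a residual block, so each block sees both fine detail (small kernels) and broader context (large kernels).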
