Dense short connection network for efficient image classification
Author(s) - Yang Zhenkun, Ma Xianghua, Wang Kefan, An Jing
Publication year - 2021
Publication title - Concurrency and Computation: Practice and Experience
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.309
H-Index - 67
eISSN - 1532-0634
pISSN - 1532-0626
DOI - 10.1002/cpe.6186
Subject(s) - computer science , convolutional neural network , image classification , pattern recognition , artificial intelligence , multi-scale representation , machine learning , algorithm
With the continuous development of convolutional neural networks (CNNs), image classification technology has entered a new stage in solving visual cognition tasks. Recent advances have shown that convolutional networks containing short connections can be more accurate and more efficient to train. However, these short connections exist almost exclusively between convolutional layers, which may restrict the flow of feature information. In addition, the multi‐scale representation ability of CNNs that use short connections can be explored further. This paper therefore designs the dense short connection network (DSCNet) for efficient image classification. In DSCNet, we propose a simple yet effective architectural unit, the dense short connection (DSC) module, which builds hierarchical dense short connections that carry multi‐scale context information across different feature channels within a single convolutional layer. DSCNets have several noteworthy advantages: they improve multi‐scale representation ability, enhance feature propagation, and reduce the number of parameters. To validate the DSC module, we conduct comprehensive experiments on the CIFAR‐100 and Tiny ImageNet datasets. Experimental results show that DSCNets achieve very competitive results against previous baseline models, reaching higher recognition accuracy with less computation. Further ablation studies show that the proposed method achieves consistent performance gains in image classification tasks.
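The DSC module's key idea, as the abstract describes it, is to split a layer's feature channels into groups and wire hierarchical dense short connections between them, so later groups see multi-scale context from all earlier groups. The following is a minimal, hypothetical NumPy sketch of that wiring only; the group count, the averaging of earlier outputs, and the `transform` stand-in (used here in place of the paper's per-group convolution, whose exact form the abstract does not specify) are all assumptions for illustration.

```python
import numpy as np

def dsc_module(x, groups=4, transform=None):
    """Sketch of a dense-short-connection (DSC) style channel wiring.

    x: feature map of shape (C, H, W), with C divisible by `groups`.
    Channels are split into `groups` subsets; each subset after the
    first is combined with the outputs of all earlier subsets (the
    "dense short connections") before its own transform, and the group
    outputs are concatenated back along the channel axis.
    """
    if transform is None:
        # Stand-in for the per-group convolution in the paper
        # (assumption): a plain ReLU keeps the sketch dependency-free.
        transform = lambda t: np.maximum(t, 0.0)

    splits = np.split(x, groups, axis=0)
    outputs = [splits[0]]  # first group passes through unchanged
    for xi in splits[1:]:
        # Dense short connection: average of every earlier group output
        context = sum(outputs) / len(outputs)
        outputs.append(transform(xi + context))
    return np.concatenate(outputs, axis=0)
```

Because each group reuses all earlier group outputs, a single layer mixes receptive fields of several effective scales, which is the multi-scale behaviour the abstract attributes to the DSC module.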