Open Access
Thinning of convolutional neural network with mixed pruning
Author(s) -
Yang Wenzhu,
Jin Lilei,
Wang Sile,
Cui Zhenchao,
Chen Xiangyang,
Chen Liping
Publication year - 2019
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2018.6191
Subject(s) - pruning , computer science , mnist database , convolutional neural network , filter (signal processing) , redundancy (engineering) , artificial intelligence , artificial neural network , algorithm , pattern recognition (psychology) , computer vision , agronomy , biology , operating system
Deep learning has achieved state‐of‐the‐art accuracy on many computer vision tasks. However, convolutional neural networks are difficult to deploy on resource‐constrained devices due to their limited computation power and memory space. Thus, it is necessary to prune redundant weights and filters rationally and effectively. Considering that redundancy still exists in a model pruned by weight pruning or filter pruning alone, a method combining weight pruning and filter pruning is proposed. First, filter pruning is performed: the least important filters are removed and the model is fine‐tuned to recover its accuracy. Then, all connection weights below a threshold are set to zero. Finally, the model obtained from the first two steps is fine‐tuned to recover its predictive accuracy. Experiments on the MNIST and CIFAR‐10 datasets demonstrate that the proposed approach is effective and feasible. Compared with weight pruning or filter pruning alone, mixed pruning achieves a higher compression ratio of the model parameters. For LeNet‐5, the proposed approach achieves a compression rate of 13.01× with a 1% drop in accuracy; for VGG‐16, it achieves a compression rate of 19.20×, incurring a 1.56% accuracy loss.
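The two pruning stages described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the abstract does not state which importance criterion is used for filter pruning, so the L1 norm of each filter is assumed here as the ranking score, and the fine‐tuning steps are omitted because they require a full training loop. The function names (`prune_filters`, `prune_weights`, `mixed_prune`) are hypothetical.

```python
import numpy as np

def prune_filters(weights, keep_ratio):
    """Filter pruning stage: rank convolutional filters by an assumed
    importance score (L1 norm) and keep only the top fraction.
    weights: array of shape (num_filters, in_channels, kh, kw)."""
    scores = np.abs(weights).reshape(weights.shape[0], -1).sum(axis=1)
    num_keep = max(1, int(round(keep_ratio * weights.shape[0])))
    # indices of the most important filters, in their original order
    keep_idx = np.sort(np.argsort(scores)[-num_keep:])
    return weights[keep_idx], keep_idx

def prune_weights(weights, threshold):
    """Weight pruning stage: set all connection weights whose magnitude
    falls below the threshold to zero (step two of the mixed scheme)."""
    mask = np.abs(weights) >= threshold
    return weights * mask, mask

def mixed_prune(weights, keep_ratio, threshold):
    """Mixed pruning sketch: filter pruning first, then weight pruning.
    In the paper, fine-tuning follows each stage; that is omitted here."""
    filter_pruned, keep_idx = prune_filters(weights, keep_ratio)
    sparse, mask = prune_weights(filter_pruned, threshold)
    return sparse, keep_idx, mask

# Toy example: 4 filters of shape (1, 2, 2); keep half, then threshold.
w = np.arange(16, dtype=float).reshape(4, 1, 2, 2)
sparse, keep_idx, mask = mixed_prune(w, keep_ratio=0.5, threshold=10.0)
```

In this toy run, the two filters with the largest L1 norms survive the first stage, and the second stage zeroes the remaining weights whose magnitude is below 10, leaving a smaller and sparser tensor, which is the source of the compression ratios reported for LeNet‐5 and VGG‐16.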
