Open Access
Acceleration of Infrared Target Detection via Efficient Channel Pruning
Author(s) - Miao Zhang, Yong Zhang, Weihua Li, Ruimin Chen
Publication year - 2022
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2203/1/012025
Subject(s) - batch normalization , computer science , FLOPs , pruning , acceleration , artificial intelligence , channel , overhead (engineering) , algorithm , parallel computing
Modern detection models are difficult to deploy for infrared target detection in robot vision systems because of their heavy computational burden. To alleviate this, a simple but efficient channel pruning method is proposed for model acceleration. Specifically, a soft-gated module combined with batch normalization (SGBN) is designed as a standalone layer that substitutes for the standard batch normalization (BN) layer during training. Conversion between SGBN and BN is straightforward, and the training overhead introduced by the replacement is almost negligible. By controlling the sparsity of the scaling factor in SGBN, unimportant channels with small outputs are blocked automatically and globally while the model trains. Removing these redundant channels no longer requires fine-tuning, which significantly speeds up the pruning process. Experiments pruning different detection models on an infrared dataset show the effectiveness of the method. For example, the parameters and FLOPs of the pruned CenterNet are reduced by 72.70% and 40.20%, respectively, without accuracy loss, and inference on the CPU is 12.01 ms faster. Extended studies on a classification task also demonstrate the method's potential for transfer to other applications.
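The abstract does not give the exact form of the SGBN layer, but the idea it describes (a BN layer whose per-channel scaling factor is passed through a soft gate, so that channels with small scale are driven to zero output and can be removed without fine-tuning) can be sketched as follows. The sigmoid gate, the `threshold` and `steepness` parameters, and the decision to gate the bias as well are all illustrative assumptions, not the paper's actual formulation:

```python
import numpy as np

def sgbn_forward(x, gamma, beta, threshold=0.01, steepness=100.0, eps=1e-5):
    """Soft-gated batch normalization (illustrative sketch, not the paper's exact SGBN).

    Channels whose scaling factor |gamma| falls below `threshold` are smoothly
    driven toward zero output, so they can later be pruned without fine-tuning.
    """
    # Standard BN statistics over batch and spatial dims; x has shape (N, C, H, W).
    mean = x.mean(axis=(0, 2, 3), keepdims=True)
    var = x.var(axis=(0, 2, 3), keepdims=True)
    x_hat = (x - mean) / np.sqrt(var + eps)

    # Soft gate: close to 1 for channels with large |gamma|, close to 0 for small ones.
    gate = 1.0 / (1.0 + np.exp(-steepness * (np.abs(gamma) - threshold)))

    # Gate both scale and shift so a blocked channel outputs (near) zero,
    # making its removal lossless (an assumption of this sketch).
    g = (gate * gamma).reshape(1, -1, 1, 1)
    b = (gate * beta).reshape(1, -1, 1, 1)
    return g * x_hat + b

# Channels whose gate is effectively closed after training would then be
# dropped, and the surviving SGBN layers fold back into plain BN by
# absorbing the gate into gamma and beta.
x = np.random.randn(2, 3, 4, 4)
gamma = np.array([1.0, 1e-3, 0.5])   # channel 1 has a near-zero scaling factor
beta = np.zeros(3)
y = sgbn_forward(x, gamma, beta)     # channel 1's output is suppressed
```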
