Open Access
A Fusion Algorithm of Multi-model Pruning and Collaborative Distillation Learning
Author(s) -
Zihan Liu,
Zhiguo Shi
Publication year - 2020
Publication title -
journal of physics. conference series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1607/1/012096
Subject(s) - pruning , computer science , distillation , machine learning , artificial intelligence , algorithm
Complex deep network models require a large number of parameters to achieve strong prediction performance, at the cost of greatly increased computation time and energy consumption. This paper proposes a fusion algorithm that combines model pruning with joint multi-model knowledge distillation learning. By constructing an adaptive joint learning loss function that includes a distillation term, multiple models are trained together, replacing the conventional fine-tuning step that follows pruning. First, multi-class classification tasks are run on different model structures and data sets, and a high-performing complex network is trained as the teacher model. Then, channel pruning with randomly chosen pruning ratios is applied to generate multiple student models. Finally, the proposed method is used to improve the accuracy of these student models. Experimental results show that the method effectively recovers the accuracy of each pruned model and achieves a suitable balance between model complexity and accuracy.
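The abstract does not give the exact form of the joint loss, but a standard distillation objective of the kind it describes combines a hard-label cross-entropy term with a temperature-softened KL term against the teacher's outputs. The sketch below is a minimal NumPy illustration of that idea; the weighting scheme (`alpha`) and temperature (`T`) are assumptions, not the paper's adaptive weights.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; higher T softens the distribution.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Joint loss: hard-label cross-entropy plus soft-label KL to the teacher.

    alpha weights the hard-label term; (1 - alpha) weights the distillation
    term. The T**2 factor keeps gradient magnitudes comparable across
    temperatures. (alpha and T are illustrative fixed values, standing in
    for the paper's adaptive joint weighting.)
    """
    n = student_logits.shape[0]
    # Hard-label cross-entropy against the true class labels.
    p_student = softmax(student_logits)
    ce = -np.log(p_student[np.arange(n), labels] + 1e-12).mean()
    # KL divergence between softened teacher and student distributions.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    kl = (p_t * (np.log(p_t + 1e-12) - np.log(p_s + 1e-12))).sum(axis=-1).mean()
    return alpha * ce + (1 - alpha) * (T ** 2) * kl
```

When teacher and student logits agree, the KL term vanishes and only the hard-label term remains; a disagreeing teacher only adds a non-negative penalty, which is what lets each pruned student be pulled back toward the teacher's behaviour without a separate fine-tuning pass.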
