Open Access
Acceleration of Target Detection Based on Forced Knowledge Distillation
Author(s) - Jing Wen, Chenggui Gong
Publication year - 2019
Publication title - IOP Conference Series: Materials Science and Engineering
Language(s) - English
Resource type - Journals
eISSN - 1757-899X
pISSN - 1757-8981
DOI - 10.1088/1757-899x/612/3/032007
Subject(s) - pascal (unit) , computer science , distillation , acceleration , artificial intelligence , computation , feature extraction , machine learning , algorithm , chemistry , physics , organic chemistry , classical mechanics , programming language
In recent years, deep learning has achieved outstanding results on many problems, such as computer vision and natural language processing. Research on network model compression and acceleration enables models to run efficiently on resource-constrained devices by greatly reducing the amount of computation while only slightly degrading performance. At present, knowledge distillation has achieved good results on classification tasks, but it shows strong limitations on more complex tasks such as detection. In this work, we propose a forced knowledge distillation framework that focuses on improving feature extraction for detection, so that the whole model can be compact and fast without losing much accuracy. We conduct a comprehensive evaluation on the PASCAL VOC, KITTI and MS COCO datasets. The results show that the forced knowledge distillation framework can fully learn the knowledge of the teacher network and achieve better detection results with smaller student networks.
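The abstract describes distilling a teacher detector's feature-extraction ability into a smaller student. The paper's exact "forced" loss is not given here, so the sketch below shows a common hint-style feature-imitation term on which such frameworks typically build: the student's feature map is projected to the teacher's channel dimension by a 1x1 adapter and penalized by mean squared error against the teacher's feature map. All names (`feature_distillation_loss`, `adapter_w`) are illustrative assumptions, not the authors' API.

```python
import numpy as np

def feature_distillation_loss(student_feat, teacher_feat, adapter_w):
    """Hint-style feature-imitation loss (a generic formulation, not
    the paper's specific 'forced' loss, which is not stated here).

    student_feat: (C_s, H, W) student backbone feature map
    teacher_feat: (C_t, H, W) teacher backbone feature map
    adapter_w:    (C_t, C_s) weights of a 1x1 conv mapping student
                  channels to teacher channels
    """
    # A 1x1 convolution is a per-pixel linear map over channels.
    adapted = np.einsum('ts,shw->thw', adapter_w, student_feat)
    # Mean squared error between adapted student and teacher features.
    return np.mean((adapted - teacher_feat) ** 2)
```

In practice this term would be added to the usual detection losses (classification and box regression) with a weighting coefficient, and the adapter would be trained jointly with the student.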
