Open Access
Add a SideNet to your MainNet
Author(s) - Adrien Morisot
Publication year - 2022
Publication title - Proceedings of the Northern Lights Deep Learning Workshop
Language(s) - English
Resource type - Journals
ISSN - 2703-6928
DOI - 10.7557/18.6286
Subject(s) - softmax function , computer science , pruning , computational complexity theory , artificial neural network , perceptron , artificial intelligence , layer (electronics) , memory footprint , machine learning , pattern recognition , algorithm
As the performance and popularity of deep neural networks have increased, so too has their computational cost. There are many effective techniques for reducing a network's computational footprint (quantisation, pruning, knowledge distillation), but these lead to models whose computational cost is the same regardless of their input. Human reaction times vary with the complexity of the task at hand: easier tasks, e.g. telling apart dogs from boats, are performed much faster than harder ones, e.g. telling apart two similar-looking breeds of dog. Driven by this observation, we develop a method for adaptive network complexity: we attach a small classification layer, which we call the SideNet, to a large pretrained network, which we call the MainNet. Given an input, the SideNet returns a classification if its confidence level, obtained via softmax, surpasses a user-determined threshold, and passes the input along to the large MainNet for further processing only if its confidence is too low. This allows us to flexibly trade off the network's performance against its computational cost. Experimental results show that simple single-hidden-layer perceptron SideNets added to pretrained ResNet and BERT MainNets allow for substantial decreases in compute with minimal drops in performance on image and text classification tasks.
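The early-exit mechanism the abstract describes can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the linear `sidenet` and `mainnet` stand-ins, the function names, and the threshold value are all hypothetical; only the control flow (answer with the cheap SideNet when its softmax confidence clears a user-set threshold, otherwise defer to the MainNet) follows the description above.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def adaptive_classify(x, sidenet, mainnet, threshold=0.9):
    """Return (label, used_mainnet).

    The cheap SideNet answers when its top softmax probability
    reaches the threshold; otherwise the input is deferred to
    the expensive MainNet.
    """
    probs = softmax(sidenet(x))
    if probs.max() >= threshold:
        return int(probs.argmax()), False
    probs = softmax(mainnet(x))
    return int(probs.argmax()), True

# Toy stand-ins: linear "networks" over 4 features, 3 classes
# (hypothetical; a real SideNet would be a small perceptron
# attached to a pretrained MainNet such as ResNet or BERT).
rng = np.random.default_rng(0)
W_side = rng.normal(size=(4, 3))
W_main = rng.normal(size=(4, 3))
sidenet = lambda x: x @ W_side
mainnet = lambda x: x @ W_main

x = rng.normal(size=4)
label, used_mainnet = adaptive_classify(x, sidenet, mainnet, threshold=0.9)
```

Raising the threshold routes more inputs to the MainNet (higher accuracy, higher cost); lowering it lets the SideNet answer more often, which is exactly the performance/compute trade-off the method exposes.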
