Dropout - A Detailed Survey
Author(s) - Sakshi S Lad
Publication year - 2021
Publication title - International Journal for Research in Applied Science and Engineering Technology
Language(s) - English
Resource type - Journals
ISSN - 2321-9653
DOI - 10.22214/ijraset.2021.36499
Subject(s) - dropout (neural networks), overfitting, artificial neural network, computer science, artificial intelligence, machine learning, deep learning
Deep neural networks are complex models with very large numbers of parameters. Shortlisting the parameters that influence a model's predictions is not feasible, as each carries comparable significance. These networks have powerful learning capacity and can fit the training data well; in most such settings, however, the models overfit. Combining predictions from large neural networks whose neurons are co-dependent degrades the performance of the model. Dropout addresses the problems of overfitting and slow convergence in deep neural networks. The core idea of the dropout technique is to randomly drop units, along with their connections, from the network during the training phase. This prevents units from co-adapting and thereby improves performance. The central mechanism behind dropout is to take a large model that overfits easily and repeatedly sample and train smaller sub-models from it. This paper provides an introduction to dropout, the history behind its design, and various dropout methods.
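As a minimal sketch of the mechanism the abstract describes (randomly zeroing units during training so each update trains a smaller sub-model), the NumPy snippet below implements the common "inverted dropout" convention; the function name and the keep-and-rescale scaling are illustrative assumptions, not details taken from the paper.

import numpy as np

def dropout_forward(x, p_drop=0.5, training=True, rng=None):
    # Inverted dropout: during training, zero each unit with
    # probability p_drop and rescale survivors by 1 / (1 - p_drop),
    # so activations keep the same expected value and no rescaling
    # is needed at inference time.
    if not training or p_drop == 0.0:
        return x  # at inference the full network is used unchanged
    rng = np.random.default_rng() if rng is None else rng
    # Bernoulli mask: True keeps a unit, False drops it and,
    # effectively, all of its outgoing connections
    mask = rng.random(x.shape) >= p_drop
    return x * mask / (1.0 - p_drop)

# Example: apply dropout to one hidden-layer activation vector
h = np.array([0.2, 1.5, -0.7, 3.0])
print(dropout_forward(h, p_drop=0.5, training=True))

Each training step draws a fresh mask, so successive steps train different sub-networks sampled from the same large model, which is what discourages units from co-adapting.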
