
A REVIEW ON HYPER-PARAMETER OPTIMISATION BY DEEP LEARNING EXPERIMENTS
Author(s) -
Rohan Bhattacharjee,
Debjyoti Ghosh,
Abhishek Mazumder
Publication year - 2021
Publication title -
Journal of Mathematical Sciences and Computational Mathematics
Language(s) - English
Resource type - Journals
eISSN - 2688-8300
pISSN - 2644-3368
DOI - 10.15864/jmscm.2407
Subject(s) - computer science , artificial intelligence , machine learning , deep learning , process (computing) , set (abstract data type) , programming language , operating system
It has been found that during the runtime of a deep learning experiment, intermediate result values are often discarded as the processes move forward. This loss of data forces the experiment to roll back to an earlier point, after which the hyper-parameters or results become difficult to recover, especially for a large set of experimental data. Hyper-parameters are the constraints and settings that a learning model requires to generalise across distinct data patterns and to control the learning process. These hyper-parameters must be chosen and optimised properly so that the learning model can solve the given machine learning problem and so that a specific performance objective for the algorithm on a dataset is optimised during training. This review paper presents a Parameter Optimisation for Learning (POL) model that captures the end-to-end features of a deep learning experiment through an application programming interface (API), providing the means of storing, retrieving and examining parameter settings and intermediate values. To further ease the optimisation of hyper-parameters, the model incorporates optimisation functions, analysis and data management. Moreover, the proposed model is highly interactive and is being shared among a number of machine learning practitioners, adding further utility for data management.
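The abstract does not specify the POL API itself; the sketch below is only a minimal illustration, under stated assumptions, of the idea it describes: persisting hyper-parameter settings and intermediate results as a search runs, so they can be retrieved and examined later rather than lost when the process moves on. All names (ExperimentStore, save_trial, best_trial, train_model) are hypothetical and not taken from the paper.

```python
import json
import random
from pathlib import Path


class ExperimentStore:
    """Hypothetical store for hyper-parameter settings and intermediate values.

    Each trial is written to disk as JSON, so interim results survive the run
    and can be retrieved or examined afterwards.
    """

    def __init__(self, root="pol_experiments"):
        self.root = Path(root)
        self.root.mkdir(exist_ok=True)

    def save_trial(self, trial_id, hyper_params, metrics):
        # Persist one trial's settings and results.
        record = {"trial_id": trial_id, "hyper_params": hyper_params, "metrics": metrics}
        (self.root / f"trial_{trial_id}.json").write_text(json.dumps(record, indent=2))

    def load_trials(self):
        # Retrieve all stored trials for later analysis.
        return [json.loads(p.read_text()) for p in sorted(self.root.glob("trial_*.json"))]

    def best_trial(self, metric="val_accuracy"):
        # Examine stored trials and return the one with the best metric value.
        trials = self.load_trials()
        return max(trials, key=lambda t: t["metrics"][metric]) if trials else None


def train_model(hyper_params):
    """Placeholder for an actual training run; returns a dummy validation score."""
    return {"val_accuracy": random.random()}


if __name__ == "__main__":
    store = ExperimentStore()
    search_space = {"learning_rate": [1e-2, 1e-3, 1e-4], "batch_size": [32, 64, 128]}

    # Simple random search over hyper-parameters: every trial's settings and
    # results are persisted, so nothing is lost if the experiment is interrupted.
    for trial_id in range(5):
        hp = {k: random.choice(v) for k, v in search_space.items()}
        metrics = train_model(hp)
        store.save_trial(trial_id, hp, metrics)

    print("Best trial so far:", store.best_trial())
```

In this sketch the optimisation strategy is a plain random search; the same storage layer could sit behind grid search or more elaborate optimisation functions, which is the role the abstract attributes to the POL model's API.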