Open Access
Online Tuning of Hyperparameters in Deep LSTM for Time Series Applications
Author(s) -
Norah Bakhashwain,
Alaa Sagheer
Publication year - 2021
Publication title -
International Journal of Intelligent Engineering and Systems
Language(s) - English
Resource type - Journals
eISSN - 2185-310X
pISSN - 1882-708X
DOI - 10.22266/ijies2021.0228.21
Subject(s) - hyperparameter , computer science , deep learning , artificial intelligence , machine learning , artificial neural network , deep neural networks , computation , recurrent neural network , algorithm
Deep learning is one of the most remarkable trends in artificial intelligence, standing behind numerous recent achievements in domains such as speech processing and computer vision. These achievements have drawn great attention to employing deep learning in time series modelling and forecasting. Deep learning algorithms are built on neural networks with multiple hidden layers, which makes the computation of a deep neural network challenging and sometimes complex. The reason for this complexity is that obtaining an outstanding and consistent result from such a deep architecture requires optimizing many parameters known as hyperparameters. Hyperparameter tuning therefore plays a critical role in improving the performance of deep learning. This paper proposes an approach that tunes the hyperparameters of the deep long short-term memory (DLSTM) model online, in a dynamic fashion. The proposed approach adapts to any time series-based application, particularly applications that process streams of data. The experimental results show that dynamic tuning of the DLSTM hyperparameters performs better than the conventional static tuning.
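The abstract contrasts static tuning (hyperparameters fixed once, before training) with online tuning (hyperparameters re-selected as new stream data arrives). The paper's actual DLSTM procedure is not given here, so the following is only a minimal, dependency-free sketch of the online-tuning idea: on each new window of the stream, a small grid of candidate hyperparameter values is re-evaluated on recent data and the best one is kept. The forecaster, the grid, and the window size are all illustrative stand-ins (a simple exponential-smoothing model replaces the DLSTM), not the authors' method.

```python
def one_step_mse(series, alpha):
    """Mean squared one-step-ahead error of exponential smoothing
    with smoothing factor alpha (stand-in for a DLSTM's validation loss)."""
    level, sq_err, n = series[0], 0.0, 0
    for x in series[1:]:
        sq_err += (x - level) ** 2
        n += 1
        level = alpha * x + (1 - alpha) * level
    return sq_err / max(n, 1)

def tune_online(stream, window=20, grid=(0.1, 0.3, 0.5, 0.7, 0.9)):
    """Dynamic tuning: re-select the best hyperparameter on each new
    window of the stream, instead of fixing it once up front."""
    history, chosen = [], []
    for x in stream:
        history.append(x)
        recent = history[-window:]
        if len(recent) >= 3:
            # pick the candidate with the lowest recent one-step error
            best = min(grid, key=lambda a: one_step_mse(recent, a))
        else:
            best = grid[len(grid) // 2]  # default until enough data arrives
        chosen.append(best)
    return chosen
```

On a steadily trending stream this loop settles on the most responsive candidate (here the largest alpha), and if the stream's dynamics later change, the chosen value can change with them; that adaptivity is what a one-shot static search cannot provide.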
