
Simplifying Long Short-Term Memory for Fast Training and Time Series Prediction
Author(s) - Yuyan Zhang, Xin Hao, Yong Liu
Publication year - 2019
Publication title - Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1213/4/042039
Subject(s) - univariate, computer science, term (time), computation, set (abstract data type), series (stratigraphy), time series, long short term memory, long term prediction, multivariate statistics, algorithm, short term memory, data set, training set, artificial intelligence, machine learning, artificial neural network, working memory, paleontology, telecommunications, physics, cognition, quantum mechanics, neuroscience, recurrent neural network, biology, programming language
Long Short-Term Memory (LSTM) has been widely used for sequential problems. However, for time series prediction, its complex structure limits both its running speed and its performance. To address this problem, this paper simplifies the standard LSTM model by reducing the number of gates and the parameters involved in gate computation. Experiments on a univariate data set and a multivariate data set show that the proposed simplified model not only achieves better accuracy but also runs faster.