
Comparison of Time Series Forecasting Based on Statistical ARIMA Model and LSTM with Attention Mechanism
Author(s) - Kun Zhou, Wen Yong Wang, Teng Hu, Chen Huang Wu
Publication year - 2020
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1631/1/012141
Subject(s) - autoregressive integrated moving average , computer science , artificial intelligence , task (project management) , time series , machine learning , artificial neural network , recurrent neural network , deep learning , mechanism (biology) , data mining , engineering , systems engineering , philosophy , epistemology
Time Series Forecasting (TSF) has long been a research hotspot and is widely applied in areas such as finance, bioinformatics, the social sciences and engineering. This article compares the forecasting performance of the traditional Auto-Regressive Integrated Moving Average (ARIMA) model with that of a deep neural network model, Long Short-Term Memory (LSTM) with an attention mechanism, which has achieved great success in sequence modelling. We first briefly introduce the basics of ARIMA and of LSTM with attention, and summarize the general steps of constructing an ARIMA model for the TSF task. We obtained the dataset from a Kaggle web-traffic competition and modelled it as a TSF problem. The LSTM model with attention mechanism was then applied to the same task. Finally, the forecasting performance of the two models was compared on the same dataset under several evaluation metrics. Both models achieved results comparable to state-of-the-art methods, and LSTM slightly outperformed its classical counterpart on the TSF task.
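The paper itself details the ARIMA construction steps; as a rough illustration of the idea only, the sketch below shows a toy ARIMA(1,1,0)-style one-step forecast in pure Python: difference the series (the "I" component), estimate an AR(1) coefficient on the differences by least squares, then undo the differencing. The series and helper names are hypothetical, and a real analysis would use a library such as statsmodels rather than this hand-rolled version.

```python
def difference(series):
    """First-order differencing (the 'I' in ARIMA) to remove trend."""
    return [b - a for a, b in zip(series, series[1:])]

def fit_ar1(diffs):
    """Least-squares estimate of the AR(1) coefficient on the differenced series."""
    x, y = diffs[:-1], diffs[1:]
    return sum(a * b for a, b in zip(x, y)) / sum(a * a for a in x)

def forecast_next(series):
    """One-step-ahead forecast: predict the next difference, then integrate back."""
    diffs = difference(series)
    phi = fit_ar1(diffs)
    return series[-1] + phi * diffs[-1]

# Hypothetical upward-trending toy series
history = [10.0, 12.0, 13.0, 15.0, 16.0, 18.0]
print(round(forecast_next(history), 3))  # one-step-ahead forecast: 19.6
```

This captures only the AR and differencing components; a full ARIMA(p,d,q) fit also estimates moving-average terms and selects (p, d, q) via criteria such as AIC, as the construction steps summarized in the paper describe.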