
Deep learning‐based household electric energy consumption forecasting
Author(s) - Hyeon Jonghwan, Lee HyeYoung, Ko Bowon, Choi HoJin
Publication year - 2020
Publication title - The Journal of Engineering
Language(s) - English
Resource type - Journals
ISSN - 2051-3305
DOI - 10.1049/joe.2019.1219
Subject(s) - computer science , deep learning , artificial intelligence , machine learning , energy consumption , electric energy consumption , sequence learning , mean squared error , metric (unit) , statistics , engineering , electrical engineering
With the advent of various electronic products, household electric energy consumption is continuously increasing, so it has become very important to predict household electric energy consumption accurately. Energy prediction models have been developed for decades using advanced machine learning techniques. Meanwhile, deep learning models are still being actively studied, and many newer models achieve state‐of‐the‐art performance, so it is worthwhile to repeat such experiments with these models. Here, the authors predict household electric energy consumption using deep learning models known to be well suited to time‐series data. Specifically, a vanilla long short‐term memory (LSTM) network, a sequence‐to‐sequence model, and a sequence‐to‐sequence model with an attention mechanism are used to predict the electric energy consumption of a household. The vanilla LSTM shows the best performance on the root‐mean‐square error (RMSE) metric; however, from a graphical point of view, the sequence‐to‐sequence model appears to follow the energy consumption patterns best, while the vanilla LSTM does not track the pattern well. In addition, to achieve their best performance, the vanilla LSTM, sequence‐to‐sequence, and sequence‐to‐sequence‐with‐attention models should observe the past 72, 72, and 24 h, respectively.
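To make the setup concrete, the sketch below illustrates the kind of vanilla LSTM forecaster the abstract describes: a model that reads a 72 h lookback window of hourly consumption and predicts the next hour, evaluated with RMSE. This is not the authors' code; the hidden size, batch size, and single-feature input are illustrative assumptions, and only the 72 h window and the RMSE metric come from the abstract.

    # Minimal sketch (assumed implementation, not the paper's code):
    # a vanilla LSTM that maps a 72-hour window of hourly household
    # consumption to the next hour's value, scored with RMSE.
    import torch
    import torch.nn as nn

    class LSTMForecaster(nn.Module):
        def __init__(self, hidden_size: int = 64):   # hidden size is an assumption
            super().__init__()
            # Input shape: (batch, lookback, 1) -- one feature: hourly consumption.
            self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size, batch_first=True)
            self.head = nn.Linear(hidden_size, 1)    # predict the next hour's value

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            out, _ = self.lstm(x)                    # out: (batch, lookback, hidden_size)
            return self.head(out[:, -1, :])          # use the last time step's state

    # Toy usage with the 72 h lookback the paper reports as best for the vanilla LSTM.
    model = LSTMForecaster()
    window = torch.randn(8, 72, 1)                   # batch of 8 random 72-hour windows
    pred = model(window)                             # shape: (8, 1)

    # RMSE, the evaluation metric used in the paper, on dummy targets.
    target = torch.randn(8, 1)
    rmse = torch.sqrt(nn.functional.mse_loss(pred, target))

The sequence‐to‐sequence variants discussed in the abstract would replace the single linear head with an LSTM decoder (optionally with attention over the encoder states) so that a multi‐step consumption sequence can be emitted instead of a single next‐hour value.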