Open Access
Hypercomplex neural network in time series forecasting of stock data
Author(s): Radosław Kycia, Agnieszka Niemczynowicz
Publication year: 2024
Three classes of architectures for time series prediction were tested. They differ in their input layers, which contain either convolutional, LSTM, or dense hypercomplex layers for 4D algebras. The input consisted of four related Stock Market time series, and the task was to predict one of them. Hyperparameters were optimized for each class of architecture in order to compare the best neural networks within each class. The results show that in most cases the architecture with a hypercomplex dense layer provides MAE accuracy similar to the other architectures, but with considerably fewer trainable parameters. As a result, hypercomplex neural networks can be trained, and can process data, faster than the other tested architectures. Moreover, the order of the input time series has an impact on the effectiveness of prediction.
Language(s): English
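The parameter saving reported in the abstract comes from how a hypercomplex (here, quaternion) dense layer shares weights: one 4D quaternion weight, applied via the Hamilton product, replaces a full 4×4 real weight block, giving roughly a 4× reduction over an ordinary dense layer on the same 4-channel data. The sketch below is a minimal numpy illustration of this idea, not the authors' implementation; the function names and the choice of the quaternion algebra (one of several possible 4D algebras) are assumptions for illustration.

```python
import numpy as np

def hamilton_matrix(w):
    """Real 4x4 matrix of left-multiplication by quaternion w = a + bi + cj + dk."""
    a, b, c, d = w
    return np.array([
        [a, -b, -c, -d],
        [b,  a, -d,  c],
        [c,  d,  a, -b],
        [d, -c,  b,  a],
    ])

def quaternion_dense(x, W):
    """Quaternion dense layer: n quaternion inputs -> m quaternion outputs.

    x: (n, 4) array, each row one quaternion input.
    W: (m, n, 4) array of quaternion weights.
    Each weight contributes 4 real parameters, versus 16 for the
    corresponding 4x4 block of a real dense layer.
    """
    m, n, _ = W.shape
    y = np.zeros((m, 4))
    for i in range(m):
        for j in range(n):
            y[i] += hamilton_matrix(W[i, j]) @ x[j]
    return y

# Sanity check on the algebra: i * j = k.
k = hamilton_matrix([0, 1, 0, 0]) @ np.array([0, 0, 1, 0])

# Parameter comparison for mapping 4n real inputs to 4m real outputs
# (illustrative sizes; bias terms omitted for simplicity):
n, m = 8, 8
quaternion_params = 4 * n * m            # 256
real_dense_params = (4 * n) * (4 * m)    # 1024, i.e. 4x more
```

This weight sharing is what lets the hypercomplex dense architecture match the MAE of the convolutional and LSTM variants with considerably fewer trainable parameters.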