
Incremental regularized extreme learning machine based on Cholesky factorization and its application to time series prediction
Author(s) -
Xian Zhou,
Hongli Wang
Publication year - 2011
Publication title -
Wuli Xuebao (Acta Physica Sinica)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.199
H-Index - 47
ISSN - 1000-3290
DOI - 10.7498/aps.60.110201
Subject(s) - cholesky decomposition , computer science , extreme learning machine , chaotic , series (stratigraphy) , generalization , algorithm , artificial neural network , factorization , time series , machine learning , artificial intelligence , mathematics , paleontology , mathematical analysis , eigenvalues and eigenvectors , physics , quantum mechanics , biology
To solve the hidden-layer neuron determination problem of the regularized extreme learning machine (RELM) applied to chaotic time series prediction, a new algorithm based on Cholesky factorization is proposed. First, an RELM-based prediction model with one hidden-layer neuron is constructed; a new hidden-layer neuron is then added to the prediction model at each training step until the generalization performance of the prediction model reaches its peak. Thus, the optimal network structure of the prediction model is determined. In the training procedure, Cholesky factorization is used to calculate the output weights of the RELM. Experiments on chaotic time series prediction indicate that the algorithm can effectively determine the optimal network structure of the RELM, and that the prediction model trained by the algorithm achieves excellent prediction accuracy at low computational cost.
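The abstract's procedure can be sketched in a few lines of NumPy. This is a hypothetical illustration, not the paper's implementation: the RELM output weights solve the regularized normal equations (HᵀH + λI)β = HᵀT, whose coefficient matrix is symmetric positive definite and therefore admits a Cholesky factorization; the hidden layer is then grown one neuron at a time and the size with the best validation error is kept. (The paper's incremental scheme presumably updates the Cholesky factor when a neuron is added; for clarity this sketch simply refactorizes at each step. All function names, the `tanh` activation, and the regularization value are assumptions.)

```python
import numpy as np

def relm_cholesky(X, T, n_hidden, lam=1e-3, seed=0):
    """Train a regularized ELM; output weights are obtained via
    Cholesky factorization of the SPD matrix H^T H + lam*I.
    Hypothetical sketch, not the paper's code."""
    rng = np.random.default_rng(seed)
    # Random hidden-layer parameters, as in standard ELM practice.
    W = rng.standard_normal((X.shape[1], n_hidden))
    b = rng.standard_normal(n_hidden)
    H = np.tanh(X @ W + b)                    # hidden-layer output matrix
    A = H.T @ H + lam * np.eye(n_hidden)      # regularized Gram matrix (SPD)
    L = np.linalg.cholesky(A)                 # A = L L^T
    # Solve A beta = H^T T by two triangular solves.
    y = np.linalg.solve(L, H.T @ T)
    beta = np.linalg.solve(L.T, y)
    return W, b, beta

def incremental_relm(X, T, X_val, T_val, max_hidden=30, lam=1e-3, seed=0):
    """Grow the hidden layer one neuron at a time, keeping the size
    whose validation RMSE is lowest (peak generalization)."""
    best_err, best_model = np.inf, None
    for k in range(1, max_hidden + 1):
        W, b, beta = relm_cholesky(X, T, k, lam, seed)
        pred = np.tanh(X_val @ W + b) @ beta
        err = float(np.sqrt(np.mean((pred - T_val) ** 2)))
        if err < best_err:
            best_err, best_model = err, (W, b, beta)
    return best_model, best_err
```

A toy usage: fit a noisy one-step-ahead map and let the validation error pick the hidden-layer size. Refactorizing from scratch each step costs O(k³) per size; the appeal of an incremental Cholesky update, as the title suggests, is to avoid repeating that work as the network grows.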