Convergence of Batch Split-Complex Backpropagation Algorithm for Complex-Valued Neural Networks
Author(s) -
Huisheng Zhang,
Chao Zhang,
Wei Wu
Publication year - 2009
Publication title -
Discrete Dynamics in Nature and Society
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.264
H-Index - 39
eISSN - 1607-887X
pISSN - 1026-0226
DOI - 10.1155/2009/329173
Subject(s) - backpropagation , artificial neural network , algorithm , convergence (mathematics) , monotone sequence , constant learning rate , error function , gradient , rate of convergence , recurrent neural network , types of artificial neural networks , mathematics , computer science , artificial intelligence
The batch split-complex backpropagation (BSCBP) algorithm for training complex-valued neural networks is considered. For a constant learning rate, it is proved that the error function of the BSCBP algorithm decreases monotonically during the training process and that the gradient of the error function tends to zero. Under an additional mild condition, the weight sequence itself is also proved to converge. A numerical example is given to support the theoretical analysis.
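The split-complex scheme referred to in the abstract can be illustrated with a minimal, assumption-laden sketch: a single linear layer whose complex pre-activation has a real tanh applied separately to its real and imaginary parts (the "split" approach), trained by full-batch gradient descent with a small constant learning rate. The model, data, step size, and all function names here are illustrative, not the paper's exact network; the point is only that, for a small enough constant learning rate, the batch error decreases monotonically, consistent with the stated theorem.

```python
import numpy as np

def f(x):
    # Real activation applied separately to real and imaginary parts.
    return np.tanh(x)

def df(x):
    # Derivative of tanh.
    return 1.0 - np.tanh(x) ** 2

def batch_error(w, Z, D):
    # E(w) = (1/2) * sum over the batch of |output - target|^2.
    u = Z @ w                                # complex pre-activations
    O = f(u.real) + 1j * f(u.imag)           # split-complex activation
    return float(0.5 * np.sum(np.abs(O - D) ** 2))

def batch_gradient(w, Z, D):
    # Gradient of E with respect to the real part a and imaginary part b
    # of w = a + i*b, computed separately and repacked as a complex vector.
    u = Z @ w
    sr = (f(u.real) - D.real) * df(u.real)   # real error signal
    si = (f(u.imag) - D.imag) * df(u.imag)   # imaginary error signal
    ga = sr @ Z.real + si @ Z.imag           # dE/da
    gb = -sr @ Z.imag + si @ Z.real          # dE/db
    return ga + 1j * gb

# Illustrative data: realizable targets generated by a hidden weight vector.
rng = np.random.default_rng(0)
m, n = 20, 3
Z = rng.standard_normal((m, n)) + 1j * rng.standard_normal((m, n))
u_true = Z @ (rng.standard_normal(n) + 1j * rng.standard_normal(n))
D = f(u_true.real) + 1j * f(u_true.imag)

# Batch gradient descent with a small constant learning rate.
w = rng.standard_normal(n) + 1j * rng.standard_normal(n)
eta = 0.002
errors = []
for _ in range(500):
    errors.append(batch_error(w, Z, D))
    w = w - eta * batch_gradient(w, Z, D)
```

Running the loop and inspecting `errors` shows a monotonically non-increasing sequence, matching the monotonicity claim; a learning rate that is too large would break this behavior.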
Accelerating Research
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom