A Global Optimization Method RasID-GA for Neural Network Training
Author(s) -
Dongkyu Sohn,
Shingo Mabu,
Kaoru Shimada,
Kotaro Hirasawa,
Jinglu Hu
Publication year - 2008
Publication title -
Journal of Advanced Computational Intelligence and Intelligent Informatics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.172
H-Index - 20
eISSN - 1343-0130
pISSN - 1883-8014
DOI - 10.20965/jaciii.2008.p0085
Subject(s) - computer science , artificial neural network , maxima and minima , genetic algorithm , backpropagation , artificial intelligence , global optimization , machine learning , generalization , algorithm , mathematics
This paper applies Adaptive Random search with Intensification and Diversification combined with a Genetic Algorithm (RasID-GA) to neural network training. In previous work, we proposed RasID-GA, which combines the best properties of RasID and the Genetic Algorithm for optimization. Neural networks are widely used in pattern recognition, system modeling, prediction, and other areas. Although most neural network training uses gradient-based schemes such as the well-known back-propagation (BP) algorithm, BP can easily become trapped in local minima. In this paper, we train newly developed multi-branch neural networks using RasID-GA with a constraint coefficient C, by which the feasible solution space is controlled. In addition, we use Mackey-Glass time series prediction to test the generalization ability of the proposed method.
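The Mackey-Glass time series used as the benchmark above comes from a standard chaotic delay differential equation, dx/dt = a·x(t−τ)/(1 + x(t−τ)^n) − b·x(t). A minimal sketch of generating it by Euler integration follows; the parameter values (a=0.2, b=0.1, n=10, τ=17) are the conventional benchmark settings, not values taken from this paper, and the step size and washout length are illustrative assumptions:

```python
import numpy as np

def mackey_glass(n_samples=1000, tau=17, a=0.2, b=0.1, n=10,
                 dt=1.0, x0=1.2, washout=500):
    """Generate the Mackey-Glass chaotic series by Euler integration.

    dx/dt = a*x(t - tau) / (1 + x(t - tau)**n) - b*x(t)
    """
    total = n_samples + washout
    history = int(tau / dt)          # delay expressed in time steps
    x = np.zeros(total + history)
    x[:history] = x0                 # constant initial history
    for t in range(history, total + history - 1):
        x_tau = x[t - history]       # delayed state x(t - tau)
        x[t + 1] = x[t] + dt * (a * x_tau / (1.0 + x_tau ** n) - b * x[t])
    return x[history + washout:]     # drop transient, keep n_samples points

series = mackey_glass()
print(series.shape)  # (1000,)
```

For prediction experiments, the series is typically windowed into (past values, next value) pairs to form training and test sets; a smaller dt with subsampling gives a finer integration if accuracy matters.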