Network training and architecture optimization by a recursive approach and a modified genetic algorithm
Author(s) - Jiang JianHui, Wang JiHong, Song XinHua, Yu RuQin
Publication year - 1996
Publication title - Journal of Chemometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.47
H-Index - 92
eISSN - 1099-128X
pISSN - 0886-9383
DOI - 10.1002/(sici)1099-128x(199605)10:3<253::aid-cem420>3.0.co;2-z
Subject(s) - crossover, computer science, genetic algorithm, convergence (economics), mathematical optimization, algorithm, population based incremental learning, artificial neural network, feedforward neural network, feed forward, mathematics, artificial intelligence, engineering, economics, economic growth, control engineering
A recursive algorithm is proposed for optimizing the architecture of feedforward neural networks by the stepwise addition of a reasonable number of hidden nodes. At each step the algorithm retains the calculation results and approximation precision obtained in the previous iteration and reuses them, substantially lightening the computational burden of network optimization and training. The standard genetic algorithm has been modified for network training to circumvent the local optimum problem: two new genetic operators, competition and self‐reproduction, are introduced and combined with substantially modified crossover and mutation operators to form a modified genetic algorithm (MGA) that ensures asymptotic convergence to the global optimum with relatively high efficiency. The proposed methods have been successfully applied to concentration estimation in chemical analysis and to quantitative structure‐activity relationship studies of chemical compounds.
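The abstract gives no implementation details of the MGA, so the following is only an illustrative toy sketch, not the authors' method: a real-coded genetic algorithm training the weights of a tiny 1-3-1 feedforward network on a toy regression task. The "competition" and "self-reproduction" operators are approximated here, as assumptions, by binary tournament selection and elitist copying of the best individual; the dataset, network size, and all parameter values are likewise invented for the example.

```python
import random
import math

random.seed(0)

# Toy dataset: learn y = x^2 on [-1, 1] (invented for this sketch).
DATA = [(x / 10.0, (x / 10.0) ** 2) for x in range(-10, 11)]
N_HIDDEN = 3
# Chromosome layout: input->hidden weights, hidden biases,
# hidden->output weights, output bias.
N_W = N_HIDDEN * 3 + 1

def predict(w, x):
    out = w[-1]
    for h in range(N_HIDDEN):
        a = math.tanh(w[h] * x + w[N_HIDDEN + h])  # hidden activation
        out += w[2 * N_HIDDEN + h] * a
    return out

def fitness(w):
    # Negative mean squared error over the dataset: higher is better.
    return -sum((predict(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def crossover(a, b):
    # Arithmetic (blend) crossover on real-valued chromosomes.
    t = random.random()
    return [t * ai + (1 - t) * bi for ai, bi in zip(a, b)]

def mutate(w, sigma=0.3, rate=0.2):
    # Gaussian perturbation of each gene with small probability.
    return [g + random.gauss(0, sigma) if random.random() < rate else g
            for g in w]

def evolve(pop_size=30, generations=200):
    pop = [[random.uniform(-1, 1) for _ in range(N_W)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        # "Self-reproduction" stand-in: the best individual is copied
        # unchanged into the next generation (elitism).
        nxt = [pop[0][:]]
        while len(nxt) < pop_size:
            # "Competition" stand-in: binary tournament selection.
            p1 = max(random.sample(pop, 2), key=fitness)
            p2 = max(random.sample(pop, 2), key=fitness)
            nxt.append(mutate(crossover(p1, p2)))
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
print(f"final MSE: {-fitness(best):.4f}")
```

Elitism guarantees the best fitness found never degrades between generations, which is the usual mechanism behind asymptotic-convergence arguments for genetic algorithms; the recursive node-addition scheme of the paper (reusing results from the previous architecture when a hidden node is added) is not reproduced here.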