Minimization Algorithm for Training Feed Forward Neural Network
Author(s) -
Khalil K. Abbo,
Zena T. Yaseen
Publication year - 2013
Publication title -
mağallaẗ al-tarbiyaẗ wa-al-ʻilm (Journal of Education and Science)
Language(s) - English
Resource type - Journals
eISSN - 2664-2530
pISSN - 1812-125X
DOI - 10.33899/edusj.2013.89659
Subject(s) - backpropagation , artificial neural network , feedforward neural network , artificial intelligence , computer science , algorithm , mathematics , quadratic equation , rprop , types of artificial neural networks
In this paper we suggest a new learning rate that improves the classical Backpropagation (BP) algorithm. The derivation is based on approximating the error function E by a quadratic in a sufficiently small neighborhood of the optimal weight vector. The suggested algorithm (called Spectral Backpropagation, SBP) is tested, and the experimental results show that the SBP learning strategy improves on the methods considered.

1. Introduction

The batch training of a feed-forward neural network (FNN) is consistent with the theory of unconstrained optimization [5] and can be viewed as the minimization of the function E; that is, to find a minimizer $w^* = (w_1^*, \dots, w_n^*) \in \mathbb{R}^n$ such that

$$E(w^*) = \min_{w \in \mathbb{R}^n} E(w) \qquad (1)$$

where E is the batch error measure defined as
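The batch-training view above can be sketched in code: compute the batch error E over all training patterns, take a full-gradient step, and adapt the learning rate between iterations. The paper's exact SBP learning-rate formula is not reproduced in this excerpt, so the sketch below substitutes a Barzilai-Borwein ("spectral") step size as an illustrative assumption; the network (a single sigmoid layer), the data, and the function names are likewise hypothetical.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def batch_error(w, X, t):
    """Batch error E(w): half the summed squared output error over all patterns."""
    y = sigmoid(X @ w)
    return 0.5 * np.sum((y - t) ** 2)

def grad_E(w, X, t):
    """Gradient of E(w) for the single sigmoid layer (batch backpropagation)."""
    y = sigmoid(X @ w)
    return X.T @ ((y - t) * y * (1.0 - y))

def train_spectral_bp(X, t, w0, lr0=0.5, epochs=200):
    """Batch gradient descent with a spectral (Barzilai-Borwein-style)
    learning rate -- an assumed stand-in for the paper's SBP rate."""
    w = w0.copy()
    g = grad_E(w, X, t)
    lr = lr0                                  # first step uses a fixed rate
    for _ in range(epochs):
        w_new = w - lr * g
        g_new = grad_E(w_new, X, t)
        s, y = w_new - w, g_new - g           # weight and gradient differences
        if abs(s @ y) > 1e-12:
            # BB step (s's)/(s'y), clipped to keep the step positive and bounded
            lr = float(np.clip((s @ s) / (s @ y), 1e-4, 1.0))
        w, g = w_new, g_new
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 3))
t = sigmoid(X @ np.array([1.0, -2.0, 0.5]))   # realizable targets
w = train_spectral_bp(X, t, w0=np.zeros(3))
```

The point of the adaptive step is that, near the minimizer, E behaves like a quadratic (as the derivation in the paper assumes), and a spectral step size approximates the inverse curvature along the most recent step direction.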