On the learning machine with compensatory aggregation based neurons in quaternionic domain
Author(s) -
Sushil Kumar,
Bipin Kumar Tripathi
Publication year - 2018
Publication title -
Journal of Computational Design and Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.764
H-Index - 24
eISSN - 2288-5048
pISSN - 2288-4300
DOI - 10.1016/j.jcde.2018.04.002
Subject(s) - nonlinear system , generalization , artificial neural network , benchmark (surveying) , computer science , basis function , basis (linear algebra) , maxima and minima , convergence (economics) , radial basis function , activation function , artificial intelligence , algorithm , mathematics , mathematical analysis , geometry , physics , geodesy , quantum mechanics , economic growth , economics , geography
The nonlinear spatial grouping process of synapses is one of the fascinating methodologies through which neuro-computing researchers seek to achieve the computational power of a neuron. Researchers generally use neuron models based on summation (linear), product (linear), or radial basis (nonlinear) aggregation of synapses to construct multi-layered feed-forward neural networks, but each of these neuron models and their corresponding networks has its own advantages and disadvantages. The multi-layered network is generally used for global approximation of an input–output mapping but sometimes gets stuck in local minima, while the nonlinear radial basis function (RBF) network is based on exponentially decaying functions and is used for local approximation of an input–output mapping. These respective advantages and disadvantages motivated the design of two new artificial neuron models based on compensatory aggregation functions in the quaternionic domain. The net internal potentials of these neuron models are formed by composing basic summation (linear) and radial basis (nonlinear) operations on quaternionic-valued input signals. Neuron models based on these aggregation functions ensure faster convergence and better training and prediction accuracy. The learning and generalization capabilities of these neurons are verified on various three-dimensional transformations and time series predictions as benchmark problems.
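The abstract describes a net internal potential that composes a summation (linear) aggregation with a radial basis (nonlinear) aggregation over quaternion-valued inputs. The sketch below is a minimal illustration of that idea, not the paper's exact formulation: the Hamilton product, weight/center names, and the particular compensatory blend (the RBF term modulating the summation potential via a factor `gamma`) are all assumptions made for illustration.

```python
import numpy as np

def qmul(p, q):
    """Hamilton product of two quaternions given as arrays [w, x, y, z]."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    ])

def compensatory_potential(xs, weights, bias, centers, sigma, gamma):
    """Illustrative compensatory net potential on quaternionic inputs.

    xs, weights, centers: arrays of shape (n, 4) (n quaternion inputs);
    bias: shape (4,); sigma > 0 is the RBF width; gamma in [0, 1] trades
    off the linear and radial-basis parts. The blend used here is a
    hypothetical choice, not necessarily the paper's composition.
    """
    # Summation (linear) aggregation: sum of Hamilton products plus bias.
    s = bias + sum(qmul(w, x) for w, x in zip(weights, xs))
    # Radial basis (nonlinear) aggregation: Gaussian of the squared
    # quaternionic distance between inputs and centers (a scalar).
    d2 = sum(np.sum((x - c) ** 2) for x, c in zip(xs, centers))
    r = np.exp(-d2 / (2.0 * sigma ** 2))
    # Compensatory composition: the RBF response modulates the linear
    # potential, interpolating between pure summation (gamma = 1) and
    # fully RBF-gated summation (gamma = 0).
    return (gamma + (1.0 - gamma) * r) * s

# Example: a neuron with 3 quaternionic inputs.
rng = np.random.default_rng(0)
xs = rng.standard_normal((3, 4))
weights = rng.standard_normal((3, 4))
centers = rng.standard_normal((3, 4))
bias = rng.standard_normal(4)
net = compensatory_potential(xs, weights, bias, centers, sigma=1.0, gamma=0.5)
```

A split-type activation (e.g. a componentwise `tanh`) would typically be applied to `net` before passing it to the next layer.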