Reformulated radial basis function neural networks with adjustable weighted norms
Author(s) -
Randolph‐Gips Mary M.,
Karayiannis Nicolaos B.
Publication year - 2003
Publication title -
International Journal of Intelligent Systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.291
H-Index - 87
eISSN - 1098-111X
pISSN - 0884-8173
DOI - 10.1002/int.10133
Subject(s) - radial basis function , artificial neural network , computer science , feedforward neural network , euclidean distance , artificial intelligence , hierarchical rbf , activation function , basis (linear algebra) , radial basis function network , algorithm , mathematics
This article presents a new family of reformulated radial basis function (RBF) neural networks that employ adjustable weighted norms to measure the distance between the training vectors and the centers of the radial basis functions. The reformulated RBF model introduced in this article incorporates norm weights that can be updated during learning to facilitate the implementation of the desired input‐output mapping. Experiments involving classification and function approximation tasks verify that the proposed RBF neural networks outperform conventional RBF neural networks and reformulated RBF neural networks employing fixed Euclidean norms. Reformulated RBF neural networks with adjustable weighted norms are also strong competitors to conventional feedforward neural networks in terms of performance, implementation simplicity, and training speed. © 2003 Wiley Periodicals, Inc.
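The core idea in the abstract, replacing the fixed Euclidean norm in each radial basis function with an adjustable weighted norm whose weights are updated during learning, can be illustrated with a minimal sketch. The code below is an assumption-laden toy (diagonal norm weights, Gaussian activations, plain gradient descent on a 1-D regression task), not the authors' exact formulation or training procedure:

```python
import numpy as np

# Minimal sketch: an RBF layer whose distance is a learnable weighted norm,
#   d_j(x)^2 = sum_i (w_i * (x_i - c_{j,i}))^2,
# with Gaussian responses exp(-d_j(x)^2). The norm weights w_i are updated
# by gradient descent alongside the linear output weights.

rng = np.random.default_rng(0)

def rbf_forward(X, centers, norm_w):
    # Weighted squared distances between inputs and RBF centers.
    diff = X[:, None, :] - centers[None, :, :]     # shape (n, k, d)
    d2 = np.sum((norm_w * diff) ** 2, axis=2)      # shape (n, k)
    return np.exp(-d2)                             # Gaussian activations

# Toy task: approximate y = sin(x) with k RBF units.
X = np.linspace(-3, 3, 60)[:, None]
y = np.sin(X[:, 0])
k, d = 8, 1
centers = rng.uniform(-3, 3, (k, d))
norm_w = np.ones(d)        # adjustable norm weights, updated during learning
out_w = np.zeros(k)        # linear output weights

lr = 0.05
for _ in range(2000):
    Phi = rbf_forward(X, centers, norm_w)
    err = Phi @ out_w - y
    # Gradient step on the output weights (linear layer).
    out_w -= lr * Phi.T @ err / len(X)
    # Gradient step on the norm weights:
    #   dPhi_{nj}/dw_i = Phi_{nj} * (-2 * w_i * diff_{nji}^2)
    diff = X[:, None, :] - centers[None, :, :]
    g = (err[:, None] * out_w[None, :] * Phi)[:, :, None] \
        * (-2.0 * norm_w * diff ** 2)
    norm_w -= lr * g.sum(axis=(0, 1)) / len(X)

Phi = rbf_forward(X, centers, norm_w)
mse = np.mean((Phi @ out_w - y) ** 2)
```

Freezing `norm_w` at 1 recovers an ordinary fixed-norm Gaussian RBF network, which is the baseline the article reports improvements over.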