Stability, robustness and approximation properties of gradient recurrent high‐order neural networks
Author(s) -
Kosmatopoulos E. B.,
Christodoulou M. A.
Publication year - 1994
Publication title -
International Journal of Adaptive Control and Signal Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.73
H-Index - 66
eISSN - 1099-1115
pISSN - 0890-6327
DOI - 10.1002/acs.4480080408
Subject(s) - robustness , content addressable memory , artificial neural network , dynamical systems theory , recurrent neural network , computer science , associative property , stability (learning theory) , mathematics , artificial intelligence , machine learning
Gradient recurrent high‐order neural networks (GRHONNs) have found wide applicability in optimization, associative memory, adaptive signal‐processing and system identification problems. In this paper we show that these neural networks are asymptotically stable dynamical systems and, moreover, that their solutions remain stable under either deterministic or stochastic disturbances. We also prove that GRHONNs are dense in the space of continuous dynamical systems, even though their vector fields do not satisfy the hypotheses of the Stone‐Weierstrass theorem. The significance of these theoretical results for associative memory and optimization, and for the fabrication of general‐purpose, large‐scale hardware implementations, is discussed.
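To make the kind of system the abstract refers to concrete, the following is a minimal sketch (not the paper's exact formulation) of a recurrent high‐order network of the general form ẋ = −x + W z(x), where z(x) collects high‐order terms: sigmoids of the state and pairwise products of those sigmoids. The weight scale, step size and dimensions below are illustrative assumptions; with the −x term dominant and bounded high‐order terms, trajectories settle to an equilibrium, illustrating the asymptotic-stability property discussed in the abstract.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def z(x):
    """High-order regressor: sigmoids plus pairwise sigmoid products."""
    s = sigmoid(x)
    pairs = [s[i] * s[j] for i in range(len(s)) for j in range(i + 1, len(s))]
    return np.concatenate([s, np.array(pairs)])

rng = np.random.default_rng(0)
n = 3                                # state dimension (illustrative)
m = n + n * (n - 1) // 2             # number of high-order terms
W = rng.normal(scale=0.3, size=(n, m))  # small weights keep -x dominant

x = rng.normal(size=n)
dt = 0.05
for _ in range(2000):                # forward-Euler integration of x' = -x + W z(x)
    x = x + dt * (-x + W @ z(x))

# At an equilibrium the vector field vanishes, so this residual is ~0
residual = np.linalg.norm(-x + W @ z(x))
```

Because each component of z(x) is bounded in [0, 1] and the weights are small, the linear damping term −x dominates and the Euler iteration contracts onto a fixed point; the residual after integration is numerically negligible.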