Open Access
Two novel finite time convergent recurrent neural networks for tackling complex-valued systems of linear equation
Author(s) -
Lei Ding,
Lin Xiao,
Kailiang Zhou,
YongHong Lan,
Yongsheng Zhang
Publication year - 2020
Publication title -
Filomat
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.449
H-Index - 34
eISSN - 2406-0933
pISSN - 0354-5180
DOI - 10.2298/fil2015009d
Subject(s) - activation function, artificial neural network, nonlinear system, convergence, sign function, rate of convergence, mathematics, mathematical analysis, computer science, artificial intelligence
Compared with a linear activation function, a suitable nonlinear activation function can accelerate the convergence speed. Based on this finding, this paper proposes two modified Zhang neural network (ZNN) models that use different nonlinear activation functions to tackle complex-valued systems of linear equations (CVSLE). First, a novel neural network, the NRNN-SBP model, is constructed by introducing the sign-bi-power activation function. A second novel network, the NRNN-IRN model, is then constructed by introducing a tunable activation function. Simulation results demonstrate that both NRNN-SBP and NRNN-IRN converge faster than the existing FTRNN model. These results also reveal that different nonlinear activation functions affect the convergence rate differently depending on the CVSLE problem at hand.
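To make the design concrete, the sketch below simulates a ZNN with a sign-bi-power activation on a small CVSLE, A x = b, using simple Euler integration. The ZNN error is e = A x − b and the design formula de/dt = −γ Φ(e) yields dx/dt = −γ A⁻¹ Φ(A x − b). The activation form Φ(u) = ½|u|^p sign(u) + ½|u|^{1/p} sign(u) applied separately to real and imaginary parts is one common convention; the function names, the parameter values, and the elementwise complex treatment are illustrative assumptions, not the paper's exact NRNN-SBP specification.

```python
import numpy as np

def sbp(e, p=0.5):
    """Sign-bi-power activation applied elementwise to a complex error.

    Acts on real and imaginary parts separately (one common convention
    for complex-valued ZNNs; an assumption here, not taken from the paper).
    """
    def f(x):
        return 0.5 * np.abs(x) ** p * np.sign(x) + 0.5 * np.abs(x) ** (1.0 / p) * np.sign(x)
    return f(e.real) + 1j * f(e.imag)

def znn_solve(A, b, gamma=10.0, p=0.5, dt=1e-3, steps=5000):
    """Euler-integrate the ZNN dynamics dx/dt = -gamma * A^{-1} Phi(Ax - b)."""
    n = A.shape[0]
    x = np.zeros(n, dtype=complex)           # start from the zero state
    A_inv = np.linalg.inv(A)                  # assumes A is nonsingular
    for _ in range(steps):
        e = A @ x - b                         # ZNN error function e(t)
        x = x - dt * gamma * (A_inv @ sbp(e, p))
    return x

# Illustrative 2x2 complex system (values chosen arbitrarily).
A = np.array([[2 + 1j, 1 - 1j],
              [0.5j,   1 + 0j]])
b = np.array([1 + 1j, 2 - 1j])
x = znn_solve(A, b)
residual = np.linalg.norm(A @ x - b)
```

With a fixed Euler step, the finite-time behavior near e = 0 is stiff, so the discrete residual settles at a small nonzero floor set by dt and γ; smaller steps (or a stiff ODE integrator) tighten it further.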
