TY - JOUR
T1 - On complete stability of recurrent neural networks with time-varying delays and general piecewise linear activation functions
AU - Liu, Peng
AU - Zheng, Wei Xing
AU - Zeng, Zhigang
PY - 2020
Y1 - 2020
N2 - This paper addresses the problem of complete stability of delayed recurrent neural networks with a general class of piecewise linear activation functions. By applying an appropriate partition of the state space and iterating the defined bounding functions, sufficient conditions are obtained to ensure that an n-neuron neural network is completely stable with exactly ∏_{i=1}^{n}(2K_i − 1) equilibrium points, among which ∏_{i=1}^{n} K_i equilibrium points are locally exponentially stable and the others are unstable, where the K_i (i = 1, …, n) are non-negative integers that depend jointly on the activation functions and the parameters of the neural network. The results of this paper include existing works on the stability analysis of recurrent neural networks with piecewise linear activation functions as special cases, and can therefore be regarded as an improvement and extension of the existing stability results in the literature. A numerical example is provided to illustrate the derived theoretical results.
AB - This paper addresses the problem of complete stability of delayed recurrent neural networks with a general class of piecewise linear activation functions. By applying an appropriate partition of the state space and iterating the defined bounding functions, sufficient conditions are obtained to ensure that an n-neuron neural network is completely stable with exactly ∏_{i=1}^{n}(2K_i − 1) equilibrium points, among which ∏_{i=1}^{n} K_i equilibrium points are locally exponentially stable and the others are unstable, where the K_i (i = 1, …, n) are non-negative integers that depend jointly on the activation functions and the parameters of the neural network. The results of this paper include existing works on the stability analysis of recurrent neural networks with piecewise linear activation functions as special cases, and can therefore be regarded as an improvement and extension of the existing stability results in the literature. A numerical example is provided to illustrate the derived theoretical results.
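N1 - Illustrative arithmetic (an added example, not taken from the paper): for n = 2 neurons with K_1 = K_2 = 2, the stated counting formula gives ∏_{i=1}^{2}(2K_i − 1) = 3 × 3 = 9 equilibrium points, of which ∏_{i=1}^{2} K_i = 2 × 2 = 4 are locally exponentially stable and the remaining 5 are unstable.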
KW - design
KW - neural networks (computer science)
UR - http://hdl.handle.net/1959.7/uws:50967
U2 - 10.1109/TCYB.2018.2884836
DO - 10.1109/TCYB.2018.2884836
M3 - Article
SN - 2168-2267
VL - 50
SP - 2249
EP - 2263
JO - IEEE Transactions on Cybernetics
JF - IEEE Transactions on Cybernetics
IS - 5
ER -