On complete stability of recurrent neural networks with time-varying delays and general piecewise linear activation functions

Peng Liu, Wei Xing Zheng, Zhigang Zeng

Research output: Contribution to journal › Article › peer-review

12 Citations (Scopus)

Abstract

This paper addresses the problem of complete stability of delayed recurrent neural networks with a general class of piecewise linear activation functions. By applying an appropriate partition of the state space and iterating the defined bounding functions, sufficient conditions are obtained to ensure that an n-neuron neural network is completely stable with exactly ∏ⁿᵢ₌₁(2Kᵢ − 1) equilibrium points, among which ∏ⁿᵢ₌₁ Kᵢ equilibrium points are locally exponentially stable and the others are unstable, where Kᵢ (i = 1, …, n) are non-negative integers that depend jointly on the activation functions and the parameters of the neural network. The results of this paper subsume existing work on the stability analysis of recurrent neural networks with piecewise linear activation functions as special cases, and hence improve and extend the existing stability results in the literature. A numerical example is provided to illustrate the derived theoretical results.
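The equilibrium-counting formulas in the abstract are simple products over the per-neuron integers Kᵢ. A minimal sketch (the function name and example values of Kᵢ are illustrative, not from the paper) of how the totals combine:

```python
from math import prod

def equilibrium_counts(K):
    """Given the per-neuron integers K_i for an n-neuron network,
    return (total equilibria, locally exponentially stable equilibria)
    according to the counting formulas stated in the abstract:
    total = prod(2*K_i - 1), stable = prod(K_i)."""
    total = prod(2 * k - 1 for k in K)   # ∏ (2K_i − 1)
    stable = prod(K)                     # ∏ K_i
    return total, stable

# Hypothetical 2-neuron example with K = (2, 3):
# total = (2*2 − 1) * (2*3 − 1) = 3 * 5 = 15, stable = 2 * 3 = 6
print(equilibrium_counts([2, 3]))  # (15, 6)
```

The remaining total − stable equilibria (9 in this example) are the unstable ones.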
Original language: English
Pages (from-to): 2249-2263
Number of pages: 15
Journal: IEEE Transactions on Cybernetics
Volume: 50
Issue number: 5
DOIs
Publication status: Published - 2020

Keywords

  • design
  • neural networks (computer science)
