Delay-dependent exponential stability of recurrent neural networks with Markovian jumping parameters and proportional delays

Cited by: 11
Authors
Zhou, Liqun [1 ]
Affiliation
[1] Tianjin Normal Univ, Sch Math Sci, Tianjin 300387, Peoples R China
Funding
U.S. National Science Foundation;
Keywords
Recurrent neural networks (RNNs); Proportional delays; Markovian jumping parameters; Global exponential stability; Linear matrix inequality (LMI); TIME-VARYING DELAYS; GLOBAL ASYMPTOTIC STABILITY; VARIABLE-COEFFICIENTS; DISTRIBUTED DELAYS; LMI APPROACH; SYNCHRONIZATION; EQUATIONS; DISCRETE; CRITERIA;
DOI
10.1007/s00521-016-2370-0
Chinese Library Classification
TP18 [Artificial intelligence theory];
Subject classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
This paper deals with the global exponential stability problem of a class of recurrent neural networks with Markovian jumping parameters and proportional delays. Here the proportional delay is an unbounded time-varying delay, which differs from an unbounded distributed delay. The nonlinear transformation z(t) = x(e^t) transforms the recurrent neural networks with Markovian jumping parameters and proportional delays into recurrent neural networks with Markovian jumping parameters, constant delays and variable coefficients. By constructing a Lyapunov functional, a linear matrix inequality (LMI) approach is developed to establish a new delay-dependent sufficient condition for global exponential stability in the mean square. The condition depends on the size of the proportional delay factor, can be checked easily with the numerically efficient MATLAB LMI toolbox, and requires no tuning of parameters. Two numerical examples and their simulations are given to illustrate the effectiveness of the obtained results.
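For intuition, the following is a minimal scalar sketch (not taken from the paper) of how such a change of variables typically removes a proportional delay; the system, the coefficients a, b, the activation f, and the proportional factor q in (0, 1) are assumed for illustration:
\[
\dot{x}(t) = -a\,x(t) + b\,f\bigl(x(qt)\bigr), \qquad t \ge 1,\quad 0 < q < 1 .
\]
Setting \(y(t) = x(e^{t})\) gives, by the chain rule,
\[
\dot{y}(t) = e^{t}\,\dot{x}(e^{t}) = e^{t}\bigl[-a\,y(t) + b\,f\bigl(y(t-\tau)\bigr)\bigr], \qquad \tau = -\ln q > 0,
\]
since \(x(q e^{t}) = x(e^{t-\tau}) = y(t-\tau)\). The proportional delay qt thus becomes the constant delay \(\tau = -\ln q\), at the price of the time-varying coefficient \(e^{t}\), which is consistent with the "constant delays and variable coefficients" form described in the abstract.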
Pages: S765-S773
Number of pages: 9