Exponential stability of delayed recurrent neural networks with Markovian jumping parameters

Cited by: 322
Authors
Wang, Zidong [1 ]
Liu, Yurong
Yu, Li
Liu, Xiaohui
Affiliations
[1] Brunel Univ, Dept Informat Syst & Comp, Uxbridge UB8 3PH, Middx, England
[2] Yangzhou Univ, Dept Math, Yangzhou 225002, Peoples R China
[3] Zhejiang Univ Technol, Coll Informat Engn, Hangzhou 310014, Peoples R China
Funding
UK Engineering and Physical Sciences Research Council (EPSRC);
Keywords
recurrent neural networks; Markovian jumping parameters; time delays; stochastic systems; global exponential stability in the mean square; linear matrix inequality;
DOI
10.1016/j.physleta.2006.03.078
CLC classification
O4 [Physics];
Discipline code
0702;
Abstract
In this Letter, the global exponential stability analysis problem is considered for a class of recurrent neural networks (RNNs) with time delays and Markovian jumping parameters. The jumping parameters are generated by a continuous-time, discrete-state homogeneous Markov process with a finite state space. The purpose of the problem addressed is to derive easy-to-test conditions under which the dynamics of the neural network is stochastically exponentially stable in the mean square, independent of the time delay. By employing a new Lyapunov-Krasovskii functional, a linear matrix inequality (LMI) approach is developed to establish the desired sufficient conditions; the global exponential stability in the mean square of the delayed RNNs can therefore be checked with the numerically efficient Matlab LMI toolbox, with no tuning of parameters required. A numerical example is given to show the usefulness of the derived LMI-based stability conditions. (c) 2006 Elsevier B.V. All rights reserved.
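The LMI-based test described in the abstract generalizes the classical Lyapunov stability condition. As a minimal, hypothetical sketch (not the paper's delayed, Markovian-jump LMI, and using SciPy rather than the Matlab LMI toolbox the paper employs): for a delay-free linear system x' = Ax with an assumed example matrix A, exponential stability is equivalent to the existence of a symmetric positive definite P satisfying the LMI A^T P + P A < 0, which can be verified by solving the Lyapunov equation A^T P + P A = -Q for a chosen Q > 0 and checking that P is positive definite:

```python
import numpy as np
from scipy.linalg import solve_lyapunov  # solves A X + X A^T = Q

# Assumed example: a 2x2 Hurwitz matrix (all eigenvalues in the open left half-plane)
A = np.array([[-2.0, 0.5],
              [0.3, -3.0]])
Q = np.eye(2)  # any symmetric positive definite choice works

# Solve the continuous-time Lyapunov equation A^T P + P A = -Q
P = solve_lyapunov(A.T, -Q)

# Exponential stability holds iff the solution P is symmetric positive definite
eigs = np.linalg.eigvalsh((P + P.T) / 2)  # symmetrize against round-off
print(eigs.min() > 0)  # True, since A is Hurwitz
```

The paper's actual conditions are richer: one coupled LMI per Markov mode, with extra terms accounting for the time delay and the mode transition rates, but the feasibility-check workflow is the same in spirit.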
Pages: 346-352
Number of pages: 7