Exponential convergence for high-order recurrent neural networks with a class of general activation functions

Cited by: 12
Authors
Zhang, Hong [1 ]
Wang, Wentao [2 ]
Xiao, Bing [1 ]
Affiliations
[1] Hunan Univ Arts & Sci, Dept Math, Changde 415000, Hunan, Peoples R China
[2] Jiaxing Univ, Coll Math & Informat Sci, Jiaxing 314001, Zhejiang, Peoples R China
Keywords
High-order recurrent neural networks; Exponential convergence; Delays; General activation functions; LMI-BASED CRITERIA; STABILITY; BEHAVIOR; DELAYS;
DOI
10.1016/j.apm.2010.05.011
CLC (Chinese Library Classification) number
T [Industrial Technology];
Subject classification code
08;
Abstract
In this paper, we consider high-order recurrent neural networks with a class of general activation functions. Using mathematical analysis techniques, we establish new results ensuring that all solutions of the networks converge exponentially to the zero point. (C) 2010 Elsevier Inc. All rights reserved.
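For orientation, a hedged sketch of a commonly studied class of delayed high-order recurrent neural networks is given below; the specific system, coefficients, and delay structure analyzed in the paper may differ.

\[
x_i'(t) = -c_i x_i(t)
  + \sum_{j=1}^{n} a_{ij}\, g_j\bigl(x_j(t-\tau_{ij}(t))\bigr)
  + \sum_{j=1}^{n}\sum_{l=1}^{n} b_{ijl}\, g_j\bigl(x_j(t-\sigma_{ijl}(t))\bigr)\, g_l\bigl(x_l(t-\nu_{ijl}(t))\bigr)
  + I_i(t), \qquad i = 1, \dots, n,
\]

where \(c_i > 0\) are decay rates, \(a_{ij}\) and \(b_{ijl}\) are first- and second-order connection weights, \(g_j\) are the (general) activation functions, \(\tau_{ij}(t)\), \(\sigma_{ijl}(t)\), \(\nu_{ijl}(t)\) are bounded time-varying delays, and \(I_i(t)\) are external inputs. Exponential convergence to the zero point means there exist constants \(M \ge 1\) and \(\lambda > 0\) such that \(|x_i(t)| \le M e^{-\lambda t}\) for every \(i\) and all \(t \ge 0\).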
Pages: 123-129
Number of pages: 7