A note on exponential convergence of neural networks with unbounded distributed delays

Cited by: 7
Authors
Chu, Tianguang [1 ]
Yang, Haifeng [1 ]
Affiliations
[1] Peking Univ, Dept Mech & Engn Sci, Ctr Syst & Control, Intelligent Control Lab, Beijing 100871, Peoples R China
Keywords
DOI
10.1016/j.chaos.2006.04.033
Chinese Library Classification (CLC)
O1 [Mathematics];
Discipline codes
0701 ; 070101 ;
Abstract
This note examines issues concerning the global exponential convergence of neural networks with unbounded distributed delays. Sufficient conditions are derived by exploiting the exponentially fading memory property of the delay kernel functions. The method is based on the comparison principle for delay differential equations and does not require the construction of any Lyapunov functionals. It is simple yet effective in deriving less conservative exponential convergence conditions and more detailed componentwise decay estimates. The results of this note and [Chu T. An exponential convergence estimate for analog neural networks with delay. Phys Lett A 2001;283:113-8] suggest a class of neural networks whose globally exponentially convergent dynamics is completely insensitive to a wide range of time delays, from arbitrary bounded discrete type to certain unbounded distributed type. This is of practical interest in designing fast and reliable neural circuits. Finally, an open question is raised on the nature of delay kernels for attaining exponential convergence in an unbounded distributed delayed neural network. (c) 2006 Elsevier Ltd. All rights reserved.
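The abstract's setting can be illustrated with a minimal sketch (not taken from the paper; all parameter values here are hypothetical). Consider a scalar neural unit with an unbounded distributed delay under an exponentially fading memory kernel k(s) = λe^{-λs}. For this particular kernel, the "linear chain trick" converts the delay integral into an auxiliary ODE, so the system can be simulated without storing the full history:

```python
import math

# Hypothetical scalar model with an exponentially fading memory kernel:
#   x'(t) = -a*x(t) + b*tanh(z(t)),
#   z(t)  = integral_0^inf lam*exp(-lam*s) * x(t - s) ds.
# For this exponential kernel, the linear chain trick gives the equivalent ODE
#   z'(t) = lam * (x(t) - z(t)),
# which we integrate with forward Euler.

def simulate(a=2.0, b=0.5, lam=1.0, x0=1.0, T=20.0, dt=1e-3):
    # Constant history x(s) = x0 for s <= 0 implies z(0) = x0.
    x, z = x0, x0
    for _ in range(int(T / dt)):
        dx = -a * x + b * math.tanh(z)
        dz = lam * (x - z)
        x += dt * dx
        z += dt * dz
    return x

# With a > |b| (tanh has global Lipschitz constant 1), the origin is expected
# to be globally exponentially stable regardless of the kernel's spread,
# consistent with the delay-insensitivity discussed in the abstract.
print(abs(simulate()))
```

The decay of |x(T)| toward zero for any initial history is the kind of exponential convergence the note's conditions guarantee; the comparison-principle argument itself yields componentwise decay estimates rather than a single Lyapunov bound.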
Pages: 1538-1545
Page count: 8