Centralized and decentralized global outer-synchronization of asymmetric recurrent time-varying neural network by data-sampling

Cited by: 13
Authors
Lu, Wenlian [1 ,2 ,3 ]
Zheng, Ren [3 ]
Chen, Tianping [3 ,4 ]
Affiliations
[1] JinLing Hosp Nanjing, Dept Radiol, Nanjing, Jiangsu, Peoples R China
[2] Fudan Univ, Jinling Hosp, Computat Translat Med Ctr, Ctr Computat Syst Biol, Shanghai 200433, Peoples R China
[3] Fudan Univ, Sch Math Sci, Shanghai 200433, Peoples R China
[4] Fudan Univ, Sch Comp Sci, Shanghai 200433, Peoples R China
Keywords
Outer-synchronization; Data sampling; Centralized principle; Decentralized principle; Recurrent neural networks; ABSOLUTE STABILITY; STATE ESTIMATION; EXPONENTIAL SYNCHRONIZATION; MULTIAGENT SYSTEMS; GRADED RESPONSE; STABILIZATION; CONVERGENCE; CONSENSUS
DOI
10.1016/j.neunet.2015.11.006
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline classification code
081104; 0812; 0835; 1405
Abstract
In this paper, we discuss outer-synchronization of asymmetrically connected recurrent time-varying neural networks. Using both centralized and decentralized discretization data-sampling principles, we derive several sufficient conditions, based on three vector norms, guaranteeing that the difference of any two trajectories starting from different initial values of the neural network converges to zero. The lower bounds of the common time intervals between data samples under the centralized and decentralized principles are proved to be positive, which excludes Zeno behavior. A numerical example is provided to illustrate the effectiveness of the theoretical results. (C) 2015 Elsevier Ltd. All rights reserved.
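As a rough illustration of the sampled-data idea described in the abstract (this is not the paper's actual coupling scheme, conditions, or parameter choices), the following Python sketch drives one recurrent time-varying network toward a second one through a feedback term that is only refreshed at sampling instants and held constant in between (zero-order hold); the network size, weights, gain k and sampling period h are arbitrary assumptions made for the sketch.

import numpy as np

# Two identical time-varying recurrent networks x' = -D x + A f(x) + I(t),
# started from different initial values. Network x receives a sampled-data
# coupling u = -k (x_s - y_s), where x_s, y_s are the states recorded at the
# most recent sampling instant (centralized sampling: all states at once).
rng = np.random.default_rng(0)
n = 3                                    # number of neurons (assumed)
D = np.diag([1.0, 1.2, 0.9])             # self-decay rates (assumed)
A = rng.uniform(-0.5, 0.5, (n, n))       # asymmetric connection weights (assumed)
f = np.tanh                              # activation function
I = lambda t: 0.2 * np.sin(t) * np.ones(n)   # time-varying external input

k = 2.0          # sampled-data feedback gain (assumed large enough)
h = 0.05         # sampling period, common to all nodes
dt = 1e-3        # Euler integration step
T = 20.0

x = rng.uniform(-1, 1, n)                # trajectory 1
y = rng.uniform(-1, 1, n)                # trajectory 2 (different initial value)
x_s, y_s = x.copy(), y.copy()            # states held since the last sample

t, next_sample = 0.0, 0.0
while t < T:
    if t >= next_sample:                 # take a new sample of both states
        x_s, y_s = x.copy(), y.copy()
        next_sample += h
    u = -k * (x_s - y_s)                 # zero-order-hold coupling input
    x = x + dt * (-D @ x + A @ f(x) + I(t) + u)
    y = y + dt * (-D @ y + A @ f(y) + I(t))
    t += dt

print("final trajectory difference:", np.linalg.norm(x - y))

With a sufficiently large gain and a sufficiently small sampling period, the printed difference is close to zero, which is the qualitative behavior (outer-synchronization under data sampling) that the paper's sufficient conditions make precise.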
Pages: 22-31
Page count: 10