Robust local stability of multilayer recurrent neural networks

Citations: 40
Authors
Suykens, JAK [1 ]
De Moor, B [1 ]
Vandewalle, J [1 ]
Affiliation
[1] Katholieke Univ Leuven, Dept Elect Engn, ESAT SISTA, B-3001 Heverlee, Belgium
Source
IEEE TRANSACTIONS ON NEURAL NETWORKS | 2000 / Vol. 11 / No. 1
Keywords
dynamic backpropagation; linearization; matrix inequalities; NLq theory
DOI
10.1109/72.822525
Chinese Library Classification: TP18 [Artificial Intelligence Theory]
Discipline Classification Codes: 081104; 0812; 0835; 1405
Abstract
In this paper we derive a condition for robust local stability of multilayer recurrent neural networks with two hidden layers. The stability condition follows from linking theory on linearization, robustness analysis of linear systems under nonlinear perturbation, and matrix inequalities. A characterization of the basin of attraction of the origin is given in terms of a level set of a quadratic Lyapunov function. As in NLq theory, local stability is imposed around the origin and the apparent basin of attraction is enlarged by applying the criterion, while the proven basin of attraction remains relatively small due to the conservatism of the criterion. Modifying dynamic backpropagation with the new stability condition is discussed and illustrated by simulation examples.
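The linearization-plus-Lyapunov argument summarized in the abstract can be illustrated numerically. The sketch below assumes a discrete-time two-hidden-layer recurrent model of the form x_{k+1} = W2 tanh(W1 tanh(V x_k)) — an illustrative parameterization, not necessarily the paper's exact one. Since tanh'(0) = 1, the Jacobian at the origin is A = W2 W1 V; if its spectral radius is below one, a quadratic Lyapunov function V(x) = x^T P x is obtained from the discrete Lyapunov equation, and its level sets characterize an (inner) estimate of the basin of attraction:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

rng = np.random.default_rng(0)
n = 3

def scaled(shape, s=0.7):
    """Random matrix rescaled to spectral norm s, keeping the product contractive."""
    M = rng.standard_normal(shape)
    return s * M / np.linalg.norm(M, 2)

# Hypothetical weights of a two-hidden-layer recurrent model
#   x_{k+1} = W2 @ tanh(W1 @ tanh(V @ x_k))
V, W1, W2 = scaled((n, n)), scaled((n, n)), scaled((n, n))

# Linearization at the origin: tanh'(0) = 1, so the Jacobian is W2 W1 V
A = W2 @ W1 @ V
rho = max(abs(np.linalg.eigvals(A)))
print(f"spectral radius of linearization: {rho:.4f}")  # < 1 => local stability

# Quadratic Lyapunov function V(x) = x^T P x from A^T P A - P = -I
P = solve_discrete_lyapunov(A.T, np.eye(n))

# V decreases along a trajectory started inside a small level set of P
x = 0.1 * np.ones(n)
for _ in range(5):
    x_next = W2 @ np.tanh(W1 @ np.tanh(V @ x))
    assert x_next @ P @ x_next < x @ P @ x  # Lyapunov decrease near the origin
    x = x_next
```

The conservatism noted in the abstract shows up here as well: the Lyapunov level set certifies only a neighborhood of the origin, typically much smaller than the basin of attraction observed in simulation.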
Pages: 222-229
Page count: 8