Robust stability of recurrent neural networks with ISS learning algorithm

Cited: 17
Authors
Ahn, Choon Ki [1 ]
Affiliations
[1] Seoul Natl Univ Sci & Technol, Dept Automot Engn, Seoul, South Korea
Keywords
Input-to-state stability (ISS) approach; Weight learning algorithm; Dynamic neural networks; Linear matrix inequality (LMI); TO-STATE STABILITY; SMALL-GAIN THEOREM; ABSOLUTE STABILITY; SUFFICIENT CONDITION; INPUT; SYSTEMS; STABILIZATION; IDENTIFICATION;
DOI
10.1007/s11071-010-9901-5
CLC classification
TH [Machinery and Instrument Industry];
Discipline code
0802;
Abstract
In this paper, an input-to-state stability (ISS) approach is used to derive a new robust weight learning algorithm for dynamic neural networks subject to external disturbance. Based on a linear matrix inequality (LMI) formulation, the ISS learning algorithm not only guarantees exponential stability but also attenuates the effect of the external disturbance. It is shown that the design of the ISS learning algorithm reduces to solving an LMI, which can be done readily with standard numerical packages. A numerical example demonstrates the validity of the proposed learning algorithm.
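The LMI-based design described in the abstract can be illustrated with the simplest such feasibility problem: a linear system x' = Ax is exponentially stable iff there exists P ≻ 0 satisfying the Lyapunov LMI AᵀP + PA ≺ 0. A minimal sketch, using an illustrative matrix A (not taken from the paper) and SciPy's Lyapunov solver in place of a general-purpose LMI package:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical state matrix of a linearized network
# (illustrative values, not from the paper).
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
Q = np.eye(2)  # any positive-definite right-hand side

# solve_continuous_lyapunov(a, q) solves a @ X + X @ a.T = q,
# so passing A.T and -Q yields P with A.T @ P + P @ A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

# Feasibility check: P must be symmetric positive definite, which
# certifies exponential stability via V(x) = x.T @ P @ x.
eigs = np.linalg.eigvalsh((P + P.T) / 2)
print("min eigenvalue of P:", eigs.min())                 # positive -> LMI feasible
print("residual:", np.linalg.norm(A.T @ P + P @ A + Q))   # ~0
```

A full ISS design as in the paper would add disturbance-attenuation terms to the LMI, which requires a semidefinite-programming solver rather than a plain Lyapunov equation; this sketch only shows the stability-certificate step.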
Pages: 413-419 (7 pages)
Related papers (50 total)
  • [21] Stable Learning Algorithm Using Reducibility for Recurrent Neural Networks
    Satoh, Seiya
    [J]. ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2023, PT VI, 2023, 14259 : 127 - 139
  • [22] Efficient and Robust Supervised Learning Algorithm for Spiking Neural Networks
    Zhang Y.
    Geng T.
    Zhang M.
    Wu X.
    Zhou J.
    Qu H.
    [J]. Sensing and Imaging, 2018, 19 (1):
  • [23] Stability of Recurrent Neural Networks
    Jalab, Hamid A.
    Ibrahim, Rabha W.
    [J]. INTERNATIONAL JOURNAL OF COMPUTER SCIENCE AND NETWORK SECURITY, 2006, 6 (12): : 159 - 164
  • [24] Robust Exponential Stability of Recurrent Neural Networks with Deviating Argument and Stochastic Disturbance
    Zha Mingxin
    Si Wenxiao
    Xie Tao
    [J]. EUROPEAN JOURNAL OF PURE AND APPLIED MATHEMATICS, 2020, 13 (04): : 794 - 806
  • [25] LEARNING IN RECURRENT NEURAL NETWORKS
    WHITE, H
    [J]. MATHEMATICAL SOCIAL SCIENCES, 1991, 22 (01) : 102 - 103
  • [26] Structure and parameter learning algorithm of Jordan type recurrent neural networks
    Huang, Tung-Yung
    Li, C. James
    Hsu, Ting-Wei
    [J]. 2007 IEEE INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOLS 1-6, 2007, : 1819 - +
  • [27] An accelerating learning algorithm for block-diagonal recurrent neural networks
    Mastorocostas, Paris
    Varsamis, Dimitris
    Mastorocostas, Constantinos
    Rekanos, Ioannis
    [J]. INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE FOR MODELLING, CONTROL & AUTOMATION JOINTLY WITH INTERNATIONAL CONFERENCE ON INTELLIGENT AGENTS, WEB TECHNOLOGIES & INTERNET COMMERCE, VOL 2, PROCEEDINGS, 2006, : 403 - +
  • [28] Resilient back propagation learning algorithm for recurrent fuzzy neural networks
    Mastorocostas, PA
    [J]. ELECTRONICS LETTERS, 2004, 40 (01) : 57 - 58
  • [29] A REAL-TIME LEARNING ALGORITHM FOR RECURRENT ANALOG NEURAL NETWORKS
    SATO, M
    [J]. BIOLOGICAL CYBERNETICS, 1990, 62 (03) : 237 - 241
  • [30] Robust learning algorithm for multiplicative neuron model artificial neural networks
    Bas, Eren
    Uslu, Vedide Rezan
    Egrioglu, Erol
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2016, 56 : 80 - 88