Convergence and robustness of bounded recurrent neural networks for solving dynamic Lyapunov equations

Cited by: 32
|
Authors
Wang, Guancheng [1 ,2 ]
Hao, Zhihao [1 ]
Zhang, Bob [1 ]
Jin, Long [3 ]
Affiliations
[1] Univ Macau, Dept Comp & Informat Sci, Taipa 999078, Macau, Peoples R China
[2] Guangdong Ocean Univ, Coll Elect & Informat Engn, Zhanjiang 524088, Peoples R China
[3] Chinese Acad Sci, Chongqing Inst Green & Intelligent Technol, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing 400714, Peoples R China
Funding
National Natural Science Foundation of China; Academy of Finland;
Keywords
Recurrent neural network; Dynamic Lyapunov equations; Bounded activation functions; Finite-time convergence; Robustness; Sylvester equation; Models;
DOI
10.1016/j.ins.2021.12.039
Chinese Library Classification
TP [Automation technology; computer technology];
Discipline code
0812;
Abstract
Recurrent neural networks have been reported as an effective approach to solving dynamic Lyapunov equations, which arise widely across application fields. Considering that a bounded activation function must be imposed on recurrent neural networks to solve the dynamic Lyapunov equation in certain situations, a novel bounded recurrent neural network is defined in this paper. Following the definition, several bounded activation functions are proposed, and two of them are used to construct the bounded recurrent neural network for demonstration: one activation function yields a finite-time convergence property, while the other achieves robustness against noise. Moreover, theoretical analyses provide rigorous and detailed proofs of these superior properties. Finally, extensive simulation results, including comparative numerical simulations and two application examples, verify the effectiveness and feasibility of the proposed bounded recurrent neural network. (c) 2021 Elsevier Inc. All rights reserved.
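The recurrent-network approach the abstract describes can be illustrated with a standard zeroing neural network (ZNN) scheme: define the residual E = A^T X + X A + Q of the Lyapunov equation and evolve X so that E decays under a bounded activation. The sketch below is a generic illustration with the tanh activation and constant A and Q, not the paper's specific bounded activation functions or its time-varying setting; the function name and parameter values are illustrative assumptions.

```python
import numpy as np

def bounded_znn_lyapunov(A, Q, gamma=5.0, dt=1e-3, steps=4000):
    """Sketch of a zeroing neural network for A^T X + X A + Q = 0.

    The residual E = A^T X + X A + Q is driven to zero via the design
    dE/dt = -gamma * phi(E), where phi = tanh is a bounded activation.
    Constant A, Q are assumed for simplicity; the paper treats the
    time-varying (dynamic) case.
    """
    n = A.shape[0]
    I = np.eye(n)
    # vec(A^T X + X A) = (I (x) A^T + A^T (x) I) vec(X), column-major vec.
    M = np.kron(I, A.T) + np.kron(A.T, I)
    X = np.zeros((n, n))
    for _ in range(steps):
        E = A.T @ X + X @ A + Q
        # M vec(dX/dt) = -gamma * vec(tanh(E)); solve for the update.
        rhs = -gamma * np.tanh(E).reshape(-1, order="F")
        dx = np.linalg.solve(M, rhs)
        X = X + dt * dx.reshape(n, n, order="F")  # Euler integration step
    return X

# Example: a Hurwitz (stable) A gives a unique solution; the residual
# norm indicates how well the network has converged.
A = np.array([[-3.0, 1.0], [0.0, -2.0]])
Q = np.eye(2)
X = bounded_znn_lyapunov(A, Q)
residual = np.linalg.norm(A.T @ X + X @ A + Q)
```

Because the update is solved exactly from the Kronecker-vectorized system, the residual obeys dE/dt = -gamma * tanh(E) elementwise and decays monotonically; the bounded activation caps the update magnitude, which is the motivation for bounded designs in hardware-constrained settings.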
Pages: 106-123 (18 pages)
Related papers (50 records)
  • [11] Load Balancing with Bounded Convergence in Dynamic Networks
    Dinitz, Michael
    Fineman, Jeremy
    Gilbert, Seth
    Newport, Calvin
    IEEE INFOCOM 2017 - IEEE CONFERENCE ON COMPUTER COMMUNICATIONS, 2017,
  • [12] POPQORN: Quantifying Robustness of Recurrent Neural Networks
    Ko, Ching-Yun
    Lyu, Zhaoyang
    Weng, Tsui-Wei
    Daniel, Luca
    Wong, Ngai
    Lin, Dahua
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [14] Improved Recurrent Neural Networks for Text Classification and Dynamic Sylvester Equation Solving
    Chen, Weijie
    Jin, Jie
    Gerontitis, Dimitrios
    Qiu, Lixin
    Zhu, Jingcan
    NEURAL PROCESSING LETTERS, 2023, 55 (07) : 8755 - 8784
  • [15] RECURRENT NEURAL NETWORKS FOR SOLVING SYSTEMS OF COMPLEX-VALUED LINEAR-EQUATIONS
    WANG, J
    ELECTRONICS LETTERS, 1992, 28 (18) : 1751 - 1753
  • [16] Robust gradient-based neural networks for solving online the discrete periodic Lyapunov matrix equations
    Yin, Chang
    Zhang, Ying
    IET CONTROL THEORY AND APPLICATIONS, 2024, 18 (01): : 71 - 82
  • [17] On the differential equations of recurrent neural networks
    Aouiti, Chaouki
    Ghanmi, Boulbaba
    Miraoui, Mohsen
    INTERNATIONAL JOURNAL OF COMPUTER MATHEMATICS, 2021, 98 (07) : 1385 - 1407
  • [18] Convergence of diagonal recurrent neural networks' learning
    Wang, P
    Li, YF
    Feng, S
    Wei, W
    PROCEEDINGS OF THE 4TH WORLD CONGRESS ON INTELLIGENT CONTROL AND AUTOMATION, VOLS 1-4, 2002, : 2365 - 2369
  • [19] Distributed Solver for Discrete-Time Lyapunov Equations Over Dynamic Networks With Linear Convergence Rate
    Jiang, Xia
    Zeng, Xianlin
    Sun, Jian
    Chen, Jie
    IEEE TRANSACTIONS ON CYBERNETICS, 2022, 52 (02) : 937 - 946
  • [20] A CONVERGENCE RESULT FOR LEARNING IN RECURRENT NEURAL NETWORKS
    KUAN, CM
    HORNIK, K
    WHITE, H
    NEURAL COMPUTATION, 1994, 6 (03) : 420 - 440