Convergence and robustness of bounded recurrent neural networks for solving dynamic Lyapunov equations

Cited: 32
Authors
Wang, Guancheng [1 ,2 ]
Hao, Zhihao [1 ]
Zhang, Bob [1 ]
Jin, Long [3 ]
Affiliations
[1] Univ Macau, Dept Comp & Informat Sci, Taipa 999078, Macau, Peoples R China
[2] Guangdong Ocean Univ, Coll Elect & Informat Engn, Zhanjiang 524088, Peoples R China
[3] Chinese Acad Sci, Chongqing Inst Green & Intelligent Technol, Chongqing Key Lab Big Data & Intelligent Comp, Chongqing 400714, Peoples R China
Funding
National Natural Science Foundation of China; Academy of Finland;
Keywords
Recurrent neural network; Dynamic Lyapunov equations; Bounded activation functions; Finite-time convergence; Robustness; Sylvester equation; Models;
DOI
10.1016/j.ins.2021.12.039
Chinese Library Classification (CLC)
TP [Automation technology, computer technology];
Discipline Classification Code
0812;
Abstract
Recurrent neural networks have been reported as an effective approach to solving dynamic Lyapunov equations, which arise widely across application fields. Considering that a bounded activation function should be imposed on recurrent neural networks to solve the dynamic Lyapunov equation in certain situations, a novel bounded recurrent neural network is defined in this paper. Following the definition, several bounded activation functions are proposed, and two of them are used to construct the bounded recurrent neural network for demonstration, where one activation function yields a finite-time convergence property and the other achieves robustness against noise. Moreover, theoretical analyses provide rigorous and detailed proofs of these superior properties. Finally, extensive simulation results, including comparative numerical simulations and two application examples, are presented to verify the effectiveness and feasibility of the proposed bounded recurrent neural network. (c) 2021 Elsevier Inc. All rights reserved.
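To illustrate the general design idea summarized in the abstract, the following is a minimal, hedged Python sketch of a zeroing-type recurrent neural network for the dynamic Lyapunov equation A(t)^T X(t) + X(t) A(t) + Q(t) = 0 with a bounded activation. It uses tanh as a stand-in bounded activation (the paper proposes its own bounded activation functions with finite-time and noise-tolerant properties), and all matrices, gains, and step sizes below are illustrative assumptions rather than values from the paper.

# Hedged sketch (not the authors' exact model): a zeroing-type recurrent
# neural network with a bounded (tanh) activation for the dynamic Lyapunov
# equation  A(t)^T X(t) + X(t) A(t) + Q(t) = 0.
import numpy as np

def A(t):            # example time-varying coefficient matrix (assumption)
    return np.array([[3.0 + np.sin(t), 0.5 * np.cos(t)],
                     [-0.5 * np.cos(t), 3.0]])

def A_dot(t):        # its analytic time derivative
    return np.array([[np.cos(t), -0.5 * np.sin(t)],
                     [0.5 * np.sin(t), 0.0]])

Q = np.eye(2)        # constant symmetric Q(t), so dQ/dt = 0
gamma = 10.0         # design gain of the recurrent network (assumption)
dt = 1e-3            # Euler integration step
I = np.eye(2)

def lyap_error(X, t):
    return A(t).T @ X + X @ A(t) + Q

X = np.zeros((2, 2))                     # arbitrary initial state
for k in range(int(5.0 / dt)):
    t = k * dt
    E = lyap_error(X, t)
    # Impose dE/dt = -gamma * tanh(E): tanh is a bounded activation, so the
    # network's driving term stays within (-1, 1) element-wise.
    rhs = -A_dot(t).T @ X - X @ A_dot(t) - gamma * np.tanh(E)
    # Solve the implicit dynamics A^T X_dot + X_dot A = rhs via vectorization:
    # vec(A^T X_dot + X_dot A) = (I kron A^T + A^T kron I) vec(X_dot).
    M = np.kron(I, A(t).T) + np.kron(A(t).T, I)
    X_dot = np.linalg.solve(M, rhs.flatten(order="F")).reshape(2, 2, order="F")
    X = X + dt * X_dot

print("residual norm at t = 5:", np.linalg.norm(lyap_error(X, 5.0)))

Running the sketch, the residual norm should decay toward zero, mirroring the convergence behavior the paper establishes analytically for its bounded activation functions.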
Pages: 106-123
Number of pages: 18
Related Papers
50 records in total
  • [41] Turing Completeness of Bounded-Precision Recurrent Neural Networks
    Chung, Stephen
    Siegelmann, Hava
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021,
  • [42] A bounded exploration approach to constructive algorithms for recurrent neural networks
    Boné, R
    Crucianu, M
    Verley, G
    de Beauville, JPA
    IJCNN 2000: PROCEEDINGS OF THE IEEE-INNS-ENNS INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS, VOL III, 2000, : 27 - 32
  • [43] Recurrent neural networks training with optimal bounded ellipsoid algorithm
    Rubio, Jose de Jesus
    Yu, Wen
    2007 AMERICAN CONTROL CONFERENCE, VOLS 1-13, 2007, : 4093 - +
  • [44] An accelerated zeroing neural network for solving continuous coupled Lyapunov matrix equations
    Wang, Yurui
    Zhang, Ying
    IET CONTROL THEORY AND APPLICATIONS, 2024, 18 (11): 1414 - 1423
  • [45] Solving Newton's equations of motion with large timesteps using recurrent neural networks based operators
    Kadupitiya, J. C. S.
    Fox, Geoffrey C.
    Jadhao, Vikram
    MACHINE LEARNING-SCIENCE AND TECHNOLOGY, 2022, 3 (02):
  • [46] Recurrent neural networks employing Lyapunov exponents for analysis of ECG signals
    Ubeyli, Elif Derya
    EXPERT SYSTEMS WITH APPLICATIONS, 2010, 37 (02) : 1192 - 1199
  • [47] Recurrent neural networks for aerodynamic parameter estimation with Lyapunov stability analysis
    George, Sara Mohan
    Selvi, S. S.
    Raol, J. R.
    SYSTEMS SCIENCE & CONTROL ENGINEERING, 2024, 12 (01)
  • [48] Stability Theory by Lyapunov's First Method and Recurrent Neural Networks
    Dano, I.
    PHYSICS OF PARTICLES AND NUCLEI LETTERS, 2008, 5 (03) : 259 - 262
  • [49] Recurrent neural networks employing Lyapunov exponents for EEG signals classification
    Güler, NF
    Übeyli, ED
    Güler, I
    EXPERT SYSTEMS WITH APPLICATIONS, 2005, 29 (03) : 506 - 514
  • [50] Convergence analysis of neural networks for solving a free boundary problem
    Zhao, Xinyue Evelyn
    Hao, Wenrui
    Hu, Bei
    COMPUTERS & MATHEMATICS WITH APPLICATIONS, 2021, 93 : 144 - 155