A long-term memory enhanced echo state network and its optimization

Cited by: 0
Authors
Lun, Shuxian [1 ,4 ]
Cai, Jianning [2 ]
Hu, Bo [3 ]
Affiliations
[1] Bohai Univ, Sch Control Sci & Engn, Jinzhou, Peoples R China
[2] Bohai Univ, Sch Math Sci, Jinzhou, Peoples R China
[3] State Grid Dalian Power Supply, Dalian, Peoples R China
[4] Bohai Univ, Sch Control Sci & Engn, Jinzhou 121013, Peoples R China
Source
IET CONTROL THEORY AND APPLICATIONS | 2024, Vol. 18, Issue 16
Funding
National Natural Science Foundation of China
Keywords
ESN; LSTM; Optimization; Time series prediction; fuzzy control; nonlinear control systems; TIME-SERIES; NEURAL-NETWORKS; STABILITY ANALYSIS; CHAOTIC SYSTEMS; WIND-SPEED; PREDICTION;
DOI
10.1049/cth2.12591
CLC Classification Number
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
This paper proposes a new type of echo state network, the Long-Term Memory Enhanced Echo State Network (LTME-ESN). By extending the forget-gate and input-gate ideas of the LSTM, LTME-ESN modifies the state-update equation of the Leaky Integrator Echo State Network (Leaky-ESN). By regulating how information accumulates, long-term dependencies between states can be preserved, allowing the information carried from the previous state to the current one to be managed adaptively. To make the parameters trainable with Stochastic Gradient Descent (SGD), this work presents a necessary condition for LTME-ESN to satisfy the echo state property. Low-frequency sinusoidal, high-frequency sinusoidal, and chaotic time series are used to demonstrate the model's effectiveness, and simulations show that LTME-ESN outperforms Leaky-ESN in both prediction accuracy and variability. In summary, the study provides LTME-ESN, a novel and improved Leaky-ESN model whose basic concept is to update the reservoir neuron states by integrating the LSTM's input-gate and forget-gate mechanisms into the echo state network; across all simulation experiments, the LTME-ESN model achieves better prediction accuracy and lower volatility.
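The abstract does not give the exact gated update equations, but the core idea can be sketched: LSTM-style forget and input gates modulating a Leaky-ESN reservoir update. The following Python/NumPy snippet is a minimal sketch under stated assumptions; the gate matrices W_f and W_i, their sigmoid form, the leak rate, and the placement of the gates in the update are illustrative guesses, not the paper's published formulation.

    import numpy as np

    rng = np.random.default_rng(0)

    n_in, n_res = 1, 100          # input and reservoir sizes (illustrative)
    leak = 0.3                    # leaky-integrator rate (illustrative)

    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W_res = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))   # standard ESN spectral-radius scaling

    W_f = rng.uniform(-0.5, 0.5, (n_res, n_in + n_res))  # hypothetical forget-gate weights
    W_i = rng.uniform(-0.5, 0.5, (n_res, n_in + n_res))  # hypothetical input-gate weights

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def ltme_update(x, u):
        # One reservoir step: LSTM-style gates modulate the leaky-integrator update.
        z = np.concatenate([u, x])
        f = sigmoid(W_f @ z)                    # forget gate: how much old state to keep
        i = sigmoid(W_i @ z)                    # input gate: how much new activation to admit
        x_new = np.tanh(W_in @ u + W_res @ x)   # candidate state, as in a plain ESN
        return (1 - leak) * f * x + leak * i * x_new

    x = np.zeros(n_res)
    for t in range(200):                        # drive with a low-frequency sinusoid,
        u = np.array([np.sin(0.1 * t)])         # echoing the paper's first test signal
        x = ltme_update(x, u)

Scaling the reservoir matrix to a spectral radius below one is the usual Leaky-ESN heuristic; the paper instead derives a necessary condition on the gated update so that the echo state property is preserved while the parameters are tuned by SGD, which this sketch does not reproduce, and the trained readout layer is likewise omitted.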
Pages: 2116-2129
Number of pages: 14
Related Papers
50 records in total
  • [1] Multiresolution-based Echo State Network and its Application to the long-term Prediction of Network Traffic
    Ge, Qian
    Wei, Chengjian
    PROCEEDINGS OF THE 2008 INTERNATIONAL SYMPOSIUM ON COMPUTATIONAL INTELLIGENCE AND DESIGN, VOL 1, 2008, : 469 - 472
  • [2] Medium and Long-Term Fault Prediction of Avionics Based on Echo State Network
    Gao, Chi
    Li, Bin
    Dai, Zhen
    MOBILE INFORMATION SYSTEMS, 2022, 2022
  • [3] Evolutionary echo state network for long-term time series prediction: on the edge of chaos
    Zhang, Gege
    Zhang, Chao
    Zhang, WeiDong
    APPLIED INTELLIGENCE, 2020, 50 (03) : 893 - 904
  • [4] Evolutionary echo state network for long-term time series prediction: on the edge of chaos
    Gege Zhang
    Chao Zhang
    WeiDong Zhang
    Applied Intelligence, 2020, 50 : 893 - 904
  • [5] Optical memory based on the long-term photon echo phenomenon
    Akhmediev, N
    JOURNAL OF LUMINESCENCE, 1995, 66-7 (1-6) : 74 - 77
  • [6] Memory-Enhanced Evolutionary Robotics: The Echo State Network Approach
    Hartland, Cedric
    Bredeche, Nicolas
    Sebag, Michele
    2009 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION, VOLS 1-5, 2009, : 2788 - 2795
  • [7] A New PSOGSA Inspired Convolutional Echo State Network for Long-term Health Status Prediction
    Zhang, Gege
    Zhang, Chao
    Li, Zheqing
    Zhang, Weidong
    2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2018, : 1298 - 1303
  • [8] Delayed self-feedback echo state network for long-term dynamics of hyperchaotic systems
    Xu, Xu
    Liu, Jianming
    Li, Eric
    PHYSICAL REVIEW E, 2024, 109 (06)
  • [9] Long-term ENSO prediction with echo-state networks
    Hassanibesheli, Forough
    Kurths, Juergen
    Boers, Niklas
    ENVIRONMENTAL RESEARCH-CLIMATE, 2022, 1 (01):
  • [10] Enhanced declarative memory in long-term mindfulness practitioners
    Limor Shemesh
    Avi Mendelsohn
    Daniel Yochai Panitz
    Aviva Berkovich-Ohana
    Psychological Research, 2023, 87 : 294 - 307