A deep model for short-term load forecasting applying a stacked autoencoder based on LSTM supported by a multi-stage attention mechanism

Cited by: 41
Authors
Fazlipour, Zahra [1 ]
Mashhour, Elaheh [1 ]
Joorabian, Mahmood [1 ]
Affiliations
[1] Shahid Chamran Univ Ahvaz, Fac Engn, Dept Elect Engn, Golestan Ave, POB 6135743337, Ahvaz, Iran
Keywords
Deep learning; Attention mechanism; Short-term load forecasting; LSTM; Stacked autoencoder; NEURAL-NETWORK; IMPACT; PREDICTION; BUILDINGS; ALGORITHM; SELECTION; WAVELET; POWER
DOI
10.1016/j.apenergy.2022.120063
CLC classification
TE [Petroleum and Natural Gas Industry]; TK [Energy and Power Engineering]
Subject classification
0807; 0820
Abstract
This paper presents an innovative univariate Deep LSTM-based Stacked Autoencoder (DLSTM-SAE) model for short-term load forecasting, equipped with a Multi-Stage Attention Mechanism (MSAM) comprising an input AM and several temporal AMs in the pre-training phase. The input AM captures the high-impact time steps of the univariate load sequence. Although increasing the network depth improves the model's performance, finding the optimal network parameters is challenging because the initial weights are assigned randomly. An unsupervised greedy layer-wise pre-training structure equipped with the MSAM is developed to address the random-initial-weight problem of the DLSTM-SAE model. The multi-stage temporal AM in the pre-training structure leads the DLSTM-SAE to properly learn the time dependencies in remarkably long input sequences and to capture the salient temporal features held in the LSTM memory. The performance of the proposed model is evaluated through comparative tests against current prevalent models on actual energy market data from the New England ISO, using three evaluation indices. The results show the superiority of the proposed model and its robustness in both offline and online load forecasting.
Pages: 16
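To make the architecture described in the abstract concrete, the following is a minimal sketch of an LSTM autoencoder stage with a temporal attention layer and unsupervised greedy layer-wise pre-training, written in PyTorch. The layer sizes, the attention form (a single linear scoring layer with a softmax over time steps), and the training loop are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: one stacked-autoencoder stage (LSTM encoder +
# temporal attention + LSTM decoder) and a greedy layer-wise pre-training
# loop, loosely following the abstract. All names and sizes are assumed.
import torch
import torch.nn as nn


class AttentiveLSTMAutoencoderLayer(nn.Module):
    """One stage: LSTM encoder, attention over encoder time steps,
    and an LSTM decoder that reconstructs the stage's input."""

    def __init__(self, in_dim: int, hidden_dim: int):
        super().__init__()
        self.encoder = nn.LSTM(in_dim, hidden_dim, batch_first=True)
        self.attn = nn.Linear(hidden_dim, 1)      # scores each time step
        self.decoder = nn.LSTM(hidden_dim, in_dim, batch_first=True)

    def encode(self, x):
        h, _ = self.encoder(x)                    # (B, T, H)
        w = torch.softmax(self.attn(h), dim=1)    # attention over time
        return h * w                              # re-weighted features

    def forward(self, x):
        z = self.encode(x)
        x_hat, _ = self.decoder(z)                # reconstruct the input
        return x_hat


def greedy_pretrain(layers, x, epochs=50, lr=1e-3):
    """Unsupervised layer-wise pre-training: each stage learns to
    reconstruct the (frozen) encoded output of the stage below it."""
    inp = x
    for layer in layers:
        opt = torch.optim.Adam(layer.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            loss = nn.functional.mse_loss(layer(inp), inp)
            loss.backward()
            opt.step()
        with torch.no_grad():
            inp = layer.encode(inp)               # feed features upward
    return layers


# Toy usage: a univariate load sequence, batch of 8, 24 time steps.
layers = [AttentiveLSTMAutoencoderLayer(1, 16),
          AttentiveLSTMAutoencoderLayer(16, 8)]
x = torch.randn(8, 24, 1)
greedy_pretrain(layers, x, epochs=5)
```

After pre-training, the stacked encoders would be fine-tuned end to end with a supervised forecasting head; the sketch omits that step and the paper's input AM on the raw sequence.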
Related papers
(50 records in total)
  • [41] Short-Term Load Forecasting Based on EEMD-WOA-LSTM Combination Model
    Shao, Lei
    Guo, Quanjie
    Li, Chao
    Li, Ji
    Yan, Huilong
    APPLIED BIONICS AND BIOMECHANICS, 2022, 2022
  • [42] A New Hybrid Model Based on SCINet and LSTM for Short-Term Power Load Forecasting
    Liu, Mingping
    Li, Yangze
    Hu, Jiangong
    Wu, Xiaolong
    Deng, Suhui
    Li, Hongqiao
    ENERGIES, 2024, 17 (01)
  • [43] Short-Term Load Forecasting of Microgrid Based on TVFEMD-LSTM-ARMAX Model
    Yin, Yufeng
    Wang, Wenbo
    Yu, Min
    TRANSACTIONS ON ELECTRICAL AND ELECTRONIC MATERIALS, 2024, 25 (03) : 265 - 279
  • [44] Short-term Electric Load Combination Forecasting Model Based on LSTM-LSSVM
    Fang, Lei
    Li, Guoqiang
    Liu, Kun
    Jin, Feng
    Yang, Yuxin
    Guo, Xiao
2024 6TH ASIA ENERGY AND ELECTRICAL ENGINEERING SYMPOSIUM, AEEES 2024, 2024: 1168 - 1173
  • [45] Short-term load forecasting using neural attention model based on EMD
    Meng, Zhaorui
    Xie, Yanqi
    Sun, Jinhua
    ELECTRICAL ENGINEERING, 2022, 104 (03) : 1857 - 1866
  • [47] A Hybrid Residential Short-Term Load Forecasting Method Using Attention Mechanism and Deep Learning
    Ji, Xinhui
    Huang, Huijie
    Chen, Dongsheng
    Yin, Kangning
    Zuo, Yi
    Chen, Zhenping
    Bai, Rui
    BUILDINGS, 2023, 13 (01)
  • [48] Short-term power load forecasting model based on multi-strategy improved WOA optimized LSTM
    Liang Q.
    Wang W.
    Wang Y.
    Applied Mathematics and Nonlinear Sciences, 2024, 9 (01)
  • [49] Short-term load forecasting model based on gated recurrent unit and multi-head attention
    Li Hao
    Zhang Linghua
    Tong Cheng
    Zhou Chenyang
    The Journal of China Universities of Posts and Telecommunications, 2023, 30 (03) : 25 - 31