LSTM enhanced by dual-attention-based encoder-decoder for daily peak load forecasting

Cited: 27
Authors
Zhu, Kedong [1 ]
Li, Yaping [1 ]
Mao, Wenbo [1 ]
Li, Feng [1 ]
Yan, Jiahao [1 ]
Affiliations
[1] China Elect Power Res Inst, Power Automat Dept, Nanrui 8, Nanjing 210003, Jiangsu, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Dual-attention mechanism; Daily peak load forecasting; Encoder-decoder; Long short-term memory (LSTM); PREDICTION; MODEL;
DOI
10.1016/j.epsr.2022.107860
Chinese Library Classification (CLC)
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Classification Codes
0808; 0809;
Abstract
Daily peak load forecasting is a challenging problem in the field of electric power load forecasting. Because the nonlinear, dynamic behavior of the influence factors and their sequential dependencies are significant for modeling the daily peak load, a prediction model based on long short-term memory (LSTM) enhanced by a dual-attention-based encoder-decoder is presented. Serving as the encoder and decoder, LSTM performs the nonlinear dynamic temporal modeling, while the encoder-decoder structure exploits the information in both the influence factors and the historical daily peak load. Moreover, a dual-attention mechanism inserted into the encoder-decoder is designed to account simultaneously for the effects of the different influence factors and of the different time steps on the daily peak load. This mechanism design helps the model capture the characteristics of the daily peak load precisely and achieve more accurate prediction results. Comprehensive experiments are performed on a real dataset from a provincial capital city in eastern China. The case study shows that the proposed methodology provides the most accurate results, with an average MAPE of 2.07%, an average RMSE of 133 MW, and an average MAE of 326.6 MW.
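The abstract describes an LSTM encoder-decoder in which one attention stage reweights the influence factors and a second attention stage reweights the encoder time steps. The following is a minimal PyTorch sketch of that general dual-attention idea (in the spirit of DA-RNN-style models), not the authors' exact network; the layer sizes, the 7-day window, and the way the decoder consumes past peak loads are assumptions made only for illustration.

import torch
import torch.nn as nn

class InputAttentionEncoder(nn.Module):
    """Encoder LSTM with input (feature) attention over the influence factors."""
    def __init__(self, n_features, hidden_size, seq_len):
        super().__init__()
        self.lstm = nn.LSTMCell(n_features, hidden_size)
        # one attention score per influence factor, conditioned on the encoder state
        self.attn = nn.Linear(2 * hidden_size + seq_len, 1)
        self.n_features, self.hidden_size, self.seq_len = n_features, hidden_size, seq_len

    def forward(self, x):                       # x: (batch, seq_len, n_features)
        b = x.size(0)
        h = x.new_zeros(b, self.hidden_size)
        c = x.new_zeros(b, self.hidden_size)
        outputs = []
        for t in range(self.seq_len):
            # score each driving series from its full history and the current LSTM state
            hc = torch.cat([h, c], dim=1).unsqueeze(1).repeat(1, self.n_features, 1)
            series = x.permute(0, 2, 1)          # (batch, n_features, seq_len)
            alpha = torch.softmax(self.attn(torch.cat([hc, series], dim=2)).squeeze(-1), dim=1)
            x_t = alpha * x[:, t, :]             # reweighted influence factors at step t
            h, c = self.lstm(x_t, (h, c))
            outputs.append(h)
        return torch.stack(outputs, dim=1)       # (batch, seq_len, hidden_size)

class TemporalAttentionDecoder(nn.Module):
    """Decoder LSTM with temporal attention over the encoder hidden states."""
    def __init__(self, hidden_size, seq_len):
        super().__init__()
        self.attn = nn.Linear(3 * hidden_size, 1)
        self.lstm = nn.LSTMCell(1, hidden_size)
        self.out = nn.Linear(2 * hidden_size, 1)
        self.hidden_size, self.seq_len = hidden_size, seq_len

    def forward(self, enc, y_hist):              # enc: (b, T, H); y_hist: (b, T, 1) past peak loads
        b = enc.size(0)
        h = enc.new_zeros(b, self.hidden_size)
        c = enc.new_zeros(b, self.hidden_size)
        context = enc.new_zeros(b, self.hidden_size)
        for t in range(self.seq_len):
            # attend over encoder time steps given the current decoder state
            hc = torch.cat([h, c], dim=1).unsqueeze(1).repeat(1, self.seq_len, 1)
            beta = torch.softmax(self.attn(torch.cat([hc, enc], dim=2)).squeeze(-1), dim=1)
            context = torch.bmm(beta.unsqueeze(1), enc).squeeze(1)
            h, c = self.lstm(y_hist[:, t, :], (h, c))
        return self.out(torch.cat([h, context], dim=1))   # next-day peak load estimate

# Usage sketch (hypothetical shapes): a 7-day window of 5 influence factors plus past peak loads.
encoder = InputAttentionEncoder(n_features=5, hidden_size=64, seq_len=7)
decoder = TemporalAttentionDecoder(hidden_size=64, seq_len=7)
x = torch.randn(32, 7, 5)
y_hist = torch.randn(32, 7, 1)
y_pred = decoder(encoder(x), y_hist)             # (32, 1)

The two softmax stages correspond to the "dual" attention in the title: alpha weighs which influence factors matter at each step, and beta weighs which historical days matter for the next-day peak.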
Pages: 9
Related Papers
50 records in total
  • [1] LSTM enhanced by dual-attention-based encoder-decoder for daily peak load forecasting
    Zhu, Kedong
    Li, Yaping
    Mao, Wenbo
    Li, Feng
    Yan, Jiahao
    [J]. Electric Power Systems Research, 2022, 208
  • [2] A Dual Attention Encoder-Decoder Text Summarization Model
    Hakami, Nada Ali
    Mahmoud, Hanan Ahmed Hosni
    [J]. CMC-COMPUTERS MATERIALS & CONTINUA, 2023, 74 (02): : 3697 - 3710
  • [3] Enhanced Attention-Based Encoder-Decoder Framework for Text Recognition
    Prabu, S.
    Sundar, K. Joseph Abraham
    [J]. INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2023, 35 (02): : 2071 - 2086
  • [4] Accurate water quality prediction with attention-based bidirectional LSTM and encoder-decoder
    Bi, Jing
    Chen, Zexian
    Yuan, Haitao
    Zhang, Jia
    [J]. EXPERT SYSTEMS WITH APPLICATIONS, 2024, 238
  • [5] Short-Term Campus Load Forecasting Using CNN-Based Encoder-Decoder Network with Attention
    Ahmed, Zain
    Jamil, Mohsin
    Khan, Ashraf Ali
    [J]. ENERGIES, 2024, 17 (17)
  • [6] Multivariate time series forecasting via attention-based encoder-decoder framework
    Du, Shengdong
    Li, Tianrui
    Yang, Yan
    Horng, Shi-Jinn
    [J]. NEUROCOMPUTING, 2020, 388 : 269 - 279
  • [7] A Hybrid Short-Term Load Forecasting Framework with an Attention-Based Encoder-Decoder Network Based on Seasonal and Trend Adjustment
    Meng, Zhaorui
    Xu, Xianze
    [J]. ENERGIES, 2019, 12 (24)
  • [8] Deep-Learning Forecasting Method for Electric Power Load via Attention-Based Encoder-Decoder with Bayesian Optimization
    Jin, Xue-Bo
    Zheng, Wei-Zhen
    Kong, Jian-Lei
    Wang, Xiao-Yi
    Bai, Yu-Ting
    Su, Ting-Li
    Lin, Seng
    [J]. ENERGIES, 2021, 14 (06)
  • [9] A Novel Dynamic Attack on Classical Ciphers Using an Attention-Based LSTM Encoder-Decoder Model
    Ahmadzadeh, Ezat
    Kim, Hyunil
    Jeong, Ongee
    Moon, Inkyu
    [J]. IEEE ACCESS, 2021, 9 : 60960 - 60970
  • [10] An attention-augmented bidirectional LSTM-based encoder-decoder architecture for electrocardiogram heartbeat classification
    Degachi, Oumayma
    Ouni, Kais
    [J]. TRANSACTIONS OF THE INSTITUTE OF MEASUREMENT AND CONTROL, 2024,