A multi-step ahead global solar radiation prediction method using an attention-based transformer model with an interpretable mechanism

Cited: 21
Authors
Zhou, Yong [1 ,2 ]
Li, Yizhuo [1 ]
Wang, Dengjia [2 ,3 ]
Liu, Yanfeng [2 ,3 ]
Institutions
[1] Xian Univ Architecture & Technol, Sch Management, 13 Yanta Rd, Xian 710055, Peoples R China
[2] Xian Univ Architecture & Technol, State Key Lab Green Bldg Western China, 13 Yanta Rd, Xian 710055, Peoples R China
[3] Xian Univ Architecture & Technol, Sch Bldg Serv Sci & Engn, 13 Yanta Rd, Xian 710055, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Multi-step ahead; Global solar radiation; Model predictions; Sequence-to-sequence model; Transformer model; NEURAL-NETWORK; PERSISTENCE; STRATEGIES; FORECASTS; MACHINE; ZONES;
DOI
10.1016/j.ijhydene.2023.01.068
CLC Number
O64 [Physical chemistry (theoretical chemistry) and chemical physics];
Discipline Code
070304; 081704;
Abstract
The conventional multi-step ahead solar radiation prediction method ignores the time-dependence of the future solar radiation time series. Therefore, drawing on sequence-to-sequence (seq2seq) model theory, this paper proposes a seq2seq long short-term memory model (seq2seq-LSTM), a seq2seq-LSTM model with an attention mechanism (seq2seq-at-LSTM), and a transformer model built entirely on the attention mechanism. Hourly global solar radiation data from Shaanxi, China, covering 2016 to 2018 are used to train and validate the models. The results show that introducing the attention mechanism effectively improves the prediction accuracy of the seq2seq-LSTM model. However, owing to the inherent properties of LSTM, that model still struggles to capture the long-distance dependence of the solar radiation time series. In comparison, the transformer model, based entirely on the attention mechanism, captures this long-distance dependence much better. Furthermore, as the number of time steps increases, prediction performance degrades relatively smoothly and slowly. The obtained average coefficient of determination, root mean square error (RMSE), relative RMSE, and mean bias error are 0.9788, 72.91 W/m², 25.25%, and 38.35 W/m², respectively. In addition, the average skill score of the transformer model is around 44.9%, which is 20.54% higher than that of the seq2seq-at-LSTM model and about 40.84% higher than that of the seq2seq-LSTM model. Moreover, the attention mechanism itself helps explain why the predictions improve relative to the other models. The model developed in this study could also be used for ...
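As a reading aid for the figures quoted above, the sketch below implements the error metrics named in the abstract using the definitions conventional in solar forecasting; it is an illustrative assumption, not the authors' code. In particular, the skill score is assumed to be computed against a persistence baseline (cf. the PERSISTENCE keyword), and the array names y_true, y_pred, and y_pers are hypothetical.

# Minimal sketch of the metrics reported in the abstract (assumed definitions,
# not the authors' code). y_true, y_pred, and y_pers (persistence baseline)
# are 1-D arrays of hourly global solar radiation in W/m^2.
import numpy as np

def rmse(y_true, y_pred):
    """Root mean square error, in W/m^2."""
    return np.sqrt(np.mean((np.asarray(y_pred) - np.asarray(y_true)) ** 2))

def relative_rmse(y_true, y_pred):
    """RMSE normalized by the mean observed irradiance, in %."""
    return 100.0 * rmse(y_true, y_pred) / np.mean(np.asarray(y_true))

def mean_bias_error(y_true, y_pred):
    """Mean bias error, in W/m^2; positive means systematic over-prediction."""
    return np.mean(np.asarray(y_pred) - np.asarray(y_true))

def skill_score(y_true, y_pred, y_pers):
    """Forecast skill vs. persistence: 1 - RMSE_model / RMSE_persistence.
    Multiply by 100 to express as a percentage, as in the abstract."""
    return 1.0 - rmse(y_true, y_pred) / rmse(y_true, y_pers)

Under this convention a score of 0 matches the persistence baseline and 1 is a perfect forecast, so a reported average of about 44.9% would correspond to roughly a 45% reduction in RMSE relative to persistence.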
Pages: 15317-15330
Page count: 14
Related Papers
50 records in total
  • [41] Research on multi-step ahead prediction method for tool wear based on MSTCN-SBiGRU-MHA
    Xue, Jing
    Cheng, Yaonan
    Zhai, Wenjie
    Zhou, Xingwei
    Zhou, Shilong
    ADVANCED ENGINEERING INFORMATICS, 2025, 65
  • [42] A novel interpretable hybrid model for multi-step ahead dissolved oxygen forecasting in the Mississippi River basin
    Ali, Hayder Mohammed
    Ghaleni, Mehdi Mohammadi
    Moghaddasi, Mahnoosh
    Moradi, Mansour
    STOCHASTIC ENVIRONMENTAL RESEARCH AND RISK ASSESSMENT, 2024, 38 (12) : 4629 - 4656
  • [43] One-Step and Multi-Step Ahead Stock Prediction Using Backpropagation Neural Networks
    Dong, Guanqun
    Fataliyev, Kamaladdin
    Wang, Lipo
2013 9TH INTERNATIONAL CONFERENCE ON INFORMATION, COMMUNICATIONS AND SIGNAL PROCESSING (ICICS), 2013
  • [44] An autoencoder wavelet based deep neural network with attention mechanism for multi-step prediction of plant growth
    Alhnaity, Bashar
    Kollias, Stefanos
    Leontidis, Georgios
    Jiang, Shouyong
    Schamp, Bert
    Pearson, Simon
    INFORMATION SCIENCES, 2021, 560 : 35 - 50
  • [45] Enhanced Solar Power Prediction Using Attention-Based DiPLS-BiLSTM Model
    Zhong, Yuanchang
    He, Tengfei
    Mao, Zhongyuan
ELECTRONICS, 2024, 13 (23)
  • [46] Multi-step ahead modeling of reference evapotranspiration using a multi-model approach
    Nourani, Vahid
    Elkiran, Gozen
    Abdullahi, Jazuli
    JOURNAL OF HYDROLOGY, 2020, 581
  • [47] MAP-FCRNN: Multi-step ahead prediction model using forecasting correction and RNN model with memory functions
    Zhang, Rongtao
    Ma, Xueling
    Ding, Weiping
    Zhan, Jianming
    INFORMATION SCIENCES, 2023, 646
  • [48] A novel transformer-based multi-variable multi-step prediction method for chemical process fault prognosis
    Bai, Yiming
    Zhao, Jinsong
    PROCESS SAFETY AND ENVIRONMENTAL PROTECTION, 2023, 169 : 937 - 947
  • [49] Multi-step solar irradiation prediction based on weather forecast and generative deep learning model
    Gao, Yuan
    Miyata, Shohei
    Akashi, Yasunori
    RENEWABLE ENERGY, 2022, 188 : 637 - 650
  • [50] A Kernel Attention-based Transformer Model for Survival Prediction of Heart Disease Patients
    Kaushal, Palak
    Singh, Shailendra
    Vijayvergiya, Rajesh
JOURNAL OF CARDIOVASCULAR TRANSLATIONAL RESEARCH, 2024 : 1295 - 1306