A multi-step ahead global solar radiation prediction method using an attention-based transformer model with an interpretable mechanism

Cited: 21
Authors
Zhou, Yong [1 ,2 ]
Li, Yizhuo [1 ]
Wang, Dengjia [2 ,3 ]
Liu, Yanfeng [2 ,3 ]
Affiliations
[1] Xian Univ Architecture & Technol, Sch Management, 13 Yanta Rd, Xian 710055, Peoples R China
[2] Xian Univ Architecture & Technol, State Key Lab Green Bldg Western China, 13 Yanta Rd, Xian 710055, Peoples R China
[3] Xian Univ Architecture & Technol, Sch Bldg Serv Sci & Engn, 13 Yanta Rd, Xian 710055, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
Multi-step ahead; Global solar radiation; Model predictions; Sequence-to-sequence model; Transformer model; NEURAL-NETWORK; PERSISTENCE; STRATEGIES; FORECASTS; MACHINE; ZONES;
DOI
10.1016/j.ijhydene.2023.01.068
Chinese Library Classification
O64 [Physical chemistry (theoretical chemistry), chemical physics];
Discipline codes
070304 ; 081704 ;
Abstract
The conventional multi-step ahead solar radiation prediction method ignores the time-dependence of a future solar radiation time series. Therefore, according to sequence-to-sequence (seq2seq) model theory, this paper proposes the seq2seq long short-term memory model (seq2seq-LSTM), the seq2seq-LSTM model with an attention mechanism (seq2seq-at-LSTM), and a transformer model, which consists only of the attention mechanism. Hourly global solar radiation data between 2016 and 2018 from Shaanxi, China, are used to train and validate the models. The results show that introducing the attention mechanism effectively improves the prediction accuracy of the seq2seq-LSTM model. However, the model still struggles to capture the long-distance dependence of the solar radiation time series due to the inherent properties of LSTM. In comparison, the transformer model, which is based entirely on the attention mechanism, performs much better at capturing the long-distance dependence of the solar radiation time series. Furthermore, as the number of time-steps increases, the performance of the solar radiation prediction decreases relatively smoothly and slowly. The obtained average coefficient of determination, root mean square error (RMSE), relative RMSE, and mean bias error are 0.9788, 72.91 W/m2, 25.25%, and 38.35 W/m2, respectively. In addition, the average skill score of the transformer model is around 44.9%, which is 20.54% higher than that of the seq2seq-at-LSTM model and about 40.84% higher than that of the seq2seq-LSTM model. Moreover, the attention weights can be used to explain the improved predictions relative to the other models. The model developed in this study could also be used for
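For readers unfamiliar with the mechanism the abstract refers to, the core of a transformer is scaled dot-product attention, in which every time step attends directly to every other time step, which is why long-distance dependence is easier to capture than with an LSTM's sequential state. The following is a minimal illustrative NumPy sketch, not the authors' implementation; the toy series, array shapes, and function name are assumptions:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V.
    Each query row attends to all key rows at once, so a dependency
    between hour 1 and hour 24 is modeled in a single step rather
    than propagated through a recurrent state."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                    # (n_q, n_k) similarities
    scores -= scores.max(axis=-1, keepdims=True)       # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)     # row-wise softmax
    return weights @ V, weights

# Toy embeddings for 24 hourly solar-radiation steps, 8 features each
rng = np.random.default_rng(0)
x = rng.normal(size=(24, 8))
out, attn = scaled_dot_product_attention(x, x, x)      # self-attention
print(out.shape, attn.shape)                           # (24, 8) (24, 24)
```

The `attn` matrix is also what makes the model interpretable: inspecting a row shows which past hours the model weighted when predicting a given step.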
Pages: 15317 - 15330
Page count: 14
Related papers
50 records
  • [31] Multi-step ahead prediction of lake water temperature using neural network and physically-based model
    Chen, Chuqiang
    Xue, Xinhua
    JOURNAL OF HYDRAULIC RESEARCH, 2024, 62 (04) : 370 - 382
  • [32] Interpretable attention-based multi-encoder transformer based QSPR model for assessing toxicity and environmental impact of chemicals
    Kim S.
    Tariq S.
    Heo S.
    Yoo C.
    Chemosphere, 2024, 350
  • [33] PredictPTB: an interpretable preterm birth prediction model using attention-based recurrent neural networks
    AlSaad, Rawan
    Malluhi, Qutaibah
    Boughorbel, Sabri
    BIODATA MINING, 2022, 15 (01)
  • [35] Multi-step ahead wind speed prediction based on a two-step decomposition technique and prediction model parameter optimization
    Wang, He
    Xiong, Min
    Chen, Hongfeng
    Liu, Sumei
    ENERGY REPORTS, 2022, 8 : 6086 - 6100
  • [36] Attention-based Multi-step Short-term Passenger Flow Spatial-temporal Integrated Prediction Model in URT Systems
    Zhang J.
    Chen Y.
    Panchamy K.
    Jin G.
    Wang C.
    Yang L.
    Journal of Geo-Information Science, 2023, 25 (04) : 668 - 713
  • [37] A spatiotemporal hierarchical attention mechanism-based model for multi-step station-level crowd flow prediction
    Zhou, Yirong
    Li, Jun
    Chen, Hao
    Wu, Ye
    Wu, Jiangjiang
    Chen, Luo
    INFORMATION SCIENCES, 2021, 544 : 308 - 324
  • [38] An Interpretable Machine Learning Model for Daily Global Solar Radiation Prediction
    Chaibi, Mohamed
    Benghoulam, El Mahjoub
    Tarik, Lhoussaine
    Berrada, Mohamed
    El Hmaidi, Abdellah
    ENERGIES, 2021, 14 (21)
  • [39] A multi-step decision prediction model based on LightGBM
    Luo, Yuhao
    Xu, Qianfang
    Li, Wenliang
    Jiang, Feng
    Xiao, Bo
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 5714 - 5718
  • [40] Multi-step ahead prediction for electromechanical device using Multivariate SVM predictor
    Zhang, Zhengkai
    Gu, Lichen
    Zhang, Ping
    MECHATRONICS AND INTELLIGENT MATERIALS III, PTS 1-3, 2013, 706-708 : 878 - 881