A multi-step ahead global solar radiation prediction method using an attention-based transformer model with an interpretable mechanism

Cited by: 21
Authors
Zhou, Yong [1 ,2 ]
Li, Yizhuo [1 ]
Wang, Dengjia [2 ,3 ]
Liu, Yanfeng [2 ,3 ]
Affiliations
[1] Xian Univ Architecture & Technol, Sch Management, 13 Yanta Rd, Xian 710055, Peoples R China
[2] Xian Univ Architecture & Technol, State Key Lab Green Bldg Western China, 13 Yanta Rd, Xian 710055, Peoples R China
[3] Xian Univ Architecture & Technol, Sch Bldg Serv Sci & Engn, 13 Yanta Rd, Xian 710055, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation
Keywords
Multi-step ahead; Global solar radiation; Model predictions; Sequence-to-sequence model; Transformer model; NEURAL-NETWORK; PERSISTENCE; STRATEGIES; FORECASTS; MACHINE; ZONES;
DOI
10.1016/j.ijhydene.2023.01.068
Chinese Library Classification (CLC)
O64 [Physical Chemistry (Theoretical Chemistry), Chemical Physics]
Discipline codes
070304; 081704
Abstract
The conventional multi-step ahead solar radiation prediction method ignores the time-dependence of a future solar radiation time series. Therefore, following sequence-to-sequence (seq2seq) model theory, this paper proposes a seq2seq long short-term memory model (seq2seq-LSTM), a seq2seq-LSTM model with an attention mechanism (seq2seq-at-LSTM), and a transformer model, which consists only of the attention mechanism. Hourly global solar radiation data from Shaanxi, China, covering 2016 to 2018, are used to train and validate the models. The results show that introducing the attention mechanism effectively improves the prediction accuracy of the seq2seq-LSTM model. However, the model is still poor at capturing the long-distance dependence of the solar radiation time series, owing to the inherent properties of the LSTM. In comparison, the transformer model, based entirely on the attention mechanism, captures this long-distance dependence much better. Furthermore, as the number of time steps increases, prediction performance degrades relatively smoothly and slowly. The obtained average coefficient of determination, root mean square error (RMSE), relative RMSE, and mean bias error are 0.9788, 72.91 W/m², 25.25%, and 38.35 W/m², respectively. In addition, the average skill score of the transformer model is around 44.9%, which is 20.54% higher than that of the seq2seq-at-LSTM model and about 40.84% higher than that of the seq2seq-LSTM model. Moreover, the attention weights help explain why the predictions improve over those of the other models. The model developed in this study could also be used for
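The record gives no architecture details beyond "a transformer model, which consists only of the attention mechanism", so the following is only a minimal sketch of an attention-based seq2seq forecaster for multi-step ahead prediction, not the authors' implementation. It is written in PyTorch; the class name SolarTransformer, the layer sizes, the 24-hour input window, and the 6-step horizon are all illustrative assumptions.

```python
import math
import torch
import torch.nn as nn

class PositionalEncoding(nn.Module):
    """Standard sinusoidal positional encoding (Vaswani et al., 2017)."""
    def __init__(self, d_model: int, max_len: int = 500):
        super().__init__()
        pos = torch.arange(max_len).unsqueeze(1)
        div = torch.exp(torch.arange(0, d_model, 2) * (-math.log(10000.0) / d_model))
        pe = torch.zeros(max_len, d_model)
        pe[:, 0::2] = torch.sin(pos * div)
        pe[:, 1::2] = torch.cos(pos * div)
        self.register_buffer("pe", pe)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model)
        return x + self.pe[: x.size(1)]

class SolarTransformer(nn.Module):
    """Seq2seq transformer: past radiation window -> next `horizon` hourly values."""
    def __init__(self, d_model: int = 64, nhead: int = 4, num_layers: int = 2):
        super().__init__()
        self.embed = nn.Linear(1, d_model)   # scalar radiation value -> d_model
        self.pos = PositionalEncoding(d_model)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            dim_feedforward=4 * d_model, batch_first=True,
        )
        self.head = nn.Linear(d_model, 1)    # d_model -> scalar radiation value

    def forward(self, src: torch.Tensor, tgt: torch.Tensor) -> torch.Tensor:
        # src: (batch, past_steps, 1) observed window
        # tgt: (batch, horizon, 1) decoder input (teacher forcing during training)
        src = self.pos(self.embed(src))
        tgt = self.pos(self.embed(tgt))
        # Causal mask: forecast step t may not attend to steps > t
        mask = self.transformer.generate_square_subsequent_mask(tgt.size(1)).to(src.device)
        out = self.transformer(src, tgt, tgt_mask=mask)
        return self.head(out)                # (batch, horizon, 1)

model = SolarTransformer()
past = torch.randn(8, 24, 1)      # toy data: last 24 hourly radiation values
future_in = torch.randn(8, 6, 1)  # shifted 6-step targets for teacher forcing
print(model(past, future_in).shape)  # torch.Size([8, 6, 1])
```

At inference time the decoder would instead be run autoregressively, feeding each predicted step back in place of the teacher-forced target; the decoder's attention weights are also the natural hook for the kind of interpretability the title refers to.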
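The abstract quotes RMSE, relative RMSE, mean bias error (MBE), and a skill score, but the record does not spell out the exact formulas or the reference model for the skill score. A minimal sketch, assuming the definitions standard in the solar forecasting literature (relative RMSE as a percentage of the mean observation, and skill score s = 1 - RMSE_model / RMSE_reference against a persistence-type baseline):

```python
import numpy as np

def forecast_metrics(obs, pred):
    """RMSE (W/m^2), relative RMSE (%), and mean bias error (W/m^2)."""
    obs, pred = np.asarray(obs, float), np.asarray(pred, float)
    err = pred - obs
    rmse = float(np.sqrt(np.mean(err ** 2)))
    rrmse = 100.0 * rmse / float(np.mean(obs))  # relative to mean observation
    mbe = float(np.mean(err))                   # positive -> over-prediction
    return rmse, rrmse, mbe

def skill_score(obs, pred, ref_pred):
    """Skill score s = 1 - RMSE_model / RMSE_reference (reference: e.g. persistence)."""
    rmse_m, _, _ = forecast_metrics(obs, pred)
    rmse_r, _, _ = forecast_metrics(obs, ref_pred)
    return 1.0 - rmse_m / rmse_r
```

With these definitions, the reported 44.9% skill score would mean the transformer's RMSE is roughly 55% of the reference model's RMSE; for a persistence reference, ref_pred is simply the observation lagged by the forecast horizon.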
Pages: 15317-15330
Page count: 14
Related papers (50 in total)
  • [1] Dual attention-based multi-step ahead prediction enhancement for monitoring systems in industrial processes. An, Nahyeon; Hong, Seokyoung; Kim, Yurim; Cho, Hyungtae; Lim, Jongkoo; Moon, Il; Kim, Junghwan. APPLIED SOFT COMPUTING, 2023, 147.
  • [2] Attention-Based Models for Multivariate Time Series Forecasting: Multi-step Solar Irradiation Prediction. Sakib, Sadman; Mahadi, Mahin K.; Abir, Samiur R.; Moon, Al-Muzadded; Shafiullah, Ahmad; Ali, Sanjida; Faisal, Fahim; Nishat, Mirza M. HELIYON, 2024, 10 (06).
  • [3] Multi-step solar radiation prediction using transformer: A case study from solar radiation data in Tokyo. Dong, Huagang; Tang, Pengwei; He, Bo; Chen, Lei; Zhang, Zhuangzhuang; Jia, Chengqi. JOURNAL OF BUILDING PHYSICS, 2024, 47 (04): 421-438.
  • [4] Multi-step ahead forecasting of global solar radiation for arid zones using deep learning. Chandola, Deeksha; Gupta, Harsh; Tikkiwal, Vinay Anand; Bohra, Manoj Kumar. INTERNATIONAL CONFERENCE ON COMPUTATIONAL INTELLIGENCE AND DATA SCIENCE, 2020, 167: 626-635.
  • [5] Tool health monitoring and prediction via attention-based encoder-decoder with a multi-step mechanism. Guo, Baosu; Zhang, Qin; Peng, Qinjing; Zhuang, Jichao; Wu, Fenghe; Zhang, Quan. INTERNATIONAL JOURNAL OF ADVANCED MANUFACTURING TECHNOLOGY, 2022, 122 (02): 685-695.
  • [6] A novel multi-step ahead solar power prediction scheme by deep learning on transformer structure. Mo, Fan; Jiao, Xuan; Li, Xingshuo; Du, Yang; Yao, Yunting; Meng, Yuxiang; Ding, Shuye. RENEWABLE ENERGY, 2024, 230.
  • [7] Multi-Step Ahead Fault Prediction Method Based on PCA and EMD. Wang, Shu; Zhao, Zhen; Wang, Fuli; Chang, Yuqing. 2011 CHINESE CONTROL AND DECISION CONFERENCE, VOLS 1-6, 2011: 2874+.
  • [8] Interpretable local flow attention for multi-step traffic flow prediction. Huang, Xu; Zhang, Bowen; Feng, Shanshan; Ye, Yunming; Li, Xutao. NEURAL NETWORKS, 2023, 161: 25-38.
  • [9] Interpretable LSTM Based on Mixture Attention Mechanism for Multi-Step Residential Load Forecasting. Xu, Chongchong; Li, Chaojie; Zhou, Xiaojun. ELECTRONICS, 2022, 11 (14).