Multimodal Fusion of Optimized GRU-LSTM with Self-Attention Layer for Hydrological Time Series Forecasting

Cited: 0
Authors
Kilinc, Huseyin Cagan [1 ]
Apak, Sina [2 ]
Ozkan, Furkan [3 ]
Ergin, Mahmut Esad [1 ]
Yurtsever, Adem [4 ]
Affiliations
[1] Istanbul Aydin Univ, Dept Civil Engn, Istanbul, Turkiye
[2] Istanbul Aydin Univ, Dept Management Informat Technol, Istanbul, Turkiye
[3] Cukurova Univ, Dept Comp Engn, Adana, Turkiye
[4] Istanbul Univ Cerrahpasa, Dept Environm Engn, Istanbul, Turkiye
Keywords
Streamflow forecasting; Particle Swarm Optimization; Bidirectional Long Short-Term Memory; Bidirectional Gated Recurrent Unit; Fusion features; Attention layers
DOI
10.1007/s11269-024-03943-4
CLC Number
TU [Building Science];
Discipline Code
0813;
Abstract
Accurate streamflow forecasting is crucial for effective basin management, regional agricultural policy development, environmental impact analysis, soil and water conservation studies, and flood protection planning. This study proposes a novel approach that integrates particle swarm optimization (PSO) with bidirectional long short-term memory (Bi-LSTM) and bidirectional gated recurrent unit (Bi-GRU) architectures, augmented by feature fusion and attention layers. The approach consistently outperforms traditional methods across multiple datasets, including Ahmethacı, Büyükincirli, and Ersil, achieving lower RMSE and MAE and higher KGE and BF scores. Specifically, on Ahmethacı the method yields an RMSE of 3.448, an MAE of 1.224, and an R² of 0.886; on Büyükincirli it records an RMSE of 0.085, an MAE of 0.040, and an R² of 0.964; and on Ersil it achieves an RMSE of 1.495, an MAE of 0.565, and an R² of 0.883. These results underscore the effectiveness of the proposed approach for streamflow forecasting.
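For context, the PSO component named in the abstract (used there to tune the recurrent networks' hyperparameters) can be sketched in a few lines of NumPy. This is a minimal, generic PSO on a toy objective, not the authors' implementation; the function name, coefficient values, and the sphere objective are illustrative assumptions.

```python
# Minimal particle swarm optimization (PSO) sketch -- illustrative only,
# not the authors' implementation. Hyperparameters are assumptions.
import numpy as np

def pso_minimize(objective, dim, n_particles=30, iters=200,
                 w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0), seed=0):
    """Return the best position and objective value found."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, dim))   # particle positions
    vel = np.zeros_like(pos)                             # particle velocities
    pbest = pos.copy()                                   # personal bests
    pbest_val = np.apply_along_axis(objective, 1, pos)
    g = pbest[np.argmin(pbest_val)].copy()               # global best
    g_val = pbest_val.min()
    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # inertia + cognitive pull toward pbest + social pull toward global best
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.apply_along_axis(objective, 1, pos)
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if vals.min() < g_val:
            g_val = vals.min()
            g = pos[np.argmin(vals)].copy()
    return g, g_val

# Toy usage: minimize the sphere function, whose optimum is 0 at the origin.
best_x, best_f = pso_minimize(lambda x: float(np.sum(x**2)), dim=3)
print(best_f)
```

In the paper's setting, the objective would instead score a candidate hyperparameter vector (e.g. hidden units, learning rate) by a validation-error metric of the Bi-LSTM/Bi-GRU model.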
Pages: 6045-6062
Page count: 18