A Transformer-Based Framework for Parameter Learning of a Land Surface Hydrological Process Model

Cited by: 1
Authors:
Li, Klin [1 ]
Lu, Yutong [1 ]
Affiliation:
[1] Sun Yat-sen University, School of Computer Science and Engineering, Guangzhou 510006, People's Republic of China
Funding:
National Natural Science Foundation of China
Keywords:
parameters calibration; transformer; SMAP observation; soil moisture prediction; deep learning; MODIS evapotranspiration data; SOIL-MOISTURE; OPTIMIZATION; CALIBRATION; DECADES; WATER;
DOI:
10.3390/rs15143536
CLC classification:
X [Environmental Science, Safety Science]
Discipline codes:
08; 0830
Abstract:
The effective representation of land surface hydrological models strongly relies on spatially varying parameters that require calibration. Well-calibrated physical models can effectively propagate observed information to unobserved variables, but traditional calibration methods often yield nonunique solutions. In this paper, we propose a hydrological parameter calibration training framework consisting of a transformer-based parameter learning model (ParaFormer) and an LSTM-based surrogate model. On the one hand, ParaFormer uses self-attention mechanisms to learn a global mapping from observed data to the parameters to be calibrated that captures spatial correlations. On the other hand, the surrogate model takes the calibrated parameters as inputs and simulates the observable variables, such as soil moisture, overcoming the difficulty of directly coupling complex hydrological models with a deep learning (DL) platform in a hybrid training scheme. Using the Variable Infiltration Capacity (VIC) model as the reference, we test the performance of ParaFormer on datasets of different resolutions. The results demonstrate that, in predicting soil moisture and in transferring the calibrated parameters to evapotranspiration prediction, ParaFormer learns more effective and robust parameter mapping patterns than traditional and state-of-the-art DL-based parameter calibration methods.
Pages: 18
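To make the hybrid training scheme described in the abstract concrete, the following minimal PyTorch sketch pairs a transformer-based parameter learner with a frozen LSTM surrogate so that gradients from the soil-moisture mismatch flow back into the parameter mapping. All class names, variable counts, and tensor shapes (ParameterLearner, LSTMSurrogate, n_params=6, etc.) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (assumed names/shapes; not the paper's code).
import torch
import torch.nn as nn

class ParameterLearner(nn.Module):
    """Transformer encoder: observed sequences -> calibrated parameters in [0, 1]."""
    def __init__(self, n_obs_vars=4, n_params=6, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        self.embed = nn.Linear(n_obs_vars, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=128, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, n_params)

    def forward(self, obs):                  # obs: (batch, time, n_obs_vars)
        h = self.encoder(self.embed(obs))    # self-attention over the sequence
        return torch.sigmoid(self.head(h.mean(dim=1)))  # (batch, n_params)

class LSTMSurrogate(nn.Module):
    """Differentiable surrogate: (forcings, parameters) -> simulated soil moisture."""
    def __init__(self, n_forcing_vars=3, n_params=6, hidden=64):
        super().__init__()
        self.lstm = nn.LSTM(n_forcing_vars + n_params, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, forcing, params):      # forcing: (batch, time, n_forcing_vars)
        p = params.unsqueeze(1).expand(-1, forcing.size(1), -1)  # repeat per time step
        h, _ = self.lstm(torch.cat([forcing, p], dim=-1))
        return self.out(h).squeeze(-1)       # (batch, time)

def train_step(learner, surrogate, obs, forcing, sm_target, optimizer):
    """Hybrid step: loss on simulated soil moisture updates only the learner."""
    params = learner(obs)
    sm_sim = surrogate(forcing, params)
    loss = nn.functional.mse_loss(sm_sim, sm_target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    B, T = 8, 30
    learner, surrogate = ParameterLearner(), LSTMSurrogate()
    for p in surrogate.parameters():         # surrogate assumed pre-trained on model runs
        p.requires_grad_(False)
    opt = torch.optim.Adam(learner.parameters(), lr=1e-3)
    obs, forcing, sm = torch.randn(B, T, 4), torch.randn(B, T, 3), torch.rand(B, T)
    print(train_step(learner, surrogate, obs, forcing, sm, opt))
```

The frozen surrogate stands in for the physical hydrological model: because it is differentiable, the observation-level loss can be backpropagated through it to train the parameter learner end to end, which is the role the abstract assigns to the LSTM surrogate in the hybrid training scheme.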