A Transformer-Based Framework for Parameter Learning of a Land Surface Hydrological Process Model

Cited by: 1
Authors
Li, Klin [1 ]
Lu, Yutong [1 ]
Affiliation
[1] Sun Yat-sen University, School of Computer Science and Engineering, Guangzhou 510006, People's Republic of China
Funding
National Natural Science Foundation of China
Keywords
parameter calibration; transformer; SMAP observation; soil moisture prediction; deep learning; MODIS evapotranspiration data; soil moisture; optimization; calibration; decades; water
DOI
10.3390/rs15143536
Chinese Library Classification (CLC) number
X [Environmental Science, Safety Science]
Subject classification code
08; 0830
Abstract
The effective representation of land surface hydrological models strongly relies on spatially varying parameters that require calibration. Well-calibrated physical models can effectively propagate observed information to unobserved variables, but traditional calibration methods often yield nonunique solutions. In this paper, we propose a hydrological parameter calibration training framework consisting of a transformer-based parameter learning model (ParaFormer) and an LSTM-based surrogate model. On the one hand, ParaFormer uses self-attention to learn a global mapping from observed data to the parameters to be calibrated, thereby capturing spatial correlations. On the other hand, the surrogate model takes the calibrated parameters as input and simulates observable variables such as soil moisture, overcoming the difficulty of directly coupling a complex hydrological model with a deep learning (DL) platform in a hybrid training scheme. Using the Variable Infiltration Capacity (VIC) model as the reference, we test the performance of ParaFormer on datasets of different resolutions. The results demonstrate that, in predicting soil moisture and in transferring the calibrated parameters to the task of evapotranspiration prediction, ParaFormer learns more effective and robust parameter mappings than traditional and state-of-the-art DL-based calibration methods.
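To make the architecture concrete, the sketch below shows one way the hybrid training scheme described above could be wired together. It is a minimal illustration assuming PyTorch: the module names (ParameterLearner, Surrogate), tensor shapes, and hyperparameters are all hypothetical, and the frozen LSTM stands in for a surrogate pretrained on VIC simulations; this is not the authors' implementation.

```python
# Illustrative sketch (assumptions, not the paper's code) of the hybrid scheme:
# a transformer maps observations to calibrated parameters, and a frozen,
# differentiable LSTM surrogate maps parameters plus forcings to soil moisture.
import torch
import torch.nn as nn

class ParameterLearner(nn.Module):
    """Transformer encoder over grid cells: self-attention captures spatial
    correlations; a head regresses the parameters to be calibrated."""
    def __init__(self, obs_dim: int, n_params: int, d_model: int = 64):
        super().__init__()
        self.embed = nn.Linear(obs_dim, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Sigmoid keeps each parameter in (0, 1), i.e. a normalized feasible range.
        self.head = nn.Sequential(nn.Linear(d_model, n_params), nn.Sigmoid())

    def forward(self, obs):  # obs: (batch, n_cells, obs_dim)
        return self.head(self.encoder(self.embed(obs)))  # (batch, n_cells, n_params)

class Surrogate(nn.Module):
    """LSTM surrogate of the hydrological model: given meteorological forcings
    and a parameter set, it simulates the observable variable (soil moisture)."""
    def __init__(self, forcing_dim: int, n_params: int, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(forcing_dim + n_params, hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, forcing, params):  # forcing: (B, T, F), params: (B, n_params)
        p = params.unsqueeze(1).expand(-1, forcing.size(1), -1)  # repeat over time
        h, _ = self.lstm(torch.cat([forcing, p], dim=-1))
        return self.out(h).squeeze(-1)  # (B, T) simulated soil moisture

# One hybrid training step: only the transformer is updated; the frozen
# surrogate propagates the simulation error back to the parameter estimates.
learner = ParameterLearner(obs_dim=8, n_params=5)
surrogate = Surrogate(forcing_dim=4, n_params=5)
surrogate.requires_grad_(False)  # assumed pretrained on VIC simulations
opt = torch.optim.Adam(learner.parameters(), lr=1e-3)

obs = torch.randn(2, 16, 8)           # (batch, grid cells, observed features)
forcing = torch.randn(2 * 16, 30, 4)  # per-cell forcings over 30 time steps
target_sm = torch.rand(2 * 16, 30)    # SMAP-like soil moisture observations

params = learner(obs).reshape(-1, 5)  # one parameter set per grid cell
loss = nn.functional.mse_loss(surrogate(forcing, params), target_sm)
opt.zero_grad(); loss.backward(); opt.step()
```

Freezing the surrogate while keeping it differentiable is the point of the scheme: the mismatch between simulated and observed soil moisture can back-propagate through the surrogate to the transformer's parameter estimates, which would not be possible with the original, non-differentiable hydrological model in the loop.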
Pages: 18