Self-supervised learning for climate downscaling

Cited by: 1
Authors
Singh, Karandeep [1 ]
Jeong, Chaeyoon [1 ,2 ]
Park, Sungwon [1 ,2 ]
Babur, Arjun N. [3 ,4 ]
Zeller, Elke [3 ,4 ]
Cha, Meeyoung [1 ,2 ]
Affiliations
[1] Inst for Basic Sci Korea, Data Sci Grp, Daejeon, South Korea
[2] Korea Adv Inst Sci & Technol, Sch Comp, Daejeon, South Korea
[3] IBS, Ctr Climate Phys, Busan, South Korea
[4] PNU, Dept Climate Syst, Busan, South Korea
Keywords
Earth system models; Climate simulation; Super-resolution; Self-supervised learning
DOI
10.1109/BigComp57234.2023.00012
Chinese Library Classification (CLC)
TP39 [computer applications]
Subject classification
081203; 0835
Abstract
Earth system models (ESMs) are computer models that quantitatively simulate the Earth's climate system. These models form the basis of modern research on climate change and its effects on our planet. Advances in computational technologies and simulation methodologies have enabled ESMs to produce simulation outputs at a finer level of detail, which is important for policy planning and research at the regional level. Because ESMs couple many physical domains and environmental variables, the computational cost of running simulations at finer resolutions is prohibitively high. In practice, coarse-level simulations are mapped onto the regional level through a process called "downscaling". This paper presents a self-supervised deep-learning solution for climate downscaling that does not require high-resolution ground-truth data during model training. We introduce a self-supervised convolutional neural network (CNN) super-resolution model that trains on a single data instance at a time and adapts to its underlying data patterns at runtime. Experimental results demonstrate that the proposed model consistently improves climate downscaling performance over widely used baselines by a large margin.
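The abstract describes a super-resolution model that adapts to a single coarse data instance at runtime without high-resolution ground truth. As a rough illustration only, and not the authors' actual architecture, the sketch below shows the general idea behind single-instance self-supervised super-resolution in the spirit of zero-shot SR: the coarse field is downsampled further to create self-supervised training pairs, a small CNN is fitted to them on the fly, and the fitted network is then applied to the original coarse field. All class names, function names, and hyperparameters here are hypothetical.

```python
# Illustrative sketch (assumptions, not the paper's method): zero-shot-style
# self-supervised super-resolution of a single coarse climate field.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SmallSRNet(nn.Module):
    """Tiny CNN that refines a bilinearly upsampled field via a residual."""
    def __init__(self, channels: int = 1, width: int = 32, depth: int = 4):
        super().__init__()
        layers = [nn.Conv2d(channels, width, 3, padding=1), nn.ReLU()]
        for _ in range(depth - 2):
            layers += [nn.Conv2d(width, width, 3, padding=1), nn.ReLU()]
        layers += [nn.Conv2d(width, channels, 3, padding=1)]
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        return x + self.body(x)  # predict a residual correction

def downscale_single_instance(coarse: torch.Tensor, scale: int = 4,
                              steps: int = 200, lr: float = 1e-3) -> torch.Tensor:
    """coarse: (1, C, H, W) low-resolution field, e.g. surface temperature."""
    net = SmallSRNet(channels=coarse.shape[1])
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(steps):
        # Self-supervision: the coarse field itself acts as the target for
        # an even coarser version of it, so no high-res ground truth is needed.
        lower = F.interpolate(coarse, scale_factor=1.0 / scale,
                              mode="bilinear", align_corners=False)
        inp = F.interpolate(lower, size=coarse.shape[-2:],
                            mode="bilinear", align_corners=False)
        loss = F.l1_loss(net(inp), coarse)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Apply the instance-adapted network to the original coarse field.
    with torch.no_grad():
        up = F.interpolate(coarse, scale_factor=scale,
                           mode="bilinear", align_corners=False)
        return net(up)
```

Because the network is re-fitted for every input field, this style of approach trades inference speed for the ability to adapt to the statistics of each individual instance, which is consistent with the "adapt to its underlying data patterns at runtime" behavior described in the abstract.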
Pages: 13-17
Page count: 5
Related papers
50 records in total
  • [1] Self-Supervised Vision for Climate Downscaling
    Singh, Karandeep
    Jeong, Chaeyoon
    Shidqi, Naufal
    Park, Sungwon
    Nellikkatti, Arjun
    Zeller, Elke
    Cha, Meeyoung
    PROCEEDINGS OF THE THIRTY-THIRD INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2024, 2024, : 7456 - 7464
  • [2] Gated Self-supervised Learning for Improving Supervised Learning
    Fuadi, Erland Hillman
    Ruslim, Aristo Renaldo
    Wardhana, Putu Wahyu Kusuma
    Yudistira, Novanto
    2024 IEEE CONFERENCE ON ARTIFICIAL INTELLIGENCE, CAI 2024, 2024, : 611 - 615
  • [3] Self-Supervised Dialogue Learning
    Wu, Jiawei
    Wang, Xin
    Wang, William Yang
    57TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS (ACL 2019), 2019, : 3857 - 3867
  • [4] Longitudinal self-supervised learning
    Zhao, Qingyu
    Liu, Zixuan
    Adeli, Ehsan
    Pohl, Kilian M.
    MEDICAL IMAGE ANALYSIS, 2021, 71
  • [5] Self-supervised learning model
    Saga, Kazushie
    Sugasaka, Tamami
    Sekiguchi, Minoru
    Fujitsu Scientific and Technical Journal, 1993, 29 (03): 209 - 216
  • [6] Credal Self-Supervised Learning
    Lienen, Julian
    Huellermeier, Eyke
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [7] Self-Supervised Learning for Recommendation
    Huang, Chao
    Xia, Lianghao
    Wang, Xiang
    He, Xiangnan
    Yin, Dawei
    PROCEEDINGS OF THE 31ST ACM INTERNATIONAL CONFERENCE ON INFORMATION AND KNOWLEDGE MANAGEMENT, CIKM 2022, 2022, : 5136 - 5139
  • [8] Quantum self-supervised learning
    Jaderberg, B.
    Anderson, L. W.
    Xie, W.
    Albanie, S.
    Kiffner, M.
    Jaksch, D.
    QUANTUM SCIENCE AND TECHNOLOGY, 2022, 7 (03)
  • [9] Self-Supervised Learning for Electroencephalography
    Rafiei, Mohammad H.
    Gauthier, Lynne V.
    Adeli, Hojjat
    Takabi, Daniel
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2024, 35 (02) : 1457 - 1471
  • [10] A New Self-supervised Method for Supervised Learning
    Yang, Yuhang
    Ding, Zilin
    Cheng, Xuan
    Wang, Xiaomin
    Liu, Ming
    INTERNATIONAL CONFERENCE ON COMPUTER VISION, APPLICATION, AND DESIGN (CVAD 2021), 2021, 12155