Generating Views Using Atmospheric Correction for Contrastive Self-Supervised Learning of Multispectral Images

Times Cited: 0
Authors
Patnala, Ankit [1 ]
Stadtler, Scarlet [1 ]
Schultz, Martin G. [1]
Gall, Juergen [2 ,3 ]
Affiliations
[1] Forschungszentrum Julich, Inst Adv Simulat, D-52428 Julich, Germany
[2] Univ Bonn, Dept Informat Syst & Artificial Intelligence, D-53113 Bonn, Germany
[3] Lamarr Inst Machine Learning & Artificial Intelligence, D-44227 Dortmund, Germany
Keywords
Atmospheric modeling; Remote sensing; Task analysis; Land surface; Atmospheric measurements; Vegetation mapping; Image color analysis; Contrastive learning; landcover classification; remote sensing; self-supervised learning; transformations
DOI
10.1109/LGRS.2023.3274493
Chinese Library Classification
P3 [Geophysics]; P59 [Geochemistry]
Discipline Classification Codes
0708; 070902
Abstract
In remote sensing, a wealth of multispectral images is publicly available from various land-cover satellite missions. Contrastive self-supervised learning is commonly applied to such unlabeled data, but its success depends on domain-specific transformations for generating views. When the focus is on vegetation, standard image-processing transformations cannot be applied to the near-infrared (NIR) channel, which carries valuable information about the vegetation state. We therefore use contrastive learning on different views of unlabeled multispectral images to obtain a pretrained model that improves accuracy on small remote sensing datasets. This study presents the generation of additional views tailored to remote sensing images, using atmospheric correction as a physically consistent alternative to color jittering. The proposed transformation integrates easily with multiple channels to exploit the spectral signatures of objects and can be applied to other remote sensing tasks. Using it improves classification accuracy by up to 6%.
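To make the view-generation idea concrete, below is a minimal PyTorch sketch of SimCLR-style contrastive pretraining on multispectral patches. It is not the authors' implementation: the reflectance_jitter function, the toy 4-band encoder, and all shapes and parameter ranges are illustrative assumptions. In the paper, the second view comes from rerunning atmospheric correction under varied atmospheric parameterizations; here a per-channel affine perturbation of reflectances stands in for that step.

import torch
import torch.nn.functional as F

def reflectance_jitter(x, gain_range=(0.9, 1.1), offset_range=(-0.02, 0.02)):
    # Hypothetical stand-in for atmospheric-correction-based view generation:
    # a per-channel affine perturbation of surface reflectances in [0, 1].
    # x: (B, C, H, W) tensor of multispectral patches.
    b, c, _, _ = x.shape
    gain = torch.empty(b, c, 1, 1).uniform_(*gain_range)
    offset = torch.empty(b, c, 1, 1).uniform_(*offset_range)
    return (x * gain + offset).clamp(0.0, 1.0)

def nt_xent(z1, z2, temperature=0.5):
    # Standard NT-Xent (SimCLR) contrastive loss between two view batches.
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2B, D) unit vectors
    sim = z @ z.t() / temperature                       # cosine similarity logits
    sim.fill_diagonal_(float("-inf"))                   # exclude self-similarity
    n = z.shape[0]
    targets = torch.arange(n).roll(n // 2)              # positive = other view of same patch
    return F.cross_entropy(sim, targets)

# Toy 4-band encoder (e.g., R, G, B, NIR); any multispectral backbone would do.
encoder = torch.nn.Sequential(
    torch.nn.Conv2d(4, 32, kernel_size=3, stride=2, padding=1),
    torch.nn.ReLU(),
    torch.nn.AdaptiveAvgPool2d(1),
    torch.nn.Flatten(),
    torch.nn.Linear(32, 64),
)

x = torch.rand(8, 4, 64, 64)                 # batch of unlabeled patches
v1, v2 = reflectance_jitter(x), reflectance_jitter(x)
loss = nt_xent(encoder(v1), encoder(v2))
loss.backward()

The point the sketch illustrates is that the perturbation acts on physical reflectances band by band, so it extends naturally to the NIR channel, whereas standard color jittering is defined only for RGB.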
Pages: 5
Related Papers (50 records in total)
  • [1] Self-supervised contrastive learning on agricultural images
    Guldenring, Ronja
    Nalpantidis, Lazaros
    [J]. COMPUTERS AND ELECTRONICS IN AGRICULTURE, 2021, 191
  • [2] A Review of Predictive and Contrastive Self-supervised Learning for Medical Images
    Wang, Wei-Chien
    Ahn, Euijoon
    Feng, Dagan
    Kim, Jinman
    [J]. MACHINE INTELLIGENCE RESEARCH, 2023, 20 (04) : 483 - 513
  • [3] A Survey on Contrastive Self-Supervised Learning
    Jaiswal, Ashish
    Babu, Ashwin Ramesh
    Zadeh, Mohammad Zaki
    Banerjee, Debapriya
    Makedon, Fillia
    [J]. TECHNOLOGIES, 2021, 9 (01)
  • [4] Adversarial Self-Supervised Contrastive Learning
    Kim, Minseon
    Tack, Jihoon
    Hwang, Sung Ju
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS (NEURIPS 2020), 2020, 33
  • [5] Self-Supervised Learning: Generative or Contrastive
    Liu, Xiao
    Zhang, Fanjin
    Hou, Zhenyu
    Mian, Li
    Wang, Zhaoyu
    Zhang, Jing
    Tang, Jie
    [J]. IEEE TRANSACTIONS ON KNOWLEDGE AND DATA ENGINEERING, 2023, 35 (01) : 857 - 876
  • [6] Self-Supervised Triplet Contrastive Learning for Classifying Endometrial Histopathological Images
    Zhao, Fengjun
    Wang, Zhiwei
    Du, Hongyan
    He, Xiaowei
    Cao, Xin
    [J]. IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2023, 27 (12) : 5970 - 5981
  • [7] Contrastive Self-supervised Representation Learning Using Synthetic Data
    She, Dong-Yu
    Xu, Kun
    [J]. INTERNATIONAL JOURNAL OF AUTOMATION AND COMPUTING, 2021, 18 (04) : 556 - 567