Self-Supervised Pretraining of Transformers for Satellite Image Time Series Classification

Cited by: 91
Authors
Yuan, Yuan [1 ]
Lin, Lei [2 ]
Affiliations
[1] Nanjing Univ Posts & Telecommun, Sch Geog & Biol Informat, Nanjing 210023, Peoples R China
[2] Beijing Qihoo Technol Co Ltd, Beijing 100015, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Bidirectional encoder representations from Transformers (BERT); classification; satellite image time series (SITS); self-supervised learning; transfer learning; unsupervised pretraining; LAND-COVER CLASSIFICATION; CROP CLASSIFICATION; REPRESENTATION;
DOI
10.1109/JSTARS.2020.3036602
CLC Classification
TM [Electrical Technology]; TN [Electronic & Communication Technology];
Subject Classification Codes
0808 ; 0809 ;
Abstract
Satellite image time series (SITS) classification is a major research topic in remote sensing and is relevant for a wide range of applications. Deep learning approaches are commonly employed for SITS classification and provide state-of-the-art performance. However, deep learning methods suffer from overfitting when labeled data are scarce. To address this problem, we propose a novel self-supervised pretraining scheme to initialize a transformer-based network by utilizing large-scale unlabeled data. Specifically, the model is asked to predict randomly contaminated observations given an entire time series of a pixel. The main idea of our proposal is to leverage the inherent temporal structure of satellite time series to learn general-purpose spectral-temporal representations related to land cover semantics. Once pretraining is completed, the pretrained network can be further adapted to various SITS classification tasks by fine-tuning all the model parameters on small-scale task-related labeled data. In this way, the general knowledge and representations about SITS can be transferred to a label-scarce task, thereby improving the generalization performance of the model and reducing the risk of overfitting. Comprehensive experiments have been carried out on three benchmark datasets over large study areas. Experimental results demonstrate the effectiveness of the proposed pretraining scheme, which yields substantial improvements in classification accuracy for transformer, 1-D convolutional neural network, and bidirectional long short-term memory backbones. The code and the pretrained model will be available at https://github.com/linlei1214/SITS-BERT upon publication.
Pages: 474 - 487
Page count: 14
Related Papers
50 records in total
  • [1] Progressive Self-Supervised Pretraining for Hyperspectral Image Classification
    Guan, Peiyan
    Lam, Edmund Y.
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 13
  • [2] Self-Supervised Pretraining Improves Self-Supervised Pretraining
    Reed, Colorado J.
    Yue, Xiangyu
    Nrusimha, Ani
    Ebrahimi, Sayna
    Vijaykumar, Vivek
    Mao, Richard
    Li, Bo
    Zhang, Shanghang
    Guillory, Devin
    Metzger, Sean
    Keutzer, Kurt
    Darrell, Trevor
    2022 IEEE WINTER CONFERENCE ON APPLICATIONS OF COMPUTER VISION (WACV 2022), 2022, : 1050 - 1060
  • [3] FactoFormer: Factorized Hyperspectral Transformers With Self-Supervised Pretraining
    Mohamed, Shaheer
    Haghighat, Maryam
    Fernando, Tharindu
    Sridharan, Sridha
    Fookes, Clinton
    Moghadam, Peyman
    IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, 2024, 62 : 1 - 14
  • [4] Self-supervised transformers for turbulent flow time series
    Drikakis, Dimitris
    Kokkinakis, Ioannis William
    Fung, Daryl
    Spottswood, S. Michael
    PHYSICS OF FLUIDS, 2024, 36 (06)
  • [5] SELF-SUPERVISED SPATIO-TEMPORAL REPRESENTATION LEARNING OF SATELLITE IMAGE TIME SERIES
    Dumeur, Iris
    Valero, Silvia
    Inglada, Jordi
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 642 - 645
  • [6] Self-Supervised Spatio-Temporal Representation Learning of Satellite Image Time Series
    Dumeur, Iris
    Valero, Silvia
    Inglada, Jordi
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 4350 - 4367
  • [7] Improving Medical Image Classification in Noisy Labels Using only Self-supervised Pretraining
    Khanal, Bidur
    Bhattarai, Binod
    Khanal, Bishesh
    Linte, Cristian A.
    DATA ENGINEERING IN MEDICAL IMAGING, DEMI 2023, 2023, 14314 : 78 - 90
  • [8] Self-supervised Learning for Semi-supervised Time Series Classification
    Jawed, Shayan
    Grabocka, Josif
    Schmidt-Thieme, Lars
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2020, PT I, 2020, 12084 : 499 - 511
  • [9] Self-Supervised Pre-training for Time Series Classification
    Shi, Pengxiang
    Ye, Wenwen
    Qin, Zheng
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021
  • [10] InverseTime: A Self-Supervised Technique for Semi-Supervised Classification of Time Series
    Goyo, Manuel Alejandro
    Nanculef, Ricardo
    Valle, Carlos
    IEEE ACCESS, 2024, 12 : 165081 - 165093