Dynamic and Adaptive Self-Training for Semi-Supervised Remote Sensing Image Semantic Segmentation

Cited by: 0
Authors
Jin, Jidong [1 ,2 ,3 ,4 ]
Lu, Wanxuan [1 ,2 ]
Yu, Hongfeng [1 ,2 ]
Rong, Xuee [1 ,2 ,3 ,4 ]
Sun, Xian [1 ,2 ,3 ,4 ]
Wu, Yirong [1 ,2 ,3 ,4 ]
Affiliations
[1] Chinese Acad Sci, Aerosp Informat Res Inst, Inst Elect, Beijing 100190, Peoples R China
[2] Chinese Acad Sci, Inst Elect, Key Lab Network Informat Syst Technol NIST, Beijing 100190, Peoples R China
[3] Univ Chinese Acad Sci, Beijing 100190, Peoples R China
[4] Univ Chinese Acad Sci, Sch Elect Elect & Commun Engn, Beijing 100190, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Remote sensing; Semantic segmentation; Transformers; Data models; Training; Semantics; Predictive models; Consistency regularization (CR); remote sensing (RS) image; self-training; semantic segmentation; semisupervised learning (SSL);
DOI
10.1109/TGRS.2024.3407142
CLC classification
P3 [Geophysics]; P59 [Geochemistry];
Discipline codes
0708; 070902;
Abstract
Remote sensing (RS) technology has made remarkable progress, providing a wealth of data for applications such as ecological conservation and urban planning. However, meticulously annotating these data is labor-intensive, leading to a shortage of labeled data, particularly for tasks such as semantic segmentation. Semi-supervised methods, which combine consistency regularization (CR) with self-training, offer a way to exploit labeled and unlabeled data efficiently, but they face challenges when the labeled-to-unlabeled ratio is highly imbalanced. To tackle these challenges, we introduce a self-training approach named dynamic and adaptive self-training (DAST), which combines dynamic pseudo-label sampling (DPS), distribution matching (DM), and adaptive threshold updating (ATU). DPS addresses class-distribution imbalance by prioritizing classes with fewer samples. Meanwhile, DM and ATU reduce distribution disparities by adjusting model predictions across augmented images within the CR framework, ensuring they align with the actual data distribution. Experimental results on the Potsdam and iSAID datasets demonstrate that DAST effectively balances class distribution, aligns model predictions with the data distribution, and stabilizes pseudo-labels, leading to state-of-the-art performance on both datasets. These findings highlight the potential of DAST for overcoming the challenges associated with significant disparities in labeled-to-unlabeled data ratios.
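The two mechanisms the abstract names, prioritizing rare classes when sampling pseudo-labels (DPS) and updating per-class confidence thresholds over time (ATU), can be sketched as follows. This is an illustrative sketch only: the function names, the EMA-based threshold update, and the inverse-frequency weighting are assumptions for exposition, not the paper's exact formulation.

```python
import numpy as np

def update_class_thresholds(thresholds, confidences, preds, momentum=0.9):
    """Adaptive threshold updating (ATU-style sketch).

    Each class keeps its own pseudo-label confidence threshold, nudged
    toward the mean confidence of the pixels currently predicted as that
    class via an exponential moving average.

    thresholds:  (C,) current per-class thresholds
    confidences: (N,) max softmax confidence per pixel
    preds:       (N,) argmax class per pixel
    """
    new = thresholds.copy()
    for c in range(len(thresholds)):
        mask = preds == c
        if mask.any():
            new[c] = momentum * thresholds[c] + (1 - momentum) * confidences[mask].mean()
    return new

def rarity_sampling_weights(preds, num_classes):
    """Dynamic pseudo-label sampling (DPS-style sketch).

    Weight each class inversely to its predicted frequency so that
    under-represented classes are sampled more often; weights sum to 1.
    """
    counts = np.bincount(preds, minlength=num_classes).astype(float)
    inv = 1.0 / np.maximum(counts, 1.0)  # guard against empty classes
    return inv / inv.sum()
```

In a full pipeline, a pixel's prediction would be kept as a pseudo-label only when its confidence exceeds its class's current threshold, and the rarity weights would bias which pseudo-labeled pixels enter the training batch.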
Pages: 1 - 1
Page count: 14
Related papers
(50 total)
  • [41] Adaptive Adversarial Self-Training for Semi-Supervised Object Detection in Complex Maritime Scenes
    Feng, Junjian
    Tian, Lianfang
    Li, Xiangxia
    MATHEMATICS, 2024, 12 (15)
  • [42] Weakly-Supervised Semantic Segmentation via Self-training
    Cheng, Hao
    Gu, Chaochen
    Wu, Kaijie
    2020 4TH INTERNATIONAL CONFERENCE ON CONTROL ENGINEERING AND ARTIFICIAL INTELLIGENCE (CCEAI 2020), 2020, 1487
  • [43] STCRNet: A Semi-Supervised Network Based on Self-Training and Consistency Regularization for Change Detection in VHR Remote Sensing Images
    Wang, Lukang
    Zhang, Min
    Shi, Wenzhong
    IEEE JOURNAL OF SELECTED TOPICS IN APPLIED EARTH OBSERVATIONS AND REMOTE SENSING, 2024, 17 : 2272 - 2282
  • [44] Semi-Supervised Hyperspectral Image Classification via Spatial-Regulated Self-Training
    Wu, Yue
    Mu, Guifeng
    Qin, Can
    Miao, Qiguang
    Ma, Wenping
    Zhang, Xiangrong
    REMOTE SENSING, 2020, 12 (01)
  • [45] Semi-supervised learning with ensemble self-training for cancer classification
    Wang, Qingyong
    Xia, Liang-Yong
    Chai, Hua
    Zhou, Yun
    2018 IEEE SMARTWORLD, UBIQUITOUS INTELLIGENCE & COMPUTING, ADVANCED & TRUSTED COMPUTING, SCALABLE COMPUTING & COMMUNICATIONS, CLOUD & BIG DATA COMPUTING, INTERNET OF PEOPLE AND SMART CITY INNOVATION (SMARTWORLD/SCALCOM/UIC/ATC/CBDCOM/IOP/SCI), 2018, : 796 - 803
  • [46] Self-Training using Selection Network for Semi-supervised Learning
    Jeong, Jisoo
    Lee, Seungeui
    Kwak, Nojun
    ICPRAM: PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE ON PATTERN RECOGNITION APPLICATIONS AND METHODS, 2020, : 23 - 32
  • [47] An Auto-Adjustable Semi-Supervised Self-Training Algorithm
    Livieris, Ioannis E.
    Kanavos, Andreas
    Tampakas, Vassilis
    Pintelas, Panagiotis
    ALGORITHMS, 2018, 11 (09)
  • [48] Semi-Supervised Meta-Learning via Self-Training
    Zhou, Meng
    Li, Yaoyi
    Lu, Hongtao
    Cai, Nengbin
    Zhao, Xuejun
    2020 THE 3RD INTERNATIONAL CONFERENCE ON INTELLIGENT AUTONOMOUS SYSTEMS (ICOIAS'2020), 2020, : 1 - 7
  • [49] Improving semi-supervised self-training with embedded manifold transduction
    Tao, Ye
    Zhang, Duzhou
    Cheng, Shengjun
    Tang, Xianglong
    TRANSACTIONS OF THE INSTITUTE OF MEASUREMENT AND CONTROL, 2018, 40 (02) : 363 - 374
  • [50] Semi-supervised classification algorithm for hyperspectral remote sensing image based on DE-self-training
    Key Laboratory for Virtual Geographic Environment, Ministry of Education, Nanjing Normal University, Nanjing 210023, China
    NONGYE JIXIE XUEBAO, 5 : 239 - 244