An unsupervised domain adaptation deep learning method for spatial and temporal transferable crop type mapping using Sentinel-2 imagery

Cited by: 29
Authors
Wang, Yumiao [1 ,2 ]
Feng, Luwei [3 ]
Zhang, Zhou [4 ]
Tian, Feng [3 ,5 ]
Affiliations
[1] Ningbo Univ, Dept Geog & Spatial Informat Tech, Ningbo 315211, Peoples R China
[2] Minist Nat Resources, Key Lab Urban Land Resources Monitoring & Simulat, Shenzhen 518034, Peoples R China
[3] Wuhan Univ, Sch Remote Sensing & Informat Engn, Wuhan 430079, Peoples R China
[4] Univ Wisconsin Madison, Dept Biol Syst Engn, Madison, WI 53706 USA
[5] Hubei Luojia Lab, Wuhan 430079, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Crop type mapping; Unsupervised domain adaptation; Time-series imagery; Transfer learning; RED-EDGE BANDS; LAND-COVER; CLASSIFICATION; CHLOROPHYLL; NDVI
DOI
10.1016/j.isprsjprs.2023.04.002
Chinese Library Classification
P9 [Physical Geography]
Discipline Codes
0705; 070501
Abstract
Accurate crop type mapping is essential for crop growth monitoring and yield estimation. Recently, various machine learning methods have been increasingly used for crop type mapping, but they often lose their validity when directly applied to other regions and years due to differences in the distribution of source and target data, that is, domain shift. To address this problem, we developed a deep adaptation crop classification network (DACCN) based on the idea of unsupervised domain adaptation (UDA). The proposed DACCN mainly consists of two parts: a feature extractor that converts the original input into high-level representations, and a domain aligner in which the discrepancy between the source and target distributions is measured using the multiple-kernel variant of maximum mean discrepancy (MK-MMD). Four states in the United States (U.S.) Corn Belt and two provinces in northeastern China were used as study areas, where samples for model building and accuracy evaluation were collected from time-series Sentinel-2 imagery and reference maps in 2018 and 2019. Three experiments were then designed to verify the transferability of DACCN across space, time, and space-time, respectively. In each experiment, the proposed DACCN was compared to a deep crop classification network (DCCN), a model with a similar structure to DACCN but without the domain adaptation mechanism, and two machine learning methods, random forest (RF) and support vector machines (SVM). The experimental results showed that DACCN outperformed the other models in most transfer cases, with overall classification accuracies ranging from 0.835 to 0.922. DACCN also performed better in spatially continuous mapping, producing crop type maps more consistent with the reference maps. As an innovative application of transfer learning in crop type mapping, the methodology proposed in this study effectively addressed the problem of missing labels in target domains and alleviated the negative impact of domain shift.
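The domain aligner described above measures the source-target discrepancy with MK-MMD: the squared distance between the mean embeddings of the two feature distributions under a mixture of kernels. The following is a minimal NumPy sketch of that statistic, not the authors' implementation; the Gaussian kernel family and the bandwidth values are illustrative assumptions, and in the paper this quantity would be computed on the feature extractor's outputs and minimized as part of the training loss.

```python
import numpy as np

def multi_kernel(x, y, bandwidths):
    """Sum of Gaussian kernels over several bandwidths (the 'multiple kernel' part)."""
    # pairwise squared Euclidean distances between rows of x and rows of y
    d2 = ((x[:, None, :] - y[None, :, :]) ** 2).sum(axis=-1)
    return sum(np.exp(-d2 / (2.0 * b ** 2)) for b in bandwidths)

def mk_mmd(source, target, bandwidths=(1.0, 2.0, 4.0)):
    """Biased estimate of squared MMD between source and target feature batches."""
    k_ss = multi_kernel(source, source, bandwidths).mean()
    k_tt = multi_kernel(target, target, bandwidths).mean()
    k_st = multi_kernel(source, target, bandwidths).mean()
    # distance between mean kernel embeddings; ~0 when distributions match
    return k_ss + k_tt - 2.0 * k_st
```

During adaptation the network would minimize a combined objective such as (source classification loss) + lambda * mk_mmd(source_features, target_features), pulling the two feature distributions together without requiring target labels.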
Pages: 102-117 (16 pages)