Sentinel SAR-optical fusion for crop type mapping using deep learning and Google Earth Engine

Cited by: 117
Authors
Adrian, Jarrett [1 ,2 ]
Sagan, Vasit [1 ,2 ]
Maimaitijiang, Maitiniyazi [1 ,2 ]
Affiliations
[1] St Louis Univ, Geospatial Inst, 3694 West Pine Mall, St Louis, MO 63108 USA
[2] St Louis Univ, Dept Earth & Atmospher Sci, 3642 Lindell Blvd, St Louis, MO 63108 USA
Funding
US National Science Foundation (NSF); National Aeronautics and Space Administration (NASA);
Keywords
3D U-Net; Denoising neural networks; Sentinel-1; Sentinel-2; Data fusion; INSTANCE SEGMENTATION; LAND-COVER; CLASSIFICATION; RAPESEED; NETWORK;
DOI
10.1016/j.isprsjprs.2021.02.018
Chinese Library Classification: P9 [Physical Geography]
Subject classification codes: 0705; 070501
Abstract
Accurate crop type mapping provides numerous benefits for a deeper understanding of food systems and yield prediction. Ever-increasing big data, easy access to high-resolution imagery, and cloud-based analytics platforms like Google Earth Engine have drastically improved scientists' ability to advance data-driven agriculture with improved algorithms for crop type mapping using remote sensing, computer vision, and machine learning. While crop type mapping techniques have mainly relied on standalone SAR or optical imagery, few studies have investigated the potential of SAR-optical data fusion coupled with virtual constellations and 3-dimensional (3D) deep learning networks. To this end, we use a deep learning approach that utilizes the denoised backscatter and texture information from multi-temporal Sentinel-1 SAR data and the spectral information from multi-temporal Sentinel-2 optical data to map ten different crop types, as well as water, soil, and urban areas. Multi-temporal Sentinel-1 data was fused with multi-temporal Sentinel-2 optical data in an effort to improve classification accuracies for crop types. We compared the results of the 3D U-Net to state-of-the-art deep learning networks, including SegNet and 2D U-Net, as well as commonly used machine learning methods such as Random Forest.
The results showed that (1) fusing multi-temporal SAR and optical data yields higher training overall accuracies (OA) (3D U-Net 0.992, 2D U-Net 0.943, SegNet 0.871) and testing OA (3D U-Net 0.941, 2D U-Net 0.847, SegNet 0.643) for crop type mapping compared to standalone multi-temporal SAR or optical data; (2) optical data fused with SAR data denoised via a denoising convolutional neural network (OA 0.912) performed better for crop type mapping than optical data fused with boxcar- (OA 0.880), Lee- (OA 0.881), or median-filtered (OA 0.887) SAR data; and (3) 3D convolutional neural networks perform better than 2D convolutional neural networks for crop type mapping (SAR OA 0.912, optical OA 0.937, fused OA 0.992).
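The fusion approach described in the abstract can be illustrated with a minimal sketch, which is not the authors' implementation: multi-temporal Sentinel-1 SAR and Sentinel-2 optical patches are concatenated along the channel axis into a single 4D cube, the kind of tensor a 3D network convolves over with time as an extra axis, whereas a 2D network sees time flattened into channels. All shapes and band counts below are assumptions for the demo.

```python
import numpy as np

# Assumed dimensions for illustration only
T, H, W = 6, 64, 64          # time steps, patch height, patch width
SAR_BANDS = 2                # e.g. Sentinel-1 VV and VH backscatter
OPT_BANDS = 4                # e.g. Sentinel-2 B2, B3, B4, B8 at 10 m

sar = np.random.rand(T, H, W, SAR_BANDS).astype(np.float32)
opt = np.random.rand(T, H, W, OPT_BANDS).astype(np.float32)

# Pixel-level (early) fusion: concatenate along the channel axis, so every
# time step carries both SAR backscatter/texture and optical reflectance.
fused = np.concatenate([sar, opt], axis=-1)          # (T, H, W, 6)

# A 2D network would instead receive time flattened into channels:
flat_2d = fused.transpose(1, 2, 0, 3).reshape(H, W, T * (SAR_BANDS + OPT_BANDS))

print(fused.shape)    # (6, 64, 64, 6)
print(flat_2d.shape)  # (64, 64, 36)
```

The 3D input preserves temporal ordering so convolutions can learn phenological patterns across dates, which the abstract credits for the 3D U-Net's advantage over 2D networks.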
Pages: 215-235 (21 pages)
Related papers (50 in total)
  • [1] Sentinel SAR-optical fusion for improving in-season wheat crop mapping at a large scale using machine learning and the Google Earth engine platform
    Zoungrana, Louis Evence
    Barbouchi, Meriem
    Toukabri, Wael
    Babasy, Mohamedou Ould
    Khatra, Nabil Ben
    Annabi, Mohamed
    Bahri, Haithem
    APPLIED GEOMATICS, 2024, 16 (01) : 147 - 160
  • [3] Deep learning based crop-type mapping using SAR and optical data fusion
    Hamidi, Masoumeh
    Homayouni, Saeid
    Safari, Abdolreza
    Hasani, Hadiseh
    INTERNATIONAL JOURNAL OF APPLIED EARTH OBSERVATION AND GEOINFORMATION, 2024, 129
  • [4] Continental-scale mapping of soil pH with SAR-optical fusion based on long-term earth observation data in google earth engine
    Geng, Yajun
    Zhou, Tao
    Zhang, Zhenhua
    Cui, Buli
    Sun, Junna
    Zeng, Lin
    Yang, Runya
    Wu, Nan
    Liu, Tingting
    Pan, Jianjun
    Si, Bingcheng
    Lausch, Angela
    ECOLOGICAL INDICATORS, 2024, 165
  • [5] MAPPING CALIFORNIA RICE USING OPTICAL AND SAR DATA FUSION WITH PHENOLOGICAL FEATURES IN GOOGLE EARTH ENGINE
    Li, Wenzhao
    El-Askary, Hesham
    Struppa, Daniele C.
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 5619 - 5622
  • [6] INVESTIGATING SAR-OPTICAL DEEP LEARNING DATA FUSION TO MAP THE BRAZILIAN CERRADO VEGETATION WITH SENTINEL DATA
    Silva Filho, Paulo
    Persello, Claudio
    Maretto, Raian V.
    Machado, Renato
    IGARSS 2023 - 2023 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2023, : 1365 - 1368
  • [7] Google Earth Engine app using Sentinel 1 SAR and deep learning for ocean seep methane detection and monitoring
    Hernandez-Hamon, Hernando
    Ramirez, Paula Zapata
    Zaraza, Maycol
    Micallef, Aaron
    REMOTE SENSING APPLICATIONS-SOCIETY AND ENVIRONMENT, 2023, 32
  • [8] Rice crop growth monitoring with sentinel 1 SAR data using machine learning models in google earth engine cloud
    Singha, Chiranjit
    Swain, Kishore Chandra
    REMOTE SENSING APPLICATIONS-SOCIETY AND ENVIRONMENT, 2023, 32
  • [9] Mapping Coastal Aquaculture Ponds of China Using Sentinel SAR Images in 2020 and Google Earth Engine
    Tian, Peng
    Liu, Yongchao
    Li, Jialin
    Pu, Ruiliang
    Cao, Luodan
    Zhang, Haitao
    Ai, Shunyi
    Yang, Yunze
    REMOTE SENSING, 2022, 14 (21)
  • [10] DEEP LEARNING FOR SAR-OPTICAL IMAGE MATCHING
    Hughes, Lloyd Haydn
    Merkle, Nina
    Buergmann, Tatjana
    Auer, Stefan
    Schmitt, Michael
    2019 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS 2019), 2019, : 4877 - 4880