Wetland Classification Using Deep Convolutional Neural Network

Cited: 0
Authors
Mahdianpari, Masoud [1 ,2 ]
Rezaee, Mohammad [3 ]
Zhang, Yun [3 ]
Salehi, Bahram [1 ,2 ]
Affiliations
[1] Mem Univ Newfoundland, C CORE, St John, NF A1B 3X5, Canada
[2] Mem Univ Newfoundland, Dept Elect Engn, St John, NF A1B 3X5, Canada
[3] Univ New Brunswick, Dept Geodesy & Geomat Engn, Lab Adv Geomat Image Proc, CRC, Fredericton, NB E3B 5A3, Canada
Source
IGARSS 2018 - 2018 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM | 2018
Keywords
Convolutional Neural Network; high-level features; AlexNet; Random Forest; Machine Learning; wetland mapping;
DOI
None available
Chinese Library Classification
TM [Electrical Technology]; TN [Electronic Technology, Communication Technology];
Discipline Codes
0808; 0809;
Abstract
The synergistic use of spatial features with the spectral properties of satellite images enhances thematic land cover information. This study addresses the lack of high-level features by proposing a classification framework based on a convolutional neural network (CNN) to learn deep spatial features for wetland mapping. In particular, a CNN model was used to classify remote sensing imagery with a limited amount of training data by fine-tuning a pre-trained CNN (AlexNet). The classification results obtained by the deep CNN were compared with those of a well-known ensemble classifier, Random Forest (RF), to evaluate the efficiency of the CNN. Experimental results demonstrated that the CNN was superior to RF for complex wetland mapping, even though the CNN used only a small number of input features (i.e., three) compared to RF. The proposed classification scheme serves as a baseline framework to facilitate further scientific research using the latest state-of-the-art machine learning tools for processing remote sensing data.
Pages: 9249 - 9252
Page count: 4
Related Papers
(50 records in total)
  • [11] The skin cancer classification using deep convolutional neural network
    Dorj, Ulzii-Orshikh
    Lee, Keun-Kwang
    Choi, Jae-Young
    Lee, Malrey
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (08) : 9909 - 9924
  • [13] Dari Speech Classification Using Deep Convolutional Neural Network
    Dawodi, Mursal
    Baktash, Jawid Ahamd
    Wada, Tomohisa
    Alam, Najwa
    Joya, Mohammad Zarif
    2020 IEEE INTERNATIONAL IOT, ELECTRONICS AND MECHATRONICS CONFERENCE (IEMTRONICS 2020), 2020, : 110 - 113
  • [14] Plant species classification using deep convolutional neural network
    Dyrmann, Mads
    Karstoft, Henrik
    Midtiby, Henrik Skov
    BIOSYSTEMS ENGINEERING, 2016, 151 : 72 - 80
  • [15] Wetland classification method using fully convolutional neural network and Stacking algorithm
    Zhang M.
    Lin H.
    Long X.
    Transactions of the Chinese Society of Agricultural Engineering, 36: 257 - 264
  • [16] Sound Classification Using Convolutional Neural Network and Tensor Deep Stacking Network
    Khamparia, Aditya
    Gupta, Deepak
    Nhu Gia Nguyen
    Khanna, Ashish
    Pandey, Babita
    Tiwari, Prayag
    IEEE ACCESS, 2019, 7 : 7717 - 7727
  • [17] Breeds Classification with Deep Convolutional Neural Network
    Zhang, Yicheng
    Gao, Jipeng
    Zhou, Haolin
    ICMLC 2020: 2020 12TH INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND COMPUTING, 2020: 145 - 151
  • [18] CONVOLUTIONAL NEURAL NETWORK FOR COASTAL WETLAND CLASSIFICATION IN HYPERSPECTRAL IMAGE
    Liu, Chang
    Zhang, Mengmeng
    Li, Wei
    Sun, Weiwei
    Tao, Ran
    IGARSS 2020 - 2020 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM, 2020, : 5104 - 5107
  • [19] Extracting Wetland Type Information with a Deep Convolutional Neural Network
    Guan, XianMing
    Wang, Di
    Wan, Luhe
    Zhang, Jiyi
    COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2022, 2022
  • [20] Event Detection and Classification Using Deep Compressed Convolutional Neural Network
    Swapnika, K.
    Vasumathi, D.
    INTERNATIONAL JOURNAL OF ADVANCED COMPUTER SCIENCE AND APPLICATIONS, 2022, 13 (12) : 312 - 322