Small-Sample Seabed Sediment Classification Based on Deep Learning

Cited by: 3
Authors
Zhao, Yuxin [1 ,2 ]
Zhu, Kexin [1 ,2 ]
Zhao, Ting [3 ]
Zheng, Liangfeng [1 ,2 ]
Deng, Xiong [1 ,2 ]
Affiliations
[1] Harbin Engn Univ, Coll Intelligent Syst Sci & Engn, Harbin 150001, Peoples R China
[2] Minist Educ, Engn Res Ctr Nav Instruments, Harbin 150001, Peoples R China
[3] Harbin Engn Univ, Coll Underwater Acoust Engn, Harbin 150001, Peoples R China
Keywords
acoustic remote sensing; seabed sediment classification; small-sample; side-scan sonar; self-attention generative adversarial network; self-attention densely connected convolutional network; SCAN SONAR IMAGES; RECOGNITION; MODEL;
DOI
10.3390/rs15082178
Chinese Library Classification (CLC)
X [Environmental Science, Safety Science];
Subject Classification
08 ; 0830 ;
Abstract
Seabed sediment classification is of great significance in acoustic remote sensing. Accurately classifying seabed sediments normally requires a large amount of training data; however, acquiring seabed sediment information is expensive and time-consuming, which makes it crucial to design a well-performing classifier from small-sample seabed sediment data. To mitigate the data shortage, a self-attention generative adversarial network (SAGAN) was trained for data augmentation in this study. SAGAN consists of a generator, which produces data similar to real images, and a discriminator, which distinguishes real images from generated ones. Furthermore, a new seabed sediment classifier based on a self-attention densely connected convolutional network (SADenseNet) is proposed to improve classification accuracy. The SADenseNet was trained on the augmented images to improve classification performance. The self-attention mechanism scans the whole image to capture global features of the sediment image and highlights key regions, improving the efficiency and accuracy of visual information processing. The proposed SADenseNet trained with the augmented dataset achieved the best performance, with classification accuracies of 92.31%, 95.72%, 97.85%, and 95.28% for rock, sand, mud, and overall, respectively, and a kappa coefficient of 0.934. Compared with training on the original dataset, the twelve classifiers trained with the augmented dataset improved classification accuracy by 2.25%, 5.12%, 0.97%, and 2.64% for rock, sand, mud, and overall, respectively, and the kappa coefficient by 0.041. In this study, SAGAN enriches the features of the data, giving the trained classification networks better generalization. Compared with state-of-the-art classifiers, the proposed SADenseNet delivers better classification performance.
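The self-attention mechanism mentioned in the abstract (used in both SAGAN and SADenseNet) computes, for every spatial position of a feature map, a weighted sum over all other positions. The sketch below is a minimal NumPy illustration of scaled dot-product self-attention over a flattened feature map; it is not the authors' implementation, and the projection matrices `wq`, `wk`, `wv` (learned parameters in a real network) are filled with random values purely for demonstration.

```python
import numpy as np

def self_attention(x, wq, wk, wv):
    """Scaled dot-product self-attention over n positions with d-dim features.

    x          : (n, d) flattened feature map
    wq, wk, wv : (d, d) query/key/value projection matrices (learned in practice)
    """
    q, k, v = x @ wq, x @ wk, x @ wv
    scores = q @ k.T / np.sqrt(k.shape[1])         # (n, n) pairwise similarities
    scores -= scores.max(axis=1, keepdims=True)    # stabilize the softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)  # softmax over all positions
    return weights @ v                             # each position attends globally

rng = np.random.default_rng(0)
n, d = 16, 8                                       # e.g. a 4x4 feature map, 8 channels
x = rng.normal(size=(n, d))
wq, wk, wv = (rng.normal(size=(d, d)) for _ in range(3))
out = self_attention(x, wq, wk, wv)
print(out.shape)                                   # same shape as the input map
```

Because every output position mixes information from the entire map, the network can relate distant sediment textures, which is what the abstract refers to as capturing "global features" and highlighting key regions.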
Pages: 24
Related Papers
50 records
  • [41] Radar Moving Target Detection Based on Small-Sample Transfer Learning and Attention Mechanism
    Zhu, Jiang
    Wen, Cai
    Duan, Chongdi
    Wang, Weiwei
    Yang, Xiaochao
    REMOTE SENSING, 2024, 16 (22)
  • [42] Small-sample cucumber disease identification based on multimodal self-supervised learning
    Cao, Yiyi
    Sun, Guangling
    Yuan, Yuan
    Chen, Lei
    CROP PROTECTION, 2025, 188
  • [43] Image-Text Dual Model for Small-Sample Image Classification
    Zhu, Fangyi
    Li, Xiaoxu
    Ma, Zhanyu
    Chen, Guang
    Peng, Pai
    Guo, Xiaowei
    Chien, Jen-Tzung
    Guo, Jun
    COMPUTER VISION, PT II, 2017, 772 : 556 - 565
  • [44] SMALL-SAMPLE CLASSIFICATION OF HYPERSPECTRAL DATA IN A GRAPH-BASED SEMI-SUPERVISION FRAMEWORK
    Zhang, Chunmei
    Wang, Junyan
    Zhang, Yunbin
    Liu, Yaoyao
    2017 IEEE INTERNATIONAL GEOSCIENCE AND REMOTE SENSING SYMPOSIUM (IGARSS), 2017, : 3194 - 3197
  • [45] Is there correlation between the estimated and true classification errors in small-sample settings?
    Hanczar, Blaise
    Hua, Jianping
    Dougherty, Edward R.
    2007 IEEE/SP 14TH WORKSHOP ON STATISTICAL SIGNAL PROCESSING, VOLS 1 AND 2, 2007, : 16 - +
  • [46] A Small-Sample Text Classification Model Based on Pseudo-Label Fusion Clustering Algorithm
    Yang, Linda
    Huang, Baohua
    Guo, Shiqian
    Lin, Yunjie
    Zhao, Tong
    APPLIED SCIENCES-BASEL, 2023, 13 (08):
  • [47] SMALL-SAMPLE INTERVALS FOR REGRESSION
    TINGLEY, MA
    CANADIAN JOURNAL OF STATISTICS-REVUE CANADIENNE DE STATISTIQUE, 1992, 20 (03): : 271 - 280
  • [48] HCNM: Hierarchical cognitive neural model for small-sample image classification
    Jin, Dequan
    Li, Ruoge
    Xiang, Nan
    Zhao, Di
    Xiang, Xuanlu
    Ying, Shihui
    EXPERT SYSTEMS WITH APPLICATIONS, 2025, 276
  • [49] Student and small-sample theory
    Lehmann, EL
    STATISTICAL SCIENCE, 1999, 14 (04) : 418 - 426
  • [50] Research on the Deep Learning of the Small Sample Data based on Transfer Learning
    Zhao, Wei
    GREEN ENERGY AND SUSTAINABLE DEVELOPMENT I, 2017, 1864