Coastal Aquaculture Area Extraction Based on Self-Attention Mechanism and Auxiliary Loss

Cited by: 10
Authors
Ai, Bo [1 ]
Xiao, Heng [1 ]
Xu, Hanwen [1 ]
Yuan, Feng [2 ]
Ling, Mengyun
Affiliations
[1] Shandong Univ Sci & Technol, Coll Geodesy & Geomat, Qingdao 266590, Peoples R China
[2] Guangdong Ocean Univ, Guangdong Ocean Dev Planning Res Ctr, Guangzhou 510220, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Feature extraction; Aquaculture; Satellites; Neural networks; Image segmentation; Data mining; Sea measurements; Aquaculture area extraction; auxiliary loss; self-attention mechanism; deep learning;
DOI
10.1109/JSTARS.2022.3230081
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronic Technology, Communication Technology];
Discipline Code
0808 ; 0809 ;
Abstract
With the development of deep learning for satellite remote sensing image segmentation, convolutional neural networks have achieved better results than traditional methods. In fully convolutional networks, the number of layers is usually increased to obtain deep features, but the vanishing gradient problem arises as the network deepens. Many researchers have obtained multiscale features by using different convolution operations. We aim to obtain multiscale features within the network structure while capturing contextual information by other means. This article presents a self-attention mechanism and auxiliary loss network (SAMALNet) to address these problems. We adopt a self-attention strategy in the atrous spatial pyramid pooling module to extract multiscale features while accounting for contextual information, and we add an auxiliary loss to mitigate the vanishing gradient problem. Experimental results for extracting aquaculture areas in the Jiaozhou Bay area of Qingdao from high-resolution GF-2 satellite images show that SAMALNet generally outperforms the UPS-Net, SegNet, DeepLabv3, UNet, DeepLabv3+, and PSPNet architectures, achieving a recall of 96.34%, precision of 95.91%, F1 score of 96.12%, and MIoU of 92.60%. SAMALNet also delineated aquaculture area boundaries more accurately than the other networks listed above. Accurate aquaculture area extraction can provide data support for the rational planning and environmental protection of coastal aquaculture areas and promote their more rational use.
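The abstract's two core ideas, self-attention applied inside the atrous spatial pyramid pooling (ASPP) module and an auxiliary loss that gives deep layers an extra gradient path, can be illustrated with a minimal sketch in PyTorch. This is an assumption-laden illustration, not the authors' released SAMALNet code: the module names, single-head dot-product attention, dilation rates (1, 6, 12, 18), and the 0.4 auxiliary weight are all illustrative choices.

    # Minimal sketch, assuming PyTorch; names and hyperparameters are illustrative,
    # not the paper's implementation.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F


    class SelfAttention2d(nn.Module):
        """Single-head dot-product self-attention over the spatial positions of a feature map."""

        def __init__(self, channels: int):
            super().__init__()
            self.query = nn.Conv2d(channels, channels // 8, kernel_size=1)
            self.key = nn.Conv2d(channels, channels // 8, kernel_size=1)
            self.value = nn.Conv2d(channels, channels, kernel_size=1)
            self.gamma = nn.Parameter(torch.zeros(1))  # learnable weight of the attention branch

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            b, c, h, w = x.shape
            q = self.query(x).flatten(2).transpose(1, 2)            # (b, h*w, c//8)
            k = self.key(x).flatten(2)                              # (b, c//8, h*w)
            v = self.value(x).flatten(2)                            # (b, c, h*w)
            attn = torch.softmax(q @ k / (c // 8) ** 0.5, dim=-1)   # (b, h*w, h*w) affinities
            out = (v @ attn.transpose(1, 2)).view(b, c, h, w)       # aggregate context per position
            return self.gamma * out + x                             # residual connection


    class ASPPWithAttention(nn.Module):
        """ASPP block whose fused multiscale output is refined by self-attention."""

        def __init__(self, in_ch: int, out_ch: int, rates=(1, 6, 12, 18)):
            super().__init__()
            self.branches = nn.ModuleList([
                nn.Sequential(
                    nn.Conv2d(in_ch, out_ch, 3, padding=r, dilation=r, bias=False),
                    nn.BatchNorm2d(out_ch),
                    nn.ReLU(inplace=True),
                )
                for r in rates
            ])
            self.fuse = nn.Conv2d(out_ch * len(rates), out_ch, kernel_size=1)
            self.attention = SelfAttention2d(out_ch)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            feats = torch.cat([branch(x) for branch in self.branches], dim=1)  # multiscale features
            return self.attention(self.fuse(feats))                            # add contextual information


    def segmentation_loss(main_logits, aux_logits, target, aux_weight: float = 0.4):
        """Main cross-entropy loss plus a down-weighted auxiliary loss from an intermediate head."""
        main = F.cross_entropy(main_logits, target)
        aux = F.cross_entropy(aux_logits, target)   # extra gradient signal for earlier layers
        return main + aux_weight * aux

For example, ASPPWithAttention(256, 64)(torch.randn(1, 256, 32, 32)) yields a (1, 64, 32, 32) tensor, and during training segmentation_loss combines the main head's prediction with an auxiliary head's prediction so that earlier layers receive an additional gradient signal.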
Pages: 2250-2261
Number of pages: 12
Related Papers
50 records in total
  • [21] A Text Sentiment Analysis Model Based on Self-Attention Mechanism
    Ji, Likun
    Gong, Ping
    Yao, Zhuyu
    2019 THE 3RD INTERNATIONAL CONFERENCE ON HIGH PERFORMANCE COMPILATION, COMPUTING AND COMMUNICATIONS (HP3C 2019), 2019, : 33 - 37
  • [22] Bearing Fault Detection Based on Convolutional Self-Attention Mechanism
    Ye, Ruida
    Wang, Weijie
    Ren, Yuan
    Zhang, Keming
    PROCEEDINGS OF 2020 IEEE 2ND INTERNATIONAL CONFERENCE ON CIVIL AVIATION SAFETY AND INFORMATION TECHNOLOGY (ICCASIT), 2020, : 869 - 873
  • [23] Long-Tailed Recognition Based on Self-attention Mechanism
    Feng, Zekai
    Jia, Hong
    Li, Mengke
    ADVANCED INTELLIGENT COMPUTING TECHNOLOGY AND APPLICATIONS, PT II, ICIC 2024, 2024, 14876 : 380 - 391
  • [24] Chinese Entity Relation Extraction Based on Multi-level Gated Recurrent Mechanism and Self-attention
    Zhong, Zicheng
    PROCEEDINGS OF 2021 2ND INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE AND INFORMATION SYSTEMS (ICAIIS '21), 2021,
  • [25] Auxiliary Information Guided Self-attention for Image Quality Assessment
    Yang, Jifan
    Wang, Zhongyuan
    Wang, Guangcheng
    Huang, Baojin
    Yang, Yuhong
    Tu, Weiping
    ACM TRANSACTIONS ON MULTIMEDIA COMPUTING COMMUNICATIONS AND APPLICATIONS, 2024, 20 (04)
  • [26] 3D Object Detection Based on Voxel Self-Attention Auxiliary Networks
    Cao, Jie
    Peng, Yiqiang
    Fan, Likang
    Wang, Longfei
    LASER & OPTOELECTRONICS PROGRESS, 2024, 61 (24)
  • [27] Bathymetry estimation for coastal regions using self-attention
    Zhang, Xiaoxiong
    Al Shehhi, Maryam R.
    SCIENTIFIC REPORTS, 2025, 15 (01):
  • [28] Image classification model based on large kernel attention mechanism and relative position self-attention mechanism
    Liu, Siqi
    Wei, Jiangshu
    Liu, Gang
    Zhou, Bei
    PEERJ COMPUTER SCIENCE, 2023, 9
  • [29] Linear Complexity Randomized Self-attention Mechanism
    Zheng, Lin
    Wang, Chong
    Kong, Lingpeng
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 162, 2022,