AS-Net: Attention Synergy Network for skin lesion segmentation

Cited by: 38
Authors
Hu, Kai [1 ,2 ]
Lu, Jing [1 ]
Lee, Dongjin [3 ,4 ]
Xiong, Dapeng [3 ,4 ]
Chen, Zhineng [5 ]
Affiliations
[1] Xiangtan Univ, Key Lab Intelligent Comp & Informat Proc, Minist Educ, Xiangtan 411105, Peoples R China
[2] Xiangnan Univ, Key Lab Med Imaging & Artificial Intelligence Hunan, Chenzhou 423000, Peoples R China
[3] Cornell Univ, Dept Computat Biol, Ithaca, NY 14853 USA
[4] Cornell Univ, Weill Inst Cell & Mol Biol, Ithaca, NY 14853 USA
[5] Fudan Univ, Sch Comp Sci, Shanghai 200438, Peoples R China
Keywords
Attention mechanism; Skin lesion segmentation; Convolutional neural network; Dermoscopy; DERMOSCOPIC IMAGE SEGMENTATION; CONVOLUTIONAL NEURAL-NETWORK; CANCER; MODEL;
DOI
10.1016/j.eswa.2022.117112
CLC number
TP18 [Theory of Artificial Intelligence]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Accurate skin lesion segmentation in dermoscopic images is crucial to the early diagnosis of skin cancers. However, it remains a challenging task due to fuzzy lesion boundaries, irregular lesion shapes, and various interference factors. In this paper, a novel Attention Synergy Network (AS-Net) is developed to enhance the discriminative ability for skin lesion segmentation by combining spatial and channel attention mechanisms. The spatial attention path captures lesion-related features in the spatial dimension, while the channel attention path selectively emphasizes discriminative features in the channel dimension. The synergy module is designed to optimally integrate the spatial and channel information, and a weighted binary cross-entropy loss function is introduced to emphasize the foreground lesion region. Comprehensive experiments indicate that our proposed model achieves state-of-the-art performance with the highest overall score in the ISIC2017 challenge, and outperforms several popular deep neural networks on both the ISIC2018 and PH2 datasets.
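The abstract describes two attention paths fused by a synergy module and trained with a foreground-weighted binary cross-entropy loss. The PyTorch sketch below illustrates that general recipe only; it is not the authors' implementation, and every design detail (SE-style channel gating, a 7x7 spatial-attention convolution, a learned per-channel fusion weight, a foreground weight of 2.0) is an assumption made for illustration.

```python
# Minimal sketch of the ideas in the AS-Net abstract (DOI: 10.1016/j.eswa.2022.117112).
# All module names and hyperparameters here are hypothetical, not the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ChannelAttention(nn.Module):
    """Channel path (SE-style assumption): reweights feature channels."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):                          # x: (B, C, H, W)
        w = self.fc(x.mean(dim=(2, 3)))            # global average pool -> (B, C)
        return x * w.unsqueeze(-1).unsqueeze(-1)   # broadcast channel weights


class SpatialAttention(nn.Module):
    """Spatial path: highlights lesion-related locations with a per-pixel map."""

    def __init__(self):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)

    def forward(self, x):
        avg = x.mean(dim=1, keepdim=True)          # channel-average: (B, 1, H, W)
        mx, _ = x.max(dim=1, keepdim=True)         # channel-max:     (B, 1, H, W)
        a = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * a


class SynergyModule(nn.Module):
    """Fuses the two paths via a learned per-channel convex combination (assumption)."""

    def __init__(self, channels):
        super().__init__()
        self.alpha = nn.Parameter(torch.full((1, channels, 1, 1), 0.5))

    def forward(self, f_spatial, f_channel):
        a = torch.sigmoid(self.alpha)
        return a * f_spatial + (1 - a) * f_channel


def weighted_bce(logits, target, fg_weight=2.0):
    """BCE that up-weights foreground (lesion) pixels; fg_weight is illustrative."""
    # For binary targets: weight = 1 on background, fg_weight on lesion pixels.
    weight = 1.0 + (fg_weight - 1.0) * target
    return F.binary_cross_entropy_with_logits(logits, target, weight=weight)


if __name__ == "__main__":
    feat = torch.randn(2, 64, 32, 32)
    fused = SynergyModule(64)(SpatialAttention()(feat), ChannelAttention(64)(feat))
    print(fused.shape)  # torch.Size([2, 64, 32, 32])
```

In this sketch the fusion weight is learned per channel; the paper's actual synergy module may combine the paths differently, so treat this only as a working template for the spatial/channel/fusion/loss structure the abstract names.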
Pages: 12
Related Papers (50 records)
  • [21] ASCU-Net: Attention Gate, Spatial and Channel Attention U-Net for Skin Lesion Segmentation
    Tong, Xiaozhong; Wei, Junyu; Sun, Bei; Su, Shaojing; Zuo, Zhen; Wu, Peng
    DIAGNOSTICS, 2021, 11 (03)
  • [22] Dense and shuffle attention U-Net for automatic skin lesion segmentation
    Zhang, Guanzhong; Wang, Shengsheng
    INTERNATIONAL JOURNAL OF IMAGING SYSTEMS AND TECHNOLOGY, 2022, 32 (06): 2066-2079
  • [23] Rema-Net: An efficient multi-attention convolutional neural network for rapid skin lesion segmentation
    Yang, Litao; Fan, Chao; Lin, Hao; Qiu, Yingying
    COMPUTERS IN BIOLOGY AND MEDICINE, 2023, 159
  • [24] MHAU-Net: Skin Lesion Segmentation Based on Multi-Scale Hybrid Residual Attention Network
    Li, Yingjie; Xu, Chao; Han, Jubao; An, Ziheng; Wang, Deyu; Ma, Haichao; Liu, Chuanxu
    SENSORS, 2022, 22 (22)
  • [25] MaMfi-Net: Multi-attention and multi-feature interaction network in skin lesion segmentation
    Yang, Litao; Fan, Chao; Lin, Hao; Qiu, Yingying
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2024, 96
  • [26] PCF-Net: Position and context information fusion attention convolutional neural network for skin lesion segmentation
    Jiang, Yun; Dong, Jinkun; Zhang, Yuan; Cheng, Tongtong; Lin, Xin; Liang, Jing
    HELIYON, 2023, 9 (03)
  • [27] MASDF-Net: A Multi-Attention Codec Network with Selective and Dynamic Fusion for Skin Lesion Segmentation
    Fu, Jinghao; Deng, Hongmin
    SENSORS, 2024, 24 (16)
  • [28] ACEANet: Ambiguous Context Enhanced Attention Network for skin lesion segmentation
    Jiang, Yun; Qiao, Hao
    INTELLIGENT DATA ANALYSIS, 2024, 28 (03): 791-805
  • [29] Multi-Conv attention network for skin lesion image segmentation
    Li, Zexin; Wang, Hanchen; Chen, Haoyu; Lin, Chenxin; Yan, Aochen
    FRONTIERS IN PHYSICS, 2024, 12
  • [30] Attention Swin U-Net: Cross-Contextual Attention Mechanism for Skin Lesion Segmentation
    Aghdam, Ehsan Khodapanah; Azad, Reza; Zarvani, Maral; Merhof, Dorit
    2023 IEEE 20TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING (ISBI), 2023