DMAGNet: Dual-path multi-scale attention guided network for medical image segmentation

Times Cited: 0
Authors
Ji, Qiulang [1 ]
Wang, Jihong [1 ,2 ]
Ding, Caifu [1 ]
Wang, Yuhang [1 ]
Zhou, Wen [1 ]
Liu, Zijie [1 ]
Yang, Chen [1 ,2 ]
Affiliations
[1] Guizhou Univ, Coll Big Data & Informat Engn, Power Syst Engn Res Ctr, Minist Educ, Guiyang, Peoples R China
[2] Guizhou Univ, Coll Big Data & Informat Engn, Power Syst Engn Res Ctr, Minist Educ, Guiyang 550025, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
codecs; convolutional neural nets; image processing; image segmentation; CLASSIFICATION; MODEL;
DOI
10.1049/ipr2.12904
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In recent years, convolutional neural network (CNN)-based automatic segmentation of medical images has become a hot topic in clinical disease diagnosis. Improving the segmentation accuracy of network models remains challenging, however, owing to the large variation of pathological regions across patients and the fuzzy boundaries of those regions. A Dual-path Multi-scale Attention Guided network (DMAGNet) for medical image segmentation is proposed in this paper. First, the Dual-path Multi-scale Attention Fusion Module (DMAF) is proposed as a novel skip-connection strategy, applied to encode semantic dependencies between high-level and low-level channels. Second, the Multi-scale Normalized Channel Attention Module (MNCA), built on atrous convolution, a normalized channel attention mechanism, and Depthwise Separable Convolutions (DSConv), is developed to strengthen dependencies between channels. Finally, the encoder-decoder backbone employs DSConv, and pretrained ResNet34 blocks are incorporated into the encoder to further improve backbone performance. Comprehensive experiments on brain, lung, and liver segmentation tasks show that the proposed DMAGNet outperforms the original U-Net and other advanced methods.
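The abstract's choice of Depthwise Separable Convolutions (DSConv) for the backbone is motivated by their much smaller parameter count compared with standard convolutions. The sketch below illustrates that saving with hypothetical channel sizes (not values from the paper); it only counts parameters and does not reproduce the DMAGNet architecture.

```python
# Hedged sketch: parameter-count comparison between a standard convolution
# and a depthwise separable convolution (depthwise k x k per input channel,
# followed by a pointwise 1 x 1 convolution). Channel sizes are hypothetical.

def standard_conv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameters of a standard k x k convolution (biases ignored)."""
    return c_in * c_out * k * k

def dsconv_params(c_in: int, c_out: int, k: int) -> int:
    """Parameters of a depthwise separable convolution (biases ignored)."""
    depthwise = c_in * k * k    # one k x k filter per input channel
    pointwise = c_in * c_out    # 1 x 1 convolution to mix channels
    return depthwise + pointwise

if __name__ == "__main__":
    c_in, c_out, k = 64, 128, 3                  # hypothetical layer sizes
    std = standard_conv_params(c_in, c_out, k)   # 64 * 128 * 9 = 73728
    ds = dsconv_params(c_in, c_out, k)           # 64 * 9 + 64 * 128 = 8768
    print(f"standard: {std}, dsconv: {ds}, ratio: {std / ds:.1f}x")
```

For this hypothetical layer the separable form needs roughly 8x fewer parameters, which is why DSConv-based backbones such as the one described can stay lightweight.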
Pages: 3631-3644
Page count: 14
Related Papers
50 records in total
  • [1] SDPN: A Slight Dual-Path Network With Local-Global Attention Guided for Medical Image Segmentation
    Wang, Jing
    Li, Shuyi
    Yu, Luyue
    Qu, Aixi
    Wang, Qing
    Liu, Ju
    Wu, Qiang
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2023, 27 (06) : 2956 - 2967
  • [2] A Medical Image Segmentation Network with Multi-Scale and Dual-Branch Attention
    Zhu, Cancan
    Cheng, Ke
    Hua, Xuecheng
    APPLIED SCIENCES-BASEL, 2024, 14 (14):
  • [3] Collaborative Attention Guided Multi-Scale Feature Fusion Network for Medical Image Segmentation
    Xu, Zhenghua
    Tian, Biao
    Liu, Shijie
    Wang, Xiangtao
    Yuan, Di
    Gu, Junhua
    Chen, Junyang
    Lukasiewicz, Thomas
    Leung, Victor C. M.
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2024, 11 (02): : 1857 - 1871
  • [5] MCDALNet: Multi-scale Contextual Dual Attention Learning Network for Medical Image Segmentation
    Guo, Pengcheng
    Su, Xiangdong
    Zhang, Haoran
    Bao, Feilong
    2021 INTERNATIONAL JOINT CONFERENCE ON NEURAL NETWORKS (IJCNN), 2021,
  • [6] Multi-Scale Self-Guided Attention for Medical Image Segmentation
    Sinha, Ashish
    Dolz, Jose
    IEEE JOURNAL OF BIOMEDICAL AND HEALTH INFORMATICS, 2021, 25 (01) : 121 - 130
  • [7] DS-MSFF-Net: Dual-path self-attention multi-scale feature fusion network for CT image segmentation
    Zhang, Xiaoqian
    Pu, Lei
    Wan, Liming
    Wang, Xiao
    Zhou, Ying
    APPLIED INTELLIGENCE, 2024, 54 (06) : 4490 - 4506
  • [9] Dual-path multi-scale context dense aggregation network for retinal vessel segmentation
    Zhou, Wei
    Bai, Weiqi
    Ji, Jianhang
    Yi, Yugen
    Zhang, Ningyi
    Cui, Wei
    COMPUTERS IN BIOLOGY AND MEDICINE, 2023, 164
  • [10] MSAANet: Multi-scale Axial Attention Network for medical image segmentation
    Zeng, Hao
    Shan, Xinxin
    Feng, Yu
    Wen, Ying
    2023 IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, ICME, 2023, : 2291 - 2296