DARMF-UNet: A dual-branch attention-guided refinement network with multi-scale features fusion U-Net for gland segmentation

Cited by: 2
Authors
Sun, Junmei [1 ]
Zhang, Xin [1 ]
Li, Xiumei [1 ]
Liu, Ruyu [1 ]
Wang, Tianyang [1 ]
Affiliations
[1] Hangzhou Normal Univ, Sch Informat Sci & Technol, Hangzhou, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Gland segmentation; Deep supervision; Multi-scale fusion network; Coordinate parallel attention mechanism; Dense atrous convolution block; ARCHITECTURE; CONNECTIONS; IMAGES; MODEL;
DOI
10.1016/j.compbiomed.2023.107218
CLC number
Q [Biological Sciences]
Subject classification
07; 0710; 09
Abstract
Accurate gland segmentation is critical in determining adenocarcinoma. Current automatic gland segmentation methods suffer from challenges such as inaccurate edge segmentation, frequent mis-segmentation, and incomplete segmentation. To address these problems, this paper proposes a novel gland segmentation network, the Dual-branch Attention-guided Refinement and Multi-scale Features Fusion U-Net (DARMF-UNet), which fuses multi-scale features using deep supervision. At the first three layers of feature concatenation, a Coordinate Parallel Attention (CPA) module is proposed to guide the network to focus on key regions. A Dense Atrous Convolution (DAC) block is used at the fourth layer of feature concatenation to perform multi-scale feature extraction and capture global information. A hybrid loss function computes the loss of each segmentation output of the network to achieve deep supervision and improve segmentation accuracy. Finally, the segmentation results at different scales in each part of the network are fused to obtain the final gland segmentation result. Experimental results on the gland datasets Warwick-QU and CRAG show that the network improves on the F1 score, object Dice, and object Hausdorff evaluation metrics, and that its segmentation quality surpasses state-of-the-art network models.
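The abstract does not give the exact formulation of the hybrid loss or the deep-supervision scheme, but a common realization of both ideas combines a soft Dice loss with binary cross-entropy and sums that hybrid term over every side output of the decoder. The sketch below illustrates this in plain NumPy; the function names, the equal default weighting, and the Dice+BCE composition are assumptions for illustration, not the paper's verified implementation.

```python
import numpy as np

def dice_loss(pred, target, eps=1e-6):
    # Soft Dice loss on probability maps: 1 - (2|P∩T|) / (|P| + |T|).
    inter = (pred * target).sum()
    return 1.0 - (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def bce_loss(pred, target, eps=1e-7):
    # Binary cross-entropy; clip probabilities to avoid log(0).
    p = np.clip(pred, eps, 1.0 - eps)
    return -(target * np.log(p) + (1.0 - target) * np.log(1.0 - p)).mean()

def hybrid_deep_supervision_loss(side_outputs, target, weights=None):
    """Sum a hybrid (Dice + BCE) loss over every side output.

    side_outputs: list of predicted probability maps, one per decoder
    scale, each already upsampled to the target's resolution, so that
    every intermediate segmentation result is directly supervised.
    """
    if weights is None:
        weights = [1.0] * len(side_outputs)
    return sum(w * (dice_loss(p, target) + bce_loss(p, target))
               for w, p in zip(weights, side_outputs))
```

Supervising every scale in this way gives each decoder stage its own gradient signal, which is what lets the final fusion of multi-scale segmentation results remain sharp at gland boundaries.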
Pages: 12
Related papers
50 records in total
  • [1] DAG-Net: Dual-Branch Attention-Guided Network for Multi-Scale Information Fusion in Lung Nodule Segmentation
    Zhang, Bojie; Zhu, Hongqing; Wang, Ziying; Luo, Lan; Yu, Yang
    [J]. International Journal of Imaging Systems and Technology, 2024, 34 (06)
  • [2] A Medical Image Segmentation Network with Multi-Scale and Dual-Branch Attention
    Zhu, Cancan; Cheng, Ke; Hua, Xuecheng
    [J]. Applied Sciences-Basel, 2024, 14 (14)
  • [3] An attention-guided multi-scale fusion network for surgical instrument segmentation
    Song, Mengqiu; Zhai, Chenxu; Yang, Lei; Liu, Yanhong; Bian, Guibin
    [J]. Biomedical Signal Processing and Control, 2025, 102
  • [4] Attention-Guided Multi-Scale Fusion Network for Similar Objects Semantic Segmentation
    Yao, Fengqin; Wang, Shengke; Ding, Laihui; Zhong, Guoqiang; Li, Shu; Xu, Zhiwei
    [J]. Cognitive Computation, 2024, 16 (01) : 366 - 376
  • [5] MDA-Unet: A Multi-Scale Dilated Attention U-Net for Medical Image Segmentation
    Amer, Alyaa; Lambrou, Tryphon; Ye, Xujiong
    [J]. Applied Sciences-Basel, 2022, 12 (07)
  • [6] DI-UNet: dual-branch interactive U-Net for skin cancer image segmentation
    Yin, Wen; Zhou, Dongming; Nie, Rencan
    [J]. Journal of Cancer Research and Clinical Oncology, 2023, 149 (17) : 15511 - 15524
  • [7] Multi-Scale Fusion U-Net for the Segmentation of Breast Lesions
    Li, Jingyao; Cheng, Lianglun; Xia, Tingjian; Ni, Haomin; Li, Jiao
    [J]. IEEE Access, 2021, 9 : 137125 - 137139
  • [8] Double-branch U-Net for multi-scale organ segmentation
    Liu, Yuhao; Qin, Caijie; Yu, Zhiqian; Yang, Ruijie; Tian, Suqing; Liu, Xia; Ma, Xibo
    [J]. Methods, 2022, 205 : 220 - 225