Focus U-Net: A novel dual attention-gated CNN for polyp segmentation during colonoscopy

Cited by: 79
Authors
Yeung, Michael [1 ,2 ]
Sala, Evis [1 ,3 ]
Schönlieb, Carola-Bibiane [4]
Rundo, Leonardo [1 ,3 ]
Affiliations
[1] Univ Cambridge, Dept Radiol, Box 218,Cambridge Biomed Campus, Cambridge CB2 0QQ, England
[2] Univ Cambridge, Sch Clin Med, Cambridge CB2 0SP, England
[3] Univ Cambridge, Canc Res UK Cambridge Ctr, Cambridge CB2 0RE, England
[4] Univ Cambridge, Dept Appl Math & Theoret Phys, Cambridge CB3 0WA, England
Funding
UK Engineering and Physical Sciences Research Council (EPSRC); Wellcome Trust; European Union Horizon 2020; UK Science and Technology Facilities Council (STFC);
Keywords
Polyp segmentation; Colorectal cancer; Colonoscopy; Computer-aided diagnosis; Focus U-Net; Attention mechanisms; Loss function; COLORECTAL-CANCER; MISS RATE; NETWORKS; RISK;
DOI
10.1016/j.compbiomed.2021.104815
CLC Number
Q [Biological Sciences];
Subject Classification Codes
07; 0710; 09;
Abstract
Background: Colonoscopy remains the gold-standard screening for colorectal cancer. However, significant miss rates for polyps have been reported, particularly when there are multiple small adenomas. This presents an opportunity to leverage computer-aided systems to support clinicians and reduce the number of polyps missed.
Method: In this work, we introduce the Focus U-Net, a novel dual attention-gated deep neural network, which combines efficient spatial and channel-based attention into a single Focus Gate module to encourage selective learning of polyp features. The Focus U-Net incorporates several further architectural modifications, including the addition of short-range skip connections and deep supervision. Furthermore, we introduce the Hybrid Focal loss, a new compound loss function based on the Focal loss and Focal Tversky loss, designed to handle class-imbalanced image segmentation. For our experiments, we selected five public datasets containing images of polyps obtained during optical colonoscopy: CVC-ClinicDB, Kvasir-SEG, CVC-ColonDB, ETIS-Larib PolypDB and the EndoScene test set. We first perform a series of ablation studies and then evaluate the Focus U-Net on the CVC-ClinicDB and Kvasir-SEG datasets separately, and on a combined dataset of all five public datasets. To evaluate model performance, we use the Dice similarity coefficient (DSC) and Intersection over Union (IoU) metrics.
Results: Our model achieves state-of-the-art results on both CVC-ClinicDB and Kvasir-SEG, with a mean DSC of 0.941 and 0.910, respectively. When evaluated on a combination of the five public polyp datasets, our model similarly achieves state-of-the-art results, with a mean DSC of 0.878 and mean IoU of 0.809, a 14% and 15% improvement over the previous state-of-the-art results of 0.768 and 0.702, respectively.
Conclusions: This study shows the potential for deep learning to provide fast and accurate polyp segmentation results for use during colonoscopy. The Focus U-Net may be adapted for future use in newer non-invasive colorectal cancer screening and, more broadly, for other biomedical image segmentation tasks similarly involving class imbalance and requiring efficiency.
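The abstract describes the Hybrid Focal loss as a compound of the Focal loss and the Focal Tversky loss, and reports the Dice similarity coefficient (DSC) and Intersection over Union (IoU) as evaluation metrics. Below is a minimal PyTorch sketch of such a compound loss and of the two metrics; the function names, the default parameters (alpha, beta, gamma) and the weighting term lam are illustrative assumptions and are not taken from the paper.

```python
import torch


def focal_loss(pred, target, alpha=0.25, gamma=2.0, eps=1e-7):
    """Binary focal loss: down-weights well-classified pixels via (1 - p_t)**gamma."""
    pred = pred.clamp(eps, 1.0 - eps)
    p_t = torch.where(target == 1, pred, 1.0 - pred)
    alpha_t = torch.where(target == 1,
                          torch.full_like(pred, alpha),
                          torch.full_like(pred, 1.0 - alpha))
    return (-alpha_t * (1.0 - p_t) ** gamma * torch.log(p_t)).mean()


def focal_tversky_loss(pred, target, alpha=0.7, beta=0.3, gamma=0.75, eps=1e-7):
    """Focal Tversky loss: (1 - Tversky index)**gamma, with alpha/beta weighting FN/FP."""
    tp = (pred * target).sum()
    fn = ((1.0 - pred) * target).sum()
    fp = (pred * (1.0 - target)).sum()
    tversky = (tp + eps) / (tp + alpha * fn + beta * fp + eps)
    return (1.0 - tversky) ** gamma


def hybrid_focal_loss(pred, target, lam=0.5):
    """Weighted sum of a pixel-level (focal) and a region-level (focal Tversky) term."""
    return lam * focal_loss(pred, target) + (1.0 - lam) * focal_tversky_loss(pred, target)


def dice_and_iou(pred_mask, target_mask, eps=1e-7):
    """DSC and IoU for binarised masks, as used for evaluation in the abstract."""
    intersection = (pred_mask * target_mask).sum()
    total = pred_mask.sum() + target_mask.sum()
    dsc = (2.0 * intersection + eps) / (total + eps)
    iou = (intersection + eps) / (total - intersection + eps)
    return dsc, iou


if __name__ == "__main__":
    # Sigmoid outputs of a segmentation model and a sparse (class-imbalanced) ground truth.
    pred = torch.rand(2, 1, 256, 256)
    target = (torch.rand(2, 1, 256, 256) > 0.9).float()
    print("hybrid focal loss:", hybrid_focal_loss(pred, target).item())
    dsc, iou = dice_and_iou((pred > 0.5).float(), target)
    print("DSC / IoU:", dsc.item(), iou.item())
```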
Pages: 11
Related Papers
50 records in total
  • [41] DRA U-Net: An Attention based U-Net Framework for 2D Medical Image Segmentation
    Zhang, Xian
    Feng, Ziyuan
    Zhong, Tianchi
    Shen, Sicheng
    Zhang, Ruolin
    Zhou, Lijie
    Zhang, Bo
    Wang, Wendong
    2021 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2021, : 3936 - 3942
  • [42] U-Net CSF Cells Segmentation Based on Attention Mechanism
    Dai, Yin
    Liu, Wei-Bin
    Dong, Xin-Yang
    Song, Yu-Meng
Dongbei Daxue Xuebao/Journal of Northeastern University, 2022, 43 (07): 944 - 950
  • [43] Allergy Wheal and Erythema Segmentation Using Attention U-Net
    Lee, Yul Hee
    Shim, Ji-Su
    Kim, Young Jae
    Jeon, Ji Soo
    Kang, Sung-Yoon
    Lee, Sang Pyo
    Lee, Sang Min
    Kim, Kwang Gi
    JOURNAL OF IMAGING INFORMATICS IN MEDICINE, 2024, : 467 - 475
  • [44] Attention Convolutional U-Net for Automatic Liver Tumor Segmentation
    Bibi, Asima
    Khan, Muhammad Salman
    2021 INTERNATIONAL CONFERENCE ON FRONTIERS OF INFORMATION TECHNOLOGY (FIT 2021), 2021, : 102 - 107
  • [45] Segmentation of Mammogram Images Using U-Net with Fusion of Channel and Spatial Attention Modules (U-Net CASAM)
    Robert Singh, A.
    Vidya, S.
    Hariharasitaraman, S.
    Athisayamani, Suganya
    Hsu, Fang Rong
    Lecture Notes in Networks and Systems, 2024, 966 LNNS : 435 - 448
  • [46] Evaluation of U-Net CNN Approaches for Human Neck MRI Segmentation
    Al Suman, Abdulla
    Khemchandani, Yash
    Asikuzzaman, Md
    Webb, Alexandra Louise
    Perriman, Diana M.
    Tahtali, Murat
    Pickering, Mark R.
    2020 DIGITAL IMAGE COMPUTING: TECHNIQUES AND APPLICATIONS (DICTA), 2020,
  • [47] Study on Echocardiographic Image Segmentation Based on Attention U-Net
    Wang, Kai
    Zhang, Jiwei
    Hachiya, Hirotaka
    Wu, Haiyuan
    PROCEEDINGS OF 2022 IEEE INTERNATIONAL CONFERENCE ON MECHATRONICS AND AUTOMATION (IEEE ICMA 2022), 2022, : 1091 - 1096
  • [48] Brain Tumor Segmentation with Attention-based U-Net
    Li, Tuofu
    Liu, Javin Jia
    Tai, Yintao
    Tian, Yuxuan
    SECOND IYSF ACADEMIC SYMPOSIUM ON ARTIFICIAL INTELLIGENCE AND COMPUTER ENGINEERING, 2021, 12079
  • [49] Fabric pilling image segmentation by embedding dual-attention mechanism U-Net network
    Yan, Yu
    Tan, Yanjun
    Gao, Pengfu
    Yu, Qiuyu
    Deng, Yuntao
    TEXTILE RESEARCH JOURNAL, 2024, 94 (21-22) : 2434 - 2444
  • [50] AResU-Net: Attention Residual U-Net for Brain Tumor Segmentation
    Zhang, Jianxin
    Lv, Xiaogang
    Zhang, Hengbo
    Liu, Bin
    SYMMETRY-BASEL, 2020, 12 (05):