M U-Net: Intestine Segmentation Using Multi-dimensional Features for Ileus Diagnosis Assistance

Cited by: 0
Authors
An, Qin [1 ]
Oda, Hirohisa [2 ]
Hayashi, Yuichiro [1 ]
Kitasaka, Takayuki [3 ]
Hinoki, Akinari [4 ]
Uchida, Hiroo [4 ]
Suzuki, Kojiro [5 ]
Takimoto, Aitaro [4 ]
Oda, Masahiro [1 ,6 ]
Mori, Kensaku [1 ,7 ,8 ]
Affiliations
[1] Nagoya Univ, Grad Sch Informat, Nagoya, Aichi, Japan
[2] Univ Shizuoka, Sch Management & Informat, Shizuoka, Japan
[3] Aichi Inst Technol, Sch Informat Sci, Toyota, Japan
[4] Nagoya Univ, Grad Sch Med, Nagoya, Aichi, Japan
[5] Aichi Med Univ, Dept Radiol, Toyota, Japan
[6] Nagoya Univ, Strategy Off Informat & Communicat, Nagoya, Aichi, Japan
[7] Nagoya Univ, Ctr Informat Technol, Nagoya, Aichi, Japan
[8] Natl Inst Informat, Res Ctr Med Bigdata, Tokyo, Japan
Source
APPLICATIONS OF MEDICAL ARTIFICIAL INTELLIGENCE, AMAI 2023 | 2024 / Vol. 14313
Keywords
Intestine segmentation; Ileus; Computer-aided diagnosis; Sparse label
DOI
10.1007/978-3-031-47076-9_14
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
The intestine is an essential digestive organ, and intestinal disease can cause serious health problems. This paper proposes an intestine segmentation method to assist in the diagnosis of intestinal obstruction (ileus), called the multi-dimensional U-Net (M U-Net). We employ two encoders to extract features from two-dimensional (2D) CT slices and three-dimensional (3D) CT patches; the two encoders collaborate to enhance the segmentation accuracy of the model. Additionally, we incorporate deep supervision into the M U-Net to mitigate the limitations of training with sparsely labeled data sets. Experimental results demonstrated that the proposed method achieved a Dice score of 73.22%, a recall of 79.89%, and a precision of 70.61%.
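The dual-encoder design described in the abstract can be sketched as follows. This is a minimal illustration in PyTorch under assumed details (channel concatenation for fusing the 2D and 3D features, depth-averaging of the 3D feature map, and a 1x1 auxiliary head standing in for deep supervision); it is not the authors' actual M U-Net architecture, whose layer configuration is given in the paper itself.

```python
import torch
import torch.nn as nn

class DualEncoderSketch(nn.Module):
    """Illustrative two-encoder segmentation network: a 2D encoder for CT
    slices and a 3D encoder for CT patches, fused before a shared decoder,
    with an auxiliary output head in the spirit of deep supervision."""

    def __init__(self, ch=8):
        super().__init__()
        # 2D encoder operates on individual CT slices
        self.enc2d = nn.Sequential(nn.Conv2d(1, ch, 3, padding=1), nn.ReLU())
        # 3D encoder operates on small CT volume patches
        self.enc3d = nn.Sequential(nn.Conv3d(1, ch, 3, padding=1), nn.ReLU())
        # decoder consumes the concatenated (fused) features
        self.decoder = nn.Conv2d(2 * ch, 1, 1)
        # auxiliary head on the 2D features (deep-supervision stand-in)
        self.aux_head = nn.Conv2d(ch, 1, 1)

    def forward(self, slice2d, patch3d):
        f2d = self.enc2d(slice2d)              # (B, ch, H, W)
        f3d = self.enc3d(patch3d).mean(dim=2)  # collapse depth -> (B, ch, H, W)
        fused = torch.cat([f2d, f3d], dim=1)   # channel-wise fusion
        return self.decoder(fused), self.aux_head(f2d)

net = DualEncoderSketch()
main_out, aux_out = net(torch.randn(1, 1, 32, 32),      # one 2D slice
                        torch.randn(1, 1, 4, 32, 32))   # one 3D patch
```

In training, a loss would be applied to both `main_out` and `aux_out`, which is how deep supervision injects gradient signal at intermediate features when labels are sparse.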
Pages: 135-144
Page count: 10