Real-time deep neural network-based automatic bowel gas segmentation on X-ray images for particle beam treatment

Cited by: 1
Authors
Kumakiri, Toshio [1 ,2 ]
Mori, Shinichiro [1 ,5 ]
Mori, Yasukuni [3 ]
Hirai, Ryusuke [2 ]
Hashimoto, Ayato [1 ,2 ]
Tachibana, Yasuhiko [1 ]
Suyari, Hiroki [3 ]
Ishikawa, Hitoshi [4 ]
Affiliations
[1] Natl Inst Quantum Sci & Technol, Inst Quantum Med Sci, Inage ku, Chiba 2638555, Japan
[2] Chiba Univ, Grad Sch Sci & Engn, Inage ku, Chiba 2638522, Japan
[3] Chiba Univ, Grad Sch Engn, Inage ku, Chiba 2638522, Japan
[4] Natl Inst Quantum Sci & Technol, QST Hosp, Inage ku, Chiba 2638555, Japan
[5] Natl Inst Radiol Sci, Res Ctr Charged Particle Therapy, Inage ku, Chiba 2638555, Japan
Keywords
Deep neural network; Image segmentation; Patient setup; Particle beam therapy; Bowel gas; CARBON ION RADIOTHERAPY; PROSTATE-CANCER;
DOI
10.1007/s13246-023-01240-9
Chinese Library Classification: R318 [Biomedical Engineering]
Discipline code: 0831
Abstract
Because particle beam dose distributions are sensitive to changes in bowel gas owing to its low density, we developed a deep neural network (DNN) for bowel gas segmentation on X-ray images. We used 6688 image datasets from 209 cases as training data, 736 image datasets from 23 cases as validation data, and 102 image datasets from 51 cases as test data (283 cases in total). For the training data, we prepared three types of digitally reconstructed radiographic (DRR) images (all-density, bone, and gas) by projecting the treatment planning CT image data. However, the real X-ray images acquired in the treatment room showed low contrast, which interfered with manual delineation of bowel gas; we therefore used synthetic X-ray images converted from DRR images in addition to real X-ray images. We evaluated DNN segmentation accuracy on the synthetic X-ray images using Intersection over Union (IoU), recall, precision, and the Dice coefficient, which measured 0.708 +/- 0.208, 0.832 +/- 0.170, 0.799 +/- 0.191, and 0.807 +/- 0.178, respectively. The evaluation metrics for the real X-ray images were less accurate than those for the synthetic X-ray images (0.408 +/- 0.237, 0.685 +/- 0.326, 0.490 +/- 0.272, and 0.534 +/- 0.271, respectively). Computation time was 29.7 +/- 1.3 ms/image. Our DNN appears useful for increasing treatment accuracy in particle beam therapy.
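The abstract reports four standard overlap metrics (IoU, recall, precision, Dice) between predicted and ground-truth masks. As an illustration only (this is not the authors' code), the metrics can be computed from binary segmentation masks as follows:

```python
import numpy as np

def segmentation_metrics(pred, truth):
    """Compute IoU, recall, precision, and Dice for two binary masks.

    pred, truth: NumPy arrays of the same shape holding the predicted
    and ground-truth bowel-gas masks (nonzero = gas).
    """
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    tp = np.logical_and(pred, truth).sum()    # true positives
    fp = np.logical_and(pred, ~truth).sum()   # false positives
    fn = np.logical_and(~pred, truth).sum()   # false negatives
    iou = tp / (tp + fp + fn)
    recall = tp / (tp + fn)
    precision = tp / (tp + fp)
    dice = 2 * tp / (2 * tp + fp + fn)
    return iou, recall, precision, dice

# Toy example: two 4x4 masks whose foreground rows overlap in 2 of 3 pixels
pred = np.zeros((4, 4), dtype=bool); pred[0, 0:3] = True
truth = np.zeros((4, 4), dtype=bool); truth[0, 1:4] = True
print(segmentation_metrics(pred, truth))  # IoU 0.5, recall 2/3, precision 2/3, Dice 2/3
```

In the toy example, tp = 2, fp = 1, fn = 1, giving IoU = 2/4 and Dice = 4/6; the paper reports these statistics averaged over test images (mean +/- SD).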
Pages: 659-668 (10 pages)
Related papers (50 in total)
  • [21] Deep Scatter Estimation (DSE): Feasibility of Using a Deep Convolutional Neural Network for Real-Time X-Ray Scatter Prediction in Cone-Beam CT
    Maier, Joscha
    Berker, Yannick
    Sawall, Stefan
    Kachelriess, Marc
    MEDICAL IMAGING 2018: PHYSICS OF MEDICAL IMAGING, 2018, 10573
  • [22] Deep Neural Network for Lung Image Segmentation on Chest X-ray
    Chavan, Mahesh
    Varadarajan, Vijayakumar
    Gite, Shilpa
    Kotecha, Ketan
    TECHNOLOGIES, 2022, 10 (05)
  • [23] Automatic Segmentation of Bones in X-ray Images Based on Entropy Measure
    Bandyopadhyay, Oishila
    Chanda, Bhabatosh
    Bhattacharya, Bhargab B.
    INTERNATIONAL JOURNAL OF IMAGE AND GRAPHICS, 2016, 16 (01)
  • [24] Region-Based Convolutional Neural Network-Based Spine Model Positioning of X-Ray Images
    Zhang, Le
    Zhang, Jiabao
    Gao, Song
    BIOMED RESEARCH INTERNATIONAL, 2022, 2022
  • [25] X-ray projector produces real-time TV images
    Spiers, S.
    INDUSTRIAL RESEARCH & DEVELOPMENT, 1981, 23 (09): 154 - 157
  • [26] Noise reduction in real-time X-ray images
    Tsuda, M.
    Kimura, Y.
    JAPANESE JOURNAL OF APPLIED PHYSICS PART 1-REGULAR PAPERS SHORT NOTES & REVIEW PAPERS, 1986, 25 (06): 891 - 901
  • [27] Computer evaluation of real-time X-ray and acoustic images
    Jacoby, M. H.
    Loe, R. S.
    Dondes, P. A.
    PROCEEDINGS OF THE SOCIETY OF PHOTO-OPTICAL INSTRUMENTATION ENGINEERS, 1982, 359: 273 - 277
  • [28] Real-time in-situ X-ray beam diagnostics
    Kachatkou, Anton
    Kyele, Nicholas
    Scott, Peter
    van Silfhout, Roelof
    11TH INTERNATIONAL CONFERENCE ON SYNCHROTRON RADIATION INSTRUMENTATION (SRI 2012), 2013, 425
  • [30] Real-time automatic surgical phase recognition in laparoscopic sigmoidectomy using the convolutional neural network-based deep learning approach
    Kitaguchi, Daichi
    Takeshita, Nobuyoshi
    Matsuzaki, Hiroki
    Takano, Hiroaki
    Owada, Yohei
    Enomoto, Tsuyoshi
    Oda, Tatsuya
    Miura, Hirohisa
    Yamanashi, Takahiro
    Watanabe, Masahiko
    Sato, Daisuke
    Sugomori, Yusuke
    Hara, Seigo
    Ito, Masaaki
    SURGICAL ENDOSCOPY AND OTHER INTERVENTIONAL TECHNIQUES, 2020, 34 (11): 4924 - 4931