Denoising of motion artifacted MRI scans using conditional generative adversarial network

Cited by: 2
Authors
Tripathi, Vijay R. [1]
Tibdewal, Manish N. [1,2]
Mishra, Ravi [1,3]
Affiliations
[1] GH Raisoni Univ, Amaravati 444701, India
[2] Shri St Gajanan Maharaj Coll Engn, Shegoan, India
[3] GH Raisoni Inst Engn & Technol, Nagpur, India
Keywords
CGAN; Motion artifacted images; Pix2Pix
DOI
10.1007/s11042-023-15705-2
CLC classification
TP [Automation technology, computer technology]
Subject classification code
0812
Abstract
Patient motion causes image distortion during Magnetic Resonance Imaging (MRI) acquisition. These motion-artifact-induced MRI scans are difficult to read and can lead to a faulty diagnosis. The simplest way to remove the artifacts in motion-blurred scans is to re-scan the patient, but re-scanning is costly, time-consuming, and does not guarantee a successful scan, because the patient can still move involuntarily. Correcting motion-artifact-induced images is therefore an important task in the medical imaging domain. Here we have modified a well-known conditional Generative Adversarial Network, Pix2Pix, to remove motion artifacts from MRI scans. We modified the structure of the original network and fine-tuned its parameters with the help of a database that we created. The database, collected from a local hospital, consists of 436 images comprising motion-artifact-induced scans and their corresponding artifact-free scans obtained by re-scanning the patients. The proposed method achieves an RMSE of 0.004 and a PSNR of 27.97 dB, with accuracy greater than 96%. We expect that this method will save radiologists the time and cost of re-scanning and will ultimately help doctors in diagnosis.
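The abstract reports the restoration quality as RMSE and PSNR. As an illustrative sketch only (not the authors' evaluation code), the two metrics are directly related by PSNR = 20·log10(MAX / RMSE); assuming pixel intensities normalized to [0, 1], they can be computed in pure Python as:

```python
import math

def rmse(reference, restored):
    """Root-mean-square error between two equal-length pixel sequences."""
    n = len(reference)
    return math.sqrt(sum((r - s) ** 2 for r, s in zip(reference, restored)) / n)

def psnr(reference, restored, max_val=1.0):
    """Peak signal-to-noise ratio in dB, derived from the RMSE."""
    err = rmse(reference, restored)
    return float("inf") if err == 0 else 20.0 * math.log10(max_val / err)

# Toy flattened "images": a clean scan and a slightly corrupted restoration.
clean = [0.1, 0.5, 0.9, 0.3]
restored = [0.1, 0.52, 0.88, 0.3]

print(round(rmse(clean, restored), 4))  # → 0.0141
print(round(psnr(clean, restored), 2))  # → 36.99
```

A real evaluation would flatten each 2-D scan into such a sequence (or use a vectorized library); the definitions of the metrics are unchanged.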
Pages: 11923-11941 (19 pages)
Related Papers
50 total
  • [21] Digital radiography image denoising using a generative adversarial network
    Sun, Yuewen
    Liu, Ximing
    Cong, Peng
    Li, Litao
    Zhao, Zhongwei
    JOURNAL OF X-RAY SCIENCE AND TECHNOLOGY, 2018, 26 (04) : 523 - 534
  • [22] A Capsule Conditional Generative Adversarial Network
    Chang, Jieh-Ren
    Chen, You-Shyang
    Bao, Yipeng
    Hsu, Tzu-Lin
    2020 25TH INTERNATIONAL CONFERENCE ON TECHNOLOGIES AND APPLICATIONS OF ARTIFICIAL INTELLIGENCE (TAAI 2020), 2020, : 175 - 180
  • [23] A New ECG Denoising Framework Using Generative Adversarial Network
    Singh, Pratik
    Pradhan, Gayadhar
    IEEE-ACM TRANSACTIONS ON COMPUTATIONAL BIOLOGY AND BIOINFORMATICS, 2021, 18 (02) : 759 - 764
  • [24] Denoising Optical Coherence Tomography Images Using Conditional Generative Adversarial Networks
    Hu, Dewei
    Atay, Yigit
    Malone, Joseph
    Tao, Yuankai
    Oguz, Ipek
    INVESTIGATIVE OPHTHALMOLOGY & VISUAL SCIENCE, 2020, 61 (07)
  • [25] Ground motion model for acceleration response spectra using conditional-generative adversarial network
    Neelamraju, Pavan Mohan
    Basu, Jahnabi
    Raghukanth, S. T. G.
    Natural Hazards, 2025, 121 (4) : 4865 - 4900
  • [26] Brain MRI motion artifact reduction using 3D conditional generative adversarial networks on simulated motion
    Ghaffari, Mina
    Pawar, Kamlesh
    Oliver, Ruth
    2021 INTERNATIONAL CONFERENCE ON DIGITAL IMAGE COMPUTING: TECHNIQUES AND APPLICATIONS (DICTA 2021), 2021, : 253 - 259
  • [27] Modeling and Simulation of Sidescan Using Conditional Generative Adversarial Network
    Bore, Nils
    Folkesson, John
    IEEE JOURNAL OF OCEANIC ENGINEERING, 2021, 46 (01) : 195 - 205
  • [28] Using a Conditional Generative Adversarial Network (cGAN) for Prostate Segmentation
    Grall, Amelie
    Hamidinekoo, Azam
    Malcolm, Paul
    Zwiggelaar, Reyer
    MEDICAL IMAGE UNDERSTANDING AND ANALYSIS, MIUA 2019, 2020, 1065 : 15 - 25
  • [29] Seismic Impedance Inversion Using Conditional Generative Adversarial Network
    Meng, Delin
    Wu, Bangyu
    Wang, Zhiguo
    Zhu, Zhaolin
    IEEE GEOSCIENCE AND REMOTE SENSING LETTERS, 2022, 19