Quantitative susceptibility mapping using deep neural network: QSMnet

Cited by: 117
Authors:
Yoon, Jaeyeon [1]
Gong, Enhao [2,3]
Chatnuntawech, Itthi [4]
Bilgic, Berkin [5]
Lee, Jingu [1]
Jung, Woojin [1]
Ko, Jingyu [1]
Jung, Hosan [1]
Setsompop, Kawin [5]
Zaharchuk, Greg [3]
Kim, Eung Yeop [6]
Pauly, John [2]
Lee, Jongho [1]
Affiliations:
[1] Seoul Natl Univ, Dept Elect & Comp Engn, Lab Imaging Sci & Technol, Seoul, South Korea
[2] Stanford Univ, Dept Elect Engn, Stanford, CA 94305 USA
[3] Stanford Univ, Dept Radiol, Stanford, CA 94305 USA
[4] Natl Nanotechnol Ctr, Pathum Thani, Thailand
[5] Harvard Med Sch, Dept Radiol, Boston, MA USA
[6] Gachon Univ, Gil Med Ctr, Dept Radiol, Coll Med, Incheon, South Korea
Funding:
National Research Foundation of Singapore
Keywords:
QSM; Machine learning; Reconstruction; Magnetic susceptibility; Dipole; MRI; ENABLED DIPOLE INVERSION; BRAIN; ORIENTATION; MRI; IMAGE; RECONSTRUCTION; CONTRAST; ROBUST; FIELD; OPTIMIZATION;
DOI: 10.1016/j.neuroimage.2018.06.030
CLC Number: Q189 [Neuroscience]
Subject Classification Code: 071006
Abstract:
Deep neural networks have demonstrated promising potential in medical image reconstruction, successfully generating high quality images for CT, PET, and MRI. In this work, an MRI reconstruction algorithm for quantitative susceptibility mapping (QSM) has been developed using a deep neural network to perform dipole deconvolution, which restores the magnetic susceptibility source from an MRI field map. Previous approaches to QSM require either multiple-orientation data (e.g., Calculation of Susceptibility through Multiple Orientation Sampling, or COSMOS) or regularization terms (e.g., Truncated K-space Division, or TKD; Morphology Enabled Dipole Inversion, or MEDI) to solve an ill-conditioned dipole deconvolution problem. Unfortunately, these methods either entail challenges in data acquisition (i.e., long scan times and multiple head orientations) or suffer from image artifacts. To overcome these shortcomings, a deep neural network, referred to as QSMnet, is constructed to generate a high quality susceptibility map from single-orientation data. The network has a modified U-net structure and is trained using COSMOS QSM maps, which are considered the gold standard. Five head-orientation datasets from five subjects were used for patch-wise network training after the training data were doubled by model-based data augmentation. Seven additional datasets of five head-orientation images each (35 images in total) were used for validation (one dataset) and testing (six datasets). The QSMnet maps of the test datasets were compared with the maps from TKD and MEDI for image quality and for consistency across multiple head orientations. Quantitative and qualitative comparisons demonstrate that the QSMnet results have superior image quality to those of TKD or MEDI and comparable image quality to those of COSMOS. Additionally, the QSMnet maps show substantially better consistency across the multiple head-orientation data than those from TKD or MEDI. As a preliminary application, the network was further tested on three patients: one with a microbleed, one with multiple sclerosis lesions, and one with hemorrhage. The QSMnet maps showed lesion contrasts similar to those from MEDI, demonstrating potential for future applications.
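The dipole deconvolution described in the abstract can be made concrete with the standard k-space forward model that underlies COSMOS, TKD, and MEDI: the measured tissue field equals the susceptibility map multiplied, in k-space, by a dipole kernel that vanishes on a conical surface, which is what makes single-orientation inversion ill-conditioned. The NumPy sketch below is a minimal illustration of that model and of TKD, not the authors' implementation; the function names, the unit voxel size, and the 0.19 truncation threshold are assumptions made for this example.

```python
import numpy as np

def dipole_kernel(shape, voxel_size=(1.0, 1.0, 1.0), b0_dir=(0.0, 0.0, 1.0)):
    """k-space dipole kernel D(k) = 1/3 - (k . b0)^2 / |k|^2, with D(0) set to 0."""
    axes = [np.fft.fftfreq(n, d=d) for n, d in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*axes, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k_dot_b = kx * b0_dir[0] + ky * b0_dir[1] + kz * b0_dir[2]
    with np.errstate(divide="ignore", invalid="ignore"):
        D = 1.0 / 3.0 - (k_dot_b**2) / k2
    D[k2 == 0] = 0.0  # remove the undefined DC term
    return D

def forward_field(chi, D):
    """Forward model: tissue field = inverse FFT of D(k) * FFT of chi (real part)."""
    return np.real(np.fft.ifftn(D * np.fft.fftn(chi)))

def tkd_inversion(field, D, threshold=0.19):
    """Truncated k-space division: clamp |D(k)| away from zero before dividing.

    Near the cone where D(k) is close to zero the inversion is ill-conditioned,
    which is the source of the streaking artifacts that a learned inverse such
    as QSMnet (trained on COSMOS labels) aims to avoid.
    """
    D_trunc = np.where(np.abs(D) < threshold, threshold * np.sign(D), D)
    D_trunc[D_trunc == 0] = threshold  # sign(0) = 0, so guard exact zeros explicitly
    return np.real(np.fft.ifftn(np.fft.fftn(field) / D_trunc))

# Example: simulate a single-orientation field map from a toy susceptibility
# map (a cubic "lesion" of 0.1 ppm) and invert it with TKD.
chi_true = np.zeros((64, 64, 64))
chi_true[24:40, 24:40, 24:40] = 0.1
D = dipole_kernel(chi_true.shape)
field = forward_field(chi_true, D)
chi_tkd = tkd_inversion(field, D)
```

In this view, QSMnet replaces the hand-crafted inverse (TKD truncation or MEDI regularization) with a modified U-net that maps a single-orientation field map to a COSMOS-quality susceptibility map. The same forward model is also the natural basis for the model-based data augmentation mentioned in the abstract, since a transformed susceptibility map can be converted back into a consistent field map with a call like `forward_field` above.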
Pages: 199-206 (8 pages)
Related Papers (showing 10 of 50):
  • [1] Landslide Susceptibility Mapping Using Deep Neural Network and Convolutional Neural Network
    Gong, Sung-Hyun
    Baek, Won-Kyung
    Jung, Hyung-Sup
    KOREAN JOURNAL OF REMOTE SENSING, 2022, 38 (06) : 1723 - 1735
  • [2] Affine transformation edited and refined deep neural network for quantitative susceptibility mapping
    Xiong, Zhuang
    Gao, Yang
    Liu, Feng
    Sun, Hongfu
    NEUROIMAGE, 2023, 267
  • [3] SHARQnet - Sophisticated harmonic artifact reduction in quantitative susceptibility mapping using a deep convolutional neural network
    Bollmann, Steffen
    Kristensen, Matilde Holm
    Larsen, Morten Skaarup
    Olsen, Mathias Vassard
    Pedersen, Mads Jozwiak
    Ostergaard, Lasse Riis
    O'Brien, Kieran
    Langkammer, Christian
    Fazlollahi, Amir
    Barth, Markus
    ZEITSCHRIFT FUR MEDIZINISCHE PHYSIK, 2019, 29 (02) : 139 - 149
  • [4] A preliminary attempt to visualize nigrosome 1 in the substantia nigra for Parkinson's disease at 3T: An efficient susceptibility map-weighted imaging (SMWI) with quantitative susceptibility mapping using deep neural network (QSMnet)
    Jo, Minju
    Oh, Se-Hong
    MEDICAL PHYSICS, 2020, 47 (03) : 1151 - 1160
  • [5] Exploring linearity of deep neural network trained QSM: QSMnet+
    Jung, Woojin
    Yoon, Jaeyeon
    Choi, Joon Yul
    Kim, Jae Myung
    Nam, Yoonho
    Kim, Eung Yeop
    Lee, Jongho
    Ji, Sooyeon
    NEUROIMAGE, 2020, 211
  • [6] Accelerating quantitative susceptibility and R2* mapping using incoherent undersampling and deep neural network reconstruction
    Gao, Yang
    Cloos, Martijn
    Liu, Feng
    Crozier, Stuart
    Pike, G. Bruce
    Sun, Hongfu
    NEUROIMAGE, 2021, 240
  • [7] Flood susceptibility mapping in Brahmaputra floodplain of Bangladesh using deep boost, deep learning neural network, and artificial neural network
    Ahmed, Naser
    Hoque, Muhammad Al-Amin
    Arabameri, Alireza
    Pal, Subodh Chandra
    Chakrabortty, Rabin
    Jui, Jesmin
    GEOCARTO INTERNATIONAL, 2022, 37 (25) : 8770 - 8791
  • [8] Landslide spatial susceptibility mapping by using deep belief network
    Chen, Tao
    Zhong, Ziying
    Niu, Ruiqing
    2018 FIFTH INTERNATIONAL WORKSHOP ON EARTH OBSERVATION AND REMOTE SENSING APPLICATIONS (EORSA), 2018, : 364 - 368
  • [9] Flood susceptibility mapping using convolutional neural network frameworks
    Wang, Yi
    Fang, Zhice
    Hong, Haoyuan
    Peng, Ling
    JOURNAL OF HYDROLOGY, 2020, 582
  • [10] A subject-specific unsupervised deep learning method for quantitative susceptibility mapping using implicit neural representation
    Zhang, Ming
    Feng, Ruimin
    Li, Zhenghao
    Feng, Jie
    Wu, Qing
    Zhang, Zhiyong
    Ma, Chengxin
    Wu, Jinsong
    Yan, Fuhua
    Liu, Chunlei
    Zhang, Yuyao
    Wei, Hongjiang
    MEDICAL IMAGE ANALYSIS, 2024, 95