An improved PET image reconstruction method based on super-resolution

Cited by: 1
Authors
Wang, Ying [1 ,2 ]
Zhang, Xuezhu [3 ]
Zhang, Mengxi [3 ]
Liang, Dong [1 ,4 ]
Liu, Xin [1 ,4 ]
Zheng, Hairong [1 ,4 ]
Yang, Yongfeng [1 ,4 ]
Hu, Zhanli [1 ,4 ]
Affiliations
[1] Chinese Acad Sci, Lauterbur Res Ctr Biomed Imaging, Shenzhen Inst Adv Technol, Shenzhen 518055, Guangdong, Peoples R China
[2] Hunan Univ, Coll Elect & Informat Engn, Changsha 410082, Hunan, Peoples R China
[3] Univ Calif Davis, Dept Biomed Engn, Davis, CA 95616 USA
[4] Chinese Acad Sci, Key Lab Hlth Informat, Shenzhen 518055, Guangdong, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Positron emission tomography; Reconstruction; Patch regularization; Penalized maximum likelihood; Random forests; WEIGHTED LEAST-SQUARES; ALGORITHMS; INTERPOLATION; PERFORMANCE;
DOI
10.1016/j.nima.2019.162677
Chinese Library Classification
TH7 [Instruments and meters];
Subject classification codes
0804 ; 080401 ; 081102 ;
Abstract
Positron emission tomography (PET) is a non-invasive, high-end examination technique that can quantitatively detect disease at an early stage, complementing the information provided by functional and anatomical imaging. PET is therefore widely used in the early clinical diagnosis of malignant tumors and lesions. Fast and accurate reconstruction of PET images remains an active research topic. Patch-based regularized penalized-likelihood reconstruction can reconstruct PET images more accurately, but it is sensitive to the algorithm's parameter values and requires considerable time to tune those parameters for the best result. In this paper, we propose a novel method that uses random forests to improve PET imaging resolution at each reconstruction iteration, in both the sinogram domain and the image domain; we refer to this method as patch-based super-resolution random forest reconstruction (patch-SRF). The patch-SRF algorithm allows the reconstruction to converge earlier and avoids the time-consuming parameter-tuning process, achieving better reconstruction results even with relatively poor parameter settings.
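The core idea of patch-SRF, learning a patch-wise restoration that sharpens a degraded image, can be illustrated with a minimal sketch. This is not the authors' implementation: the degradation model (a 3x3 mean blur), the patch size, the forest settings, and all function names below are illustrative assumptions. A random forest is trained to map each degraded 3x3 patch to the true intensity of its center pixel, the kind of learned correction that could in principle be applied at each reconstruction iteration.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

def mean_blur3(img):
    """Stand-in degradation model: 3x3 mean filter on interior pixels."""
    out = img.copy()
    h, w = img.shape
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            out[i, j] = img[i - 1:i + 2, j - 1:j + 2].mean()
    return out

def patches3(img):
    """Flattened 3x3 patches of the interior, with their center indices."""
    h, w = img.shape
    X, idx = [], []
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            X.append(img[i - 1:i + 2, j - 1:j + 2].ravel())
            idx.append((i, j))
    return np.array(X), idx

def srf_restore(degraded, forest):
    """Replace each interior pixel by the forest's prediction for its patch."""
    X, idx = patches3(degraded)
    pred = forest.predict(X)
    out = degraded.copy()
    for p, (i, j) in zip(pred, idx):
        out[i, j] = p
    return out

# Toy demonstration: a bright square on a dark background.
truth = np.zeros((16, 16))
truth[5:11, 5:11] = 1.0
blurred = mean_blur3(truth)

# Train the forest to map blurred patches back to true center intensities.
X, idx = patches3(blurred)
y = np.array([truth[i, j] for i, j in idx])
forest = RandomForestRegressor(n_estimators=30, random_state=0).fit(X, y)

restored = srf_restore(blurred, forest)
```

In practice the paper applies this kind of learned enhancement inside the iterative reconstruction loop, in both the sinogram and image domains; the sketch above only shows the image-domain patch-to-pixel regression in isolation.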
Pages: 6
Related Papers
50 records total
  • [21] Image Super-resolution Reconstruction Based on an Improved Generative Adversarial Network
    Liu, Han
    Wang, Fan
    Liu, Lijun
    [J]. 2019 1ST INTERNATIONAL CONFERENCE ON INDUSTRIAL ARTIFICIAL INTELLIGENCE (IAI 2019), 2019,
  • [22] Image super-resolution reconstruction based on improved Dirac residual network
    Yang, Xin
    Xie, Tangxin
    Liu, Li
    Zhou, Dake
    [J]. Multidimensional Systems and Signal Processing, 2021, 32 : 1065 - 1082
  • [23] Improved hybrid method for image super-resolution
    Xing, Weiwei
    Zhao, Yahui
    Bao, Ergude
    [J]. IET COMPUTER VISION, 2016, 10 (08) : 769 - 779
  • [24] Super-resolution reconstruction of an image
    Elad, M
    Feuer, A
    [J]. NINETEENTH CONVENTION OF ELECTRICAL AND ELECTRONICS ENGINEERS IN ISRAEL, 1996, : 391 - 394
  • [25] Super-resolution image reconstruction
    Kang, MG
    Chaudhuri, S
    [J]. IEEE SIGNAL PROCESSING MAGAZINE, 2003, 20 (03) : 19 - 20
  • [26] Image Super-Resolution Reconstruction Method Based on Sparse Residual Dictionary
    Shao, Zai Yu
    Lu, Zhen Kun
    Chang, Meng Jia
    [J]. PROCEEDINGS OF THE 9TH INTERNATIONAL CONFERENCE OF INFORMATION AND COMMUNICATION TECHNOLOGY [ICICT-2019], 2019, 154 : 629 - 635
  • [27] An infrared image super-resolution reconstruction method based on compressive sensing
    Mao, Yuxing
    Wang, Yan
    Zhou, Jintao
    Jia, Haiwei
    [J]. INFRARED PHYSICS & TECHNOLOGY, 2016, 76 : 735 - 739
  • [28] Research on WGAN-based Image Super-resolution Reconstruction Method
    Chen, Xinying
    Lv, Shuo
    Qian, Chunlin
    [J]. IAENG International Journal of Computer Science, 2023, 50 (03)
  • [29] Modified sparse representation based image super-resolution reconstruction method
    Shang, Li
    Liu, Shu-fen
    Zhou, Yan
    Sun, Zhan-li
    [J]. NEUROCOMPUTING, 2017, 228 : 37 - 52
  • [30] An Infrared Image Super-resolution Reconstruction Method Based on Compressive Sensing
    Mao, Yuxing
    Wang, Yan
    Zhou, Jintao
    Jia, Haiwei
    [J]. PROCEEDINGS OF THE 2016 4TH INTERNATIONAL CONFERENCE ON MACHINERY, MATERIALS AND COMPUTING TECHNOLOGY, 2016, 60 : 1243 - 1250