Optimizing the frame duration for data-driven rigid motion estimation in brain PET imaging

Cited by: 11
Authors
Spangler-Bickell, Matthew G. [1 ,2 ]
Hurley, Samuel A. [1 ]
Deller, Timothy W. [2 ]
Jansen, Floris [2 ]
Bettinardi, Valentino [3 ]
Carlson, Mackenzie [4 ]
Zeineh, Michael [4 ]
Zaharchuk, Greg [4 ]
McMillan, Alan B. [1 ]
Affiliations
[1] Univ Wisconsin, Dept Radiol, Madison, WI 53706 USA
[2] GE Healthcare, PET MR Engn, Waukesha, WI 53188 USA
[3] Nucl Med Unit, IRCCS Osped San Raffaele, Milan, Italy
[4] Stanford Univ, Dept Radiol, Stanford, CA 94305 USA
Keywords
brain imaging; data-driven motion estimation; list-mode; PET reconstruction; rigid motion correction; ultrashort frames; HEAD MOTION; REGISTRATION; RECONSTRUCTION; PERFORMANCE; IMAGES;
DOI
10.1002/mp.14889
Chinese Library Classification
R8 [Special Medicine]; R445 [Diagnostic Imaging]
Subject classification codes
1002; 100207; 1009
Abstract
Purpose: Data-driven rigid motion estimation for PET brain imaging is usually performed on data frames sampled at low temporal resolution, both to reduce the overall computation time and to provide adequate signal-to-noise ratio in each frame. Recent work has demonstrated that list-mode reconstructions of ultrashort frames are sufficient for motion estimation and can be performed very quickly. In this work we use image-based registration of reconstructions of very short frames for data-driven motion estimation, and optimize a number of reconstruction and registration parameters (frame duration, MLEM iterations, image pixel size, post-smoothing filter, reference image creation, and registration metric) to ensure accurate registrations while maximizing temporal resolution and minimizing total computation time.
Methods: Data from F-18-fluorodeoxyglucose (FDG) and F-18-florbetaben (FBB) tracer studies with varying count rates are analyzed for PET/MR and PET/CT scanners. For framed reconstructions using various parameter combinations, interframe motion is simulated and image-based registrations are performed to estimate that motion.
Results: For the FDG and FBB tracers, using 4 × 10^5 true and scattered coincidence events per frame ensures that 95% of the registrations are accurate to within 1 mm of the ground truth. This corresponds to a frame duration of 0.5-1 s at typical clinical PET activity levels. Four MLEM iterations with no subsets, a transaxial pixel size of 4 mm, a post-smoothing filter with 4-6 mm full width at half maximum, and a reference image formed by averaging two or more frames provide an optimal set of parameters for accurate registrations while keeping the reconstruction and processing time low.
Conclusions: Very short frames (≤1 s) can be used to provide accurate and fast data-driven rigid motion estimates for use in an event-by-event motion-corrected reconstruction. © 2021 American Association of Physicists in Medicine
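As a worked illustration of the framing strategy summarized above, the following Python sketch (not the authors' code) splits a sorted list-mode timestamp stream into ultrashort frames of roughly 4 × 10^5 prompt events each, applies a Gaussian post-filter in the reported 4-6 mm FWHM range, and forms a reference image by averaging the first few smoothed frames. The event stream, array shapes, and helper names are hypothetical; the MLEM reconstruction and rigid registration steps are only indicated in comments because they require a dedicated PET reconstruction/registration toolkit.

import numpy as np
from scipy.ndimage import gaussian_filter

EVENTS_PER_FRAME = int(4e5)  # prompt (true + scattered) events per frame, per the paper
FWHM_MM = 5.0                # post-smoothing filter width, within the reported 4-6 mm range
PIXEL_MM = 4.0               # transaxial pixel size used for the short-frame reconstructions

def frame_edges(event_times_s, events_per_frame=EVENTS_PER_FRAME):
    # Frame boundary times chosen so that each frame holds ~events_per_frame events.
    idx = np.arange(0, len(event_times_s), events_per_frame)
    idx = np.append(idx, len(event_times_s) - 1)
    return event_times_s[idx]

def post_smooth(image, fwhm_mm=FWHM_MM, pixel_mm=PIXEL_MM):
    # Gaussian post-filter: convert FWHM in mm to sigma in voxels.
    sigma = fwhm_mm / (2.355 * pixel_mm)
    return gaussian_filter(image, sigma)

def make_reference(frame_images, n_average=2):
    # Reference image = mean of the first few smoothed short-frame reconstructions.
    return np.mean([post_smooth(f) for f in frame_images[:n_average]], axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical event stream: ~8 x 10^5 prompts per second over 10 s of acquisition.
    times = np.sort(rng.uniform(0.0, 10.0, size=int(8e6)))
    edges = frame_edges(times)
    print(f"{len(edges) - 1} frames, median duration {np.median(np.diff(edges)):.2f} s")
    # Each frame would then be reconstructed with ~4 MLEM iterations (no subsets) on a
    # 4 mm transaxial grid and rigidly registered to make_reference(...); those steps
    # need a PET reconstruction and registration library and are not shown here.

At this synthetic count rate the count-based framing yields frames of about 0.5 s, consistent with the 0.5-1 s durations the abstract reports for typical clinical activity levels.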
Pages: 3031-3041
Number of pages: 11
Related papers
50 in total
  • [31] Zero-Extra-Dose PET Delayed Imaging with Data-Driven Attenuation Correction Estimation
    Pang, Lifang
    Zhu, Wentao
    Dong, Yun
    Lv, Yang
    Shi, Hongcheng
    MOLECULAR IMAGING AND BIOLOGY, 2019, 21 (01) : 149 - 158
  • [33] Optimising rigid motion compensation for small animal brain PET imaging
    Spangler-Bickell, Matthew G.
    Zhou, Lin
    Kyme, Andre Z.
    De Laat, Bart
    Fulton, Roger R.
    Nuyts, Johan
    PHYSICS IN MEDICINE AND BIOLOGY, 2016, 61 (19): 7074 - 7091
  • [34] Prostate cancer imaging with PET/MRI: A study on data-driven bulk patient motion detection and correction
    Bogdanovic, Borjana
    Solari, Esteban Lucas
    Asiares, Alberto Villagran
    Schachoff, Sylvia
    Eiber, Matthias
    Weber, Wolfgang
    Nekolla, Stephan
    JOURNAL OF NUCLEAR MEDICINE, 2021, 62
  • [35] Dual Respiratory and Cardiac Data-driven Gating in PET Imaging
    Qi, Wenyuan
    Yang, Li
    Tsai, Yu-Jung
    Qi, Jinyi
    Asma, Evren
    Kolthammer, Jeffrey
    JOURNAL OF NUCLEAR MEDICINE, 2024, 65
  • [36] Data-driven human model estimation for realtime motion capture
    Su, Le
    Liao, Lianjun
    Zhai, Wenpeng
    Xia, Shihong
    JOURNAL OF VISUAL LANGUAGES AND COMPUTING, 2018, 48 : 10 - 18
  • [37] Data-Driven Methods for the Determination of Anterior-Posterior Motion in PET
    Hess, Mirco
    Buether, Florian
    Schaefers, Klaus P.
    IEEE TRANSACTIONS ON MEDICAL IMAGING, 2017, 36 (02) : 422 - 432
  • [38] Data-Driven Motion Compensation Techniques for Noncooperative ISAR Imaging
    Vehmas, Risto
    Jylha, Juha
    Vaila, Minna
    Vihonen, Juho
    Visa, Ari
    IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, 2018, 54 (01) : 295 - 314
  • [39] Data-driven motion compensation of [18F]FDG-PET brain imaging using conditional Generative Adversarial Networks (cGANs)
    Iommi, D.
    Sundar, L. Shiyam
    Muzik, O.
    Chalampalakis, Z.
    Klebermass, E. M.
    Hienert, M.
    Rischka, L.
    Lanzenberger, R.
    Hahn, A.
    Pataraia, E.
    Traub-Weidinger, T.
    Beyer, T.
    EUROPEAN JOURNAL OF NUCLEAR MEDICINE AND MOLECULAR IMAGING, 2020, 47 (SUPPL 1) : S484 - S485
  • [40] Data-driven Respiratory Motion Estimation and Correction Using TOF PET List-Mode Centroid of Distribution
    Ren, Silin
    Jin, Xiao
    Chan, Chung
    Jian, Yiqiang
    Mulnix, Tim
    Liu, Chi
    Carson, Richard E.
    2014 IEEE NUCLEAR SCIENCE SYMPOSIUM AND MEDICAL IMAGING CONFERENCE (NSS/MIC), 2014