High-resolution motion compensation for brain PET imaging using real-time electromagnetic motion tracking

Cited by: 0
Authors
Tan, Wanbin [1 ,2 ]
Wang, Zipai [1 ,2 ]
Zeng, Xinjie [1 ,3 ]
Boccia, Anthony [4 ]
Wang, Xiuyuan [1 ]
Li, Yixin [1 ,3 ]
Li, Yi [1 ]
Fung, Edward K. [1 ]
Qi, Jinyi [5 ]
Zeng, Tianyi [6 ]
Gupta, Ajay [7 ]
Goldan, Amir H. [1 ,8 ]
Affiliations
[1] Cornell Univ, Weill Cornell Med Coll, Dept Radiol, New York, NY 14850 USA
[2] SUNY Stony Brook, Coll Engn & Appl Sci, Dept Biomed Engn, Stony Brook, NY USA
[3] SUNY Stony Brook, Coll Engn & Appl Sci, Dept Elect Engn, Stony Brook, NY USA
[4] SUNY Stony Brook, Renaissance Sch Med, Dept Radiol, Stony Brook, NY USA
[5] Univ Calif Davis, Dept Biomed Engn, Davis, CA USA
[6] Yale Univ, Dept Radiol & Biomed Imaging, New Haven, CT USA
[7] Columbia Univ, Irving Med Ctr, Dept Radiol, New York, NY USA
[8] Lasdon House, 420 East 70th St, New York, NY 10021 USA
Funding
National Institutes of Health (USA);
Keywords
event-by-event motion correction; electromagnetic motion tracking; Prism-PET; MOVEMENT; DEPTH; MODE;
DOI
10.1002/mp.17437
Chinese Library Classification (CLC)
R8 [Special Medicine]; R445 [Diagnostic Imaging];
Discipline classification codes
1002; 100207; 1009;
Abstract
Background: Substantial improvements in spatial resolution in brain positron emission tomography (PET) scanners have greatly reduced the partial volume effect, making head movement the main source of image blur. To achieve high-resolution PET neuroimaging, precise real-time estimation of both head position and orientation is essential for accurate motion compensation.
Purpose: A high-resolution electromagnetic motion tracking (EMMT) system with event-by-event motion correction is developed for PET-CT scanners.
Methods: The EMMT system comprises a source, an array of sensors, and a readout electronic unit (REU). The source acts as a transmitter and emits an EM dipole field. The sensor array, placed in close proximity to the source, detects changes in EM flux density due to sensor movement. The REU digitizes the signal from each sensor and captures precise rotational and translational movements in real time. Tracked motion in the EMMT coordinate system is synchronized with the PET list-mode data and transformed into the scanner coordinate system by locating paired positions in both systems. The optimal rigid motion is estimated using singular value decomposition. The rigid motion and the depth-of-interaction (DOI) parallax effect are corrected by event-by-event rebinning of mispositioned lines of response (LORs). We integrated the EMMT with our recently developed ultra-high-resolution Prism-PET prototype brain scanner and a commercial Siemens Biograph mCT PET-CT scanner. We assessed the imaging performance of the Prism-PET/EMMT system using multi-frame motion of point sources and phantoms. The mCT/EMMT system was validated using a set of point sources attached to both a mannequin head and a human volunteer, simulating multi-frame and continuous motions, respectively. Additionally, a human subject undergoing [18F]MK6240 PET imaging was included.
Results: The tracking accuracy of the Prism-PET/EMMT system was quantified as a root-mean-square (RMS) error of 0.49° for 100° axial rotations and an RMS error of 0.15 mm for 100 mm translations. The percent difference (%diff) in average full width at half maximum (FWHM) of point sources between motion-corrected and static images, within a motion range of ±20° and ±10 mm from the center of the scanner's field of view (FOV), was 3.9%. The measured recovery coefficients of the 2.5-mm-diameter sphere in the activity-filled partial volume correction phantom were 23.9%, 70.8%, and 74.0% for the phantom with multi-frame motion, with motion and motion compensation, and without motion, respectively. In the mCT/EMMT system, the %diff in average FWHM of point sources between motion-corrected and static images, within a motion range of ±30° and ±10 mm from the center of the FOV, was 14%. Applying motion correction to the [18F]MK6240 PET imaging reduced the motion-induced spill-in artifact in the lateral ventricle region, lowering its standardized uptake value ratio (SUVR) from 0.70 to 0.34.
Conclusions: The proposed EMMT system is a cost-effective, high-frame-rate, and non-line-of-sight alternative to infrared camera-based tracking systems and is capable of achieving high rotational and translational tracking accuracy for mitigating motion-induced blur in high-resolution brain-dedicated PET scanners.
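The abstract states that the optimal rigid motion is estimated from paired positions in the EMMT and scanner coordinate systems using singular value decomposition, and that mispositioned LORs are then rebinned event by event. The sketch below illustrates that step only under common assumptions: it uses the standard SVD-based (Kabsch) least-squares alignment of paired points, and the helper names `estimate_rigid_transform` and `rebin_lor` are hypothetical, not from the paper. The DOI parallax correction described in the abstract is not shown.

```python
import numpy as np

def estimate_rigid_transform(src, dst):
    """SVD-based (Kabsch) least-squares fit of R, t so that dst ~= R @ src + t.

    src, dst: (N, 3) arrays of paired positions, e.g., marker locations
    expressed in the EMMT and scanner coordinate systems (an assumed
    pairing scheme; the paper's exact calibration procedure may differ).
    """
    src_mean = src.mean(axis=0)
    dst_mean = dst.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (src - src_mean).T @ (dst - dst_mean)
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper rotation (reflection).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_mean - R @ src_mean
    return R, t

def rebin_lor(p1, p2, R, t):
    """Apply a rigid transform to both endpoints of one line of response,
    i.e., a schematic event-by-event rebinning of a mispositioned LOR back
    to the reference head pose (DOI parallax handling omitted)."""
    return R @ p1 + t, R @ p2 + t
```

As a usage note, the same routine can serve both purposes mentioned in the abstract: registering the EMMT coordinate system to the scanner coordinate system from paired positions, and expressing each tracked head pose as a rigid transform that is applied to the endpoints of every list-mode event.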
Pages: 201-218
Page count: 18