High-resolution motion compensation for brain PET imaging using real-time electromagnetic motion tracking

Citations: 0
Authors
Tan, Wanbin [1 ,2 ]
Wang, Zipai [1 ,2 ]
Zeng, Xinjie [1 ,3 ]
Boccia, Anthony [4 ]
Wang, Xiuyuan [1 ]
Li, Yixin [1 ,3 ]
Li, Yi [1 ]
Fung, Edward K. [1 ]
Qi, Jinyi [5 ]
Zeng, Tianyi [6 ]
Gupta, Ajay [7 ]
Goldan, Amir H. [1 ,8 ]
Affiliations
[1] Cornell Univ, Weill Cornell Med Coll, Dept Radiol, New York, NY 14850 USA
[2] SUNY Stony Brook, Coll Engn & Appl Sci, Dept Biomed Engn, Stony Brook, NY USA
[3] SUNY Stony Brook, Coll Engn & Appl Sci, Dept Elect Engn, Stony Brook, NY USA
[4] SUNY Stony Brook, Renaissance Sch Med, Dept Radiol, Stony Brook, NY USA
[5] Univ Calif Davis, Dept Biomed Engn, Davis, CA USA
[6] Yale Univ, Dept Radiol & Biomed Imaging, New Haven, CT USA
[7] Columbia Univ, Irving Med Ctr, Dept Radiol, New York, NY USA
[8] Lasdon House, 420 East 70th St, New York, NY 10021 USA
Funding
National Institutes of Health (NIH), USA;
Keywords
event-by-event motion correction; electromagnetic motion tracking; Prism-PET; MOVEMENT; DEPTH; MODE;
DOI
10.1002/mp.17437
Chinese Library Classification (CLC)
R8 [Special Medicine]; R445 [Diagnostic Imaging];
Discipline Code
1002; 100207; 1009;
Abstract
Background: Substantial improvements in the spatial resolution of brain positron emission tomography (PET) scanners have greatly reduced the partial volume effect, making head movement the main source of image blur. To achieve high-resolution PET neuroimaging, precise real-time estimation of both head position and orientation is essential for accurate motion compensation.

Purpose: A high-resolution electromagnetic motion tracking (EMMT) system with event-by-event motion correction is developed for PET-CT scanners.

Methods: The EMMT system comprises a source, an array of sensors, and a readout electronic unit (REU). The source acts as a transmitter that emits an EM dipole field and is placed in close proximity to the sensor array, which detects changes in EM flux density caused by sensor movement. The REU digitizes the signal from each sensor and captures precise rotational and translational movements in real time. Tracked motion in the EMMT coordinate system is synchronized with the PET list-mode data and transformed into the scanner coordinate system by locating paired positions in both systems. The optimal rigid motion is estimated using singular value decomposition. The rigid motion and the depth-of-interaction (DOI) parallax effect are corrected by event-by-event rebinning of mispositioned lines-of-response (LORs). We integrated the EMMT with our recently developed ultra-high-resolution Prism-PET prototype brain scanner and with a commercial Siemens Biograph mCT PET-CT scanner. We assessed the imaging performance of the Prism-PET/EMMT system using multi-frame motion of point sources and phantoms. The mCT/EMMT system was validated using a set of point sources attached to a mannequin head and to a human volunteer, simulating multi-frame and continuous motion, respectively. Additionally, a human subject undergoing [18F]MK6240 PET imaging was included.

Results: The tracking accuracy of the Prism-PET/EMMT system was quantified as a root-mean-square (RMS) error of 0.49° for 100° axial rotations and an RMS error of 0.15 mm for 100 mm translations. The percent difference (%diff) in average full width at half maximum (FWHM) of point sources between motion-corrected and static images, within a motion range of ±20° and ±10 mm from the center of the scanner's field-of-view (FOV), was 3.9%. The measured recovery coefficients of the 2.5-mm-diameter sphere in the activity-filled partial-volume-correction phantom were 23.9%, 70.8%, and 74.0% for the phantom with multi-frame motion, with motion and motion compensation, and without motion, respectively. In the mCT/EMMT system, the %diff in average FWHM of point sources between motion-corrected and static images, within a motion range of ±30° and ±10 mm from the center of the FOV, was 14%. Applying motion correction to the [18F]MK6240 PET imaging reduced the motion-induced spill-in artifact in the lateral ventricle region, lowering its standardized uptake value ratio (SUVR) from 0.70 to 0.34.

Conclusions: The proposed EMMT system is a cost-effective, high-frame-rate, non-line-of-sight alternative to infrared camera-based tracking systems and is capable of achieving the high rotational and translational tracking accuracies needed to mitigate motion-induced blur in high-resolution brain-dedicated PET scanners.
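The Methods section above describes estimating the optimal rigid head motion from paired positions via singular value decomposition and then rebinning mispositioned LORs event by event. The following is a minimal sketch of that general idea (Kabsch-style rigid registration from paired markers, followed by remapping LOR endpoints with the inverse transform), not the authors' implementation: the function names, toy marker coordinates, and coordinate conventions are illustrative assumptions, and the DOI parallax correction is not shown.

```python
# Sketch: SVD-based rigid motion estimation and event-by-event LOR rebinning.
# Assumes paired marker positions are already expressed in the scanner frame.
import numpy as np

def estimate_rigid_motion(p_ref, p_moved):
    """Return R, t such that R @ p_ref[i] + t approximates p_moved[i].

    p_ref, p_moved: (N, 3) arrays of paired marker positions
    (reference pose vs. tracked pose).
    """
    c_ref = p_ref.mean(axis=0)
    c_mov = p_moved.mean(axis=0)
    H = (p_ref - c_ref).T @ (p_moved - c_mov)          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Guard against an improper rotation (reflection).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = c_mov - R @ c_ref
    return R, t

def rebin_lor(endpoint_a, endpoint_b, R, t):
    """Map an LOR recorded in the moved pose back to the reference pose
    by applying the inverse rigid transform to both endpoints."""
    R_inv = R.T
    return R_inv @ (endpoint_a - t), R_inv @ (endpoint_b - t)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    markers_ref = rng.uniform(-50.0, 50.0, size=(4, 3))   # mm, illustrative
    # Simulate a 10-degree axial rotation plus a small translation.
    th = np.deg2rad(10.0)
    R_true = np.array([[np.cos(th), -np.sin(th), 0.0],
                       [np.sin(th),  np.cos(th), 0.0],
                       [0.0,         0.0,        1.0]])
    t_true = np.array([2.0, -1.0, 3.0])
    markers_mov = markers_ref @ R_true.T + t_true

    R_est, t_est = estimate_rigid_motion(markers_ref, markers_mov)
    a, b = np.array([100.0, 0.0, 0.0]), np.array([-100.0, 0.0, 0.0])
    print(rebin_lor(a, b, R_est, t_est))
```

In an event-by-event scheme of this kind, the transform closest in time to each coincidence event would be applied to its LOR endpoints before reconstruction; the example above only demonstrates the geometry for a single synthetic pose.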
Pages: 201-218
Number of pages: 18
Related papers
50 records in total
  • [1] Real-Time Motion Correction for High-Resolution Larynx Imaging
    Barral, Joelle K.
    Santos, Juan M.
    Damrose, Edward J.
    Fischbein, Nancy J.
    Nishimura, Dwight G.
    MAGNETIC RESONANCE IN MEDICINE, 2011, 66 (01) : 174 - 179
  • [2] Real-time eye motion compensation for OCT imaging with tracking SLO
    Vienola, Kari V.
    Braaf, Boy
    Sheehy, Christy K.
    Yang, Qiang
    Tiruveedhula, Pavan
    Arathorn, David W.
    de Boer, Johannes F.
    Roorda, Austin
    BIOMEDICAL OPTICS EXPRESS, 2012, 3 (11): : 2950 - 2963
  • [3] Anchor-Based, Real-Time Motion Compensation for High-Resolution mmWave Radar
    Poole, Nikhil
    Arbabian, Amin
    IEEE JOURNAL OF MICROWAVES, 2024, 4 (03): : 440 - 458
  • [4] Observations on real-time prostate gland motion using electromagnetic tracking
    Langen, Katja M.
    Willoughby, Twyla R.
    Meeks, Sanford L.
    Santhanam, Anand
    Cunningham, Alexis
    Levine, Lisa
    Kupelian, Patrick A.
    INTERNATIONAL JOURNAL OF RADIATION ONCOLOGY BIOLOGY PHYSICS, 2008, 71 (04): : 1084 - 1090
  • [5] Head motion of anesthetized patients during high-resolution PET brain imaging
    Conant, S.
    Thada, S.
    Barker, W.
    JOURNAL OF NUCLEAR MEDICINE, 2010, 51
  • [6] Implementation and performance of an optical motion tracking system for high resolution brain PET imaging
    Lopresti, BJ
    Russo, A
    Jones, WF
    Fisher, T
    Crouch, DG
    Altenburger, DE
    Townsend, DW
    IEEE TRANSACTIONS ON NUCLEAR SCIENCE, 1999, 46 (06) : 2059 - 2067
  • [7] Integrated camera motion compensation by real-time image motion tracking and image deconvolution
    Janschek, K
    Tchernykh, V
    Dyblenko, S
    2005 IEEE/ASME INTERNATIONAL CONFERENCE ON ADVANCED INTELLIGENT MECHATRONICS, VOLS 1 AND 2, 2005, : 1437 - 1444
  • [8] Real-time 3D motion tracking for small animal brain PET
    Kyme, A. Z.
    Zhou, V. W.
    Meikle, S. R.
    Fulton, R. R.
    PHYSICS IN MEDICINE AND BIOLOGY, 2008, 53 (10): : 2651 - 2666
  • [9] A Novel Motion Compensation Method for High-Resolution ISAR Imaging
    Chen, Juan
    Yuan, Yunneng
    Luan, Jun
    PROCEEDINGS OF 2012 IEEE 11TH INTERNATIONAL CONFERENCE ON SIGNAL PROCESSING (ICSP) VOLS 1-3, 2012, : 1866 - 1869
  • [10] Real-Time Motion Compensation Using Optical Flow
    Benes, Radek
    Riha, Kamil
    TSP 2010: 33RD INTERNATIONAL CONFERENCE ON TELECOMMUNICATIONS AND SIGNAL PROCESSING, 2010, : 166 - 170