A Deblurring Method for Indirect Time-of-Flight Depth Sensor

Cited by: 3
Authors
Gao, Jing [1 ,2 ]
Gao, Xueqiang [1 ,2 ]
Nie, Kaiming [1 ,2 ]
Gao, Zhiyuan [1 ,2 ]
Xu, Jiangtao [1 ,2 ]
Affiliations
[1] Tianjin Univ, Sch Microelect, Tianjin 300072, Peoples R China
[2] Tianjin Univ, Tianjin Key Lab Imaging & Sensing Microelect Techn, Tianjin 300072, Peoples R China
Keywords
Cameras; Sensor phenomena and characterization; Imaging; Image sensors; Time measurement; Three-dimensional displays; Phase measurement; Charge correction; deblurring method; indirect time-of-flight (iToF) depth sensor; motion blur;
DOI
10.1109/JSEN.2022.3229687
CLC classification: TM [Electrical Engineering]; TN [Electronic Technology and Communication Technology]
Subject classification codes: 0808; 0809
Abstract
Indirect time-of-flight (iToF) depth sensors obtain depth from the phase offset between emitted and received modulated infrared (IR) pulsed light. However, these sensors suffer from motion blur artifacts when there are moving objects in the scene, causing depth measurement distortion. By analyzing the mechanism of motion blur, this article divides motion blur into two categories: half-frame motion blur and full-frame motion blur. For these two categories, this article proposes deblurring methods based on charge correction, using the adjacent-frame charge reference and the neighborhood similar-charge reference, respectively. Based on the range resolution equation, the proposed pixel-based blur detection method can adaptively detect motion blur. For detected blurred pixels, the proposed deblurring method removes blur by recalculating their depth. With the proposed method, motion blur is suppressed at 30 frames/s. The depth error is about 1.68% over the range of 1-2 m with a modulation frequency of 40 MHz. Experimental results demonstrate that the proposed deblurring method can effectively eliminate the motion blur of moving objects with minimal computational cost.
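The depth recovery the abstract relies on is the standard four-tap iToF phase measurement. A minimal sketch follows; this is not the authors' implementation, and the tap names `q0`–`q270` (charges integrated at 0°, 90°, 180°, and 270° demodulation phases) are assumptions for illustration:

```python
import math

C = 299_792_458.0  # speed of light (m/s)

def itof_depth(q0, q90, q180, q270, f_mod=40e6):
    """Depth from the phase offset between emitted and received
    modulated light, using four sequentially integrated tap charges.
    f_mod is the modulation frequency (40 MHz in the paper's setup)."""
    # Phase offset of the received modulation, wrapped to [0, 2*pi)
    phase = math.atan2(q270 - q90, q0 - q180) % (2 * math.pi)
    # Round-trip: depth = c * phase / (4 * pi * f_mod)
    return C * phase / (4 * math.pi * f_mod)
```

At 40 MHz the unambiguous range is c / (2 f_mod) ≈ 3.75 m, which is consistent with the 1-2 m test range reported in the abstract. Because the four taps are integrated sequentially, an object that moves between tap exposures corrupts the charge set and hence the recovered phase, which is the motion blur mechanism the paper's charge-correction methods address.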
Pages: 2718-2726 (9 pages)