Efficient intensity-based camera pose estimation in presence of depth

Cited by: 0
Authors
El Choubassi, Maha [1 ]
Nestares, Oscar [2 ]
Wu, Yi [2 ]
Kozintsev, Igor [2 ]
Haussecker, Horst [2 ]
Affiliations
[1] Amer Univ Beirut, Dept Comp Sci, Beirut, Lebanon
[2] Intel Labs, Santa Clara, CA USA
Keywords
Kinect; Depth; Intensity; Video Stabilization; 3D Camera Pose Estimation; Camera Tracking; Registration;
DOI
10.1117/12.2005729
Chinese Library Classification (CLC)
O43 [Optics];
Subject classification codes
070207 ; 0803 ;
Abstract
The widespread success of Kinect enables users to acquire both image and depth information with satisfying accuracy at relatively low cost. We leverage the Kinect output to efficiently and accurately estimate the camera pose in the presence of rotation, translation, or both. The applications of our algorithm are vast, ranging from camera tracking to 3D point cloud registration and video stabilization. The state-of-the-art approach uses point correspondences for estimating the pose. More explicitly, it extracts point features from images, e.g., SURF or SIFT, builds their descriptors, and matches features from different images to obtain point correspondences. However, while feature-based approaches are widely used, they perform poorly in scenes lacking texture, due to scarcity of features, or in scenes with repetitive structure, due to false correspondences. Our algorithm is intensity-based and requires neither feature extraction nor descriptor generation and matching. Without depth information, the intensity-based approach alone cannot handle camera translation. With Kinect capturing both image and depth frames, we extend the intensity-based algorithm to estimate the camera pose under both 3D rotation and translation. The results are quite promising.
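The abstract contrasts feature-based matching with direct, intensity-based alignment. The following sketch illustrates the core idea of an intensity-based (direct) method in a deliberately simplified setting: estimating a 2D image translation by minimizing photometric error with Gauss-Newton iterations (Lucas-Kanade style), with no feature extraction or descriptor matching. This is an illustrative assumption, not the paper's algorithm, which handles full 3D rotation and translation using the depth channel.

```python
import numpy as np

def bilinear(img, x, y):
    """Bilinearly sample `img` at float coordinates (x, y)."""
    H, W = img.shape
    x = np.clip(x, 0, W - 1.001)
    y = np.clip(y, 0, H - 1.001)
    x0 = np.floor(x).astype(int)
    y0 = np.floor(y).astype(int)
    fx, fy = x - x0, y - y0
    return ((1 - fy) * (1 - fx) * img[y0, x0]
            + (1 - fy) * fx * img[y0, x0 + 1]
            + fy * (1 - fx) * img[y0 + 1, x0]
            + fy * fx * img[y0 + 1, x0 + 1])

def estimate_translation(I0, I1, iters=50):
    """Estimate (tx, ty) such that I1(x + tx, y + ty) ~ I0(x, y),
    by Gauss-Newton minimization of the photometric error."""
    H, W = I0.shape
    tx = ty = 0.0
    ys, xs = np.mgrid[1:H - 1, 1:W - 1].astype(float)
    target = I0[1:H - 1, 1:W - 1]
    for _ in range(iters):
        # Warp I1 by the current translation estimate.
        Iw = bilinear(I1, xs + tx, ys + ty)
        # Central-difference image gradients at the warped coordinates.
        gx = (bilinear(I1, xs + tx + 1, ys + ty)
              - bilinear(I1, xs + tx - 1, ys + ty)) / 2.0
        gy = (bilinear(I1, xs + tx, ys + ty + 1)
              - bilinear(I1, xs + tx, ys + ty - 1)) / 2.0
        # Photometric residual and its Jacobian w.r.t. (tx, ty).
        r = (target - Iw).ravel()
        J = np.stack([gx.ravel(), gy.ravel()], axis=1)
        d, *_ = np.linalg.lstsq(J, r, rcond=None)
        tx += d[0]
        ty += d[1]
        if np.hypot(d[0], d[1]) < 1e-6:
            break
    return tx, ty

def blob(H, W, cx, cy, sigma=8.0):
    """Synthetic smooth test image: a Gaussian blob at (cx, cy)."""
    ys, xs = np.mgrid[0:H, 0:W]
    return np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * sigma ** 2))

# Synthetic demo: I1 is I0 shifted by (1.5, 0.7) pixels.
I0 = blob(64, 64, 32.0, 32.0)
I1 = blob(64, 64, 33.5, 32.7)
tx, ty = estimate_translation(I0, I1)
```

Extending this direct formulation to full 3D camera pose requires per-pixel depth (as the Kinect provides) so that translation-induced parallax can be modeled in the warp; that extension is the paper's contribution.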
Pages: 8