Robust Object Pose Tracking for Augmented Reality Guidance and Teleoperation

Cited by: 3
Authors: Black, David [1]; Salcudean, Septimiu [1]
Affiliations: [1] Univ British Columbia, Dept Elect & Comp Engn, Vancouver, BC V6T 1Z4, Canada
Keywords: Kalman filters; Target tracking; Sensors; Optical filters; Adaptive optics; Ultrasonic imaging; Optical sensors; Augmented reality; computer vision; human-computer interaction; Kalman filter; teleoperation; tracking
DOI: 10.1109/TIM.2024.3398108
Chinese Library Classification (CLC): TM (Electrical Engineering); TN (Electronics and Communication Technology)
Subject Classification Codes: 0808; 0809
Abstract:
For many augmented reality guidance, teleoperation, or human-robot interaction systems, accurate, fast, and robust six-degree-of-freedom (6-DOF) object pose tracking is essential. However, current solutions easily lose tracking when line-of-sight to markers is lost. In this article, we present a tracking system that matches or improves on current methods in speed and accuracy, achieving 1.77 mm and 1.51 degrees accuracy at 22 Hz, and is robust to occlusions. Reflective markers are segmented in infrared (IR) images and used for pose computation using novel voting-based point correspondence algorithms and intelligent cropping. In addition, we introduce a new square-root unscented Kalman filter (UKF), which improves accuracy and flexibility over previous approaches by tracking the markers themselves rather than the computed pose and enabling fusion of an external inertial measurement unit (IMU). This reduces noise and makes the tracking robust to brief loss of line-of-sight. The algorithms and methods are described in detail with pseudocode, tested, and analyzed. The system is implemented in simulation and on a Microsoft HoloLens 2 using Unity for ease of integration into graphical projects. The code is made available open source. Through the improvements in speed and robustness over previous methods, this solution has the potential to enable fast and reliable pose tracking for many mixed reality (MR) and teleoperation applications.
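The abstract's central idea, filtering the marker positions themselves rather than the solved 6-DOF pose, can be illustrated with a minimal sketch. The code below is an illustrative assumption, not the authors' implementation: it uses a plain (not square-root) unscented Kalman filter with a constant-velocity model for a single marker, and it omits the IMU fusion, intelligent cropping, and voting-based correspondence steps described in the paper. The class name MarkerUKF, the noise values, and the 22 Hz time step are all invented for the example.

```python
# Minimal sketch (an assumption, not the authors' code): a plain unscented
# Kalman filter with a constant-velocity model that tracks one reflective
# marker's 3-D position directly, illustrating the "filter the markers, not
# the computed pose" idea from the abstract. The paper's square-root UKF,
# IMU fusion, and multi-marker handling are omitted; all names, noise values,
# and the 22 Hz time step below are illustrative.
import numpy as np


class MarkerUKF:
    def __init__(self, p0, dt=1.0 / 22.0, q=1e-4, r=1e-5):
        self.n = 6                                 # state: [x, y, z, vx, vy, vz]
        self.x = np.hstack([p0, np.zeros(3)])      # start at first detection, zero velocity
        self.P = np.eye(self.n) * 1e-2             # state covariance
        self.Q = np.eye(self.n) * q                # process noise
        self.R = np.eye(3) * r                     # measurement noise of the IR blob centroid
        self.dt = dt
        # Scaled sigma-point weights (alpha=1, beta=2, kappa=0 keeps them non-negative)
        self.lmbda = 0.0
        m = 2 * self.n + 1
        self.Wm = np.full(m, 1.0 / (2.0 * self.n))
        self.Wc = self.Wm.copy()
        self.Wm[0] = 0.0
        self.Wc[0] = 2.0

    def _sigma_points(self):
        # Columns of the Cholesky factor of (n + lambda) * P spread the points.
        L = np.linalg.cholesky((self.n + self.lmbda) * self.P)
        pts = [self.x]
        for i in range(self.n):
            pts.append(self.x + L[:, i])
            pts.append(self.x - L[:, i])
        return np.array(pts)                       # shape (2n + 1, n)

    def predict(self):
        # Propagate every sigma point through the constant-velocity model.
        F = np.eye(self.n)
        F[:3, 3:] = np.eye(3) * self.dt
        X = self._sigma_points() @ F.T
        self.x = self.Wm @ X
        d = X - self.x
        self.P = d.T @ (self.Wc[:, None] * d) + self.Q

    def update(self, z):
        # Measurement model: the segmented marker gives its position directly.
        X = self._sigma_points()
        Z = X[:, :3]
        z_hat = self.Wm @ Z
        dz = Z - z_hat
        dx = X - self.x
        S = dz.T @ (self.Wc[:, None] * dz) + self.R    # innovation covariance
        C = dx.T @ (self.Wc[:, None] * dz)             # state-measurement cross covariance
        K = C @ np.linalg.inv(S)                       # Kalman gain
        self.x = self.x + K @ (z - z_hat)
        self.P = self.P - K @ S @ K.T


if __name__ == "__main__":
    ukf = MarkerUKF(p0=np.array([0.10, 0.00, 0.50]))
    for k in range(1, 11):
        ukf.predict()                                  # during occlusion, predict only
        z = np.array([0.10 + 0.002 * k, 0.00, 0.50])   # simulated marker detection (metres)
        ukf.update(z)
    print("filtered marker position:", ukf.x[:3])
```

One appeal of such a marker-level formulation, as the abstract suggests, is that a briefly occluded marker can coast on the motion model through predict-only steps while any still-visible markers keep correcting the estimate, so a short loss of line-of-sight does not immediately break the pose.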
Pages: 1-15 (15 pages)
Related Articles (50 records)
  • [1] Robust augmented reality tracking based visual pose estimation
    Maidi, Madjid
    Ababsa, Fakhr-Eddine
    Mallem, Malik
    ICINCO 2006: PROCEEDINGS OF THE THIRD INTERNATIONAL CONFERENCE ON INFORMATICS IN CONTROL, AUTOMATION AND ROBOTICS: ROBOTICS AND AUTOMATION, 2006, : 346 - 351
  • [2] Markerless pose tracking for augmented reality
    Yuan, Chunrong
    ADVANCES IN VISUAL COMPUTING, PT 1, 2006, 4291 : 721 - 730
  • [3] Robust camera pose tracking for augmented reality using particle filtering framework
    Ababsa, Fakhreddine
    Mallem, Malik
    MACHINE VISION AND APPLICATIONS, 2011, 22 (01) : 181 - 195
  • [4] Robust and real-time pose tracking for augmented reality on mobile devices
    Yang, Xin
    Guo, Jiabin
    Xue, Tangli
    Cheng, Kwang-Ting
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (06) : 6607 - 6628
  • [5] Robust Tracking for Augmented Reality
    Gonzalez-Linares, Jose M.
    Guil, Nicolas
    Ramos Cozar, Julian
    ADVANCES IN COMPUTATIONAL INTELLIGENCE, PT I (IWANN 2015), 2015, 9094 : 301 - 308
  • [6] A fast and robust simultaneous pose tracking and structure recovery algorithm for augmented reality applications
    Yu, YK
    Wong, KH
    Chang, MMY
    ICIP: 2004 INTERNATIONAL CONFERENCE ON IMAGE PROCESSING, VOLS 1- 5, 2004, : 1029 - 1032
  • [7] IEKF based object pose estimation for Augmented Reality
    Song, Jiaru
    Hu, Shiqiang
    Yang, Yongsheng
    PROCEEDINGS OF THE 4TH INTERNATIONAL CONFERENCE ON VIRTUAL REALITY (ICVR 2018), 2018, : 15 - 20
  • [8] Teleoperation and augmented reality
    Ernadotte, D
    Fuchs, P
    Laurgeau, C
    8TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS, 1997 PROCEEDINGS - ICAR'97, 1997, : 771 - 777