Multi-sensor based real-time 6-DoF pose tracking for wearable augmented reality

Cited by: 17
Authors
Fang, Wei [1 ]
Zheng, Lianyu [1 ]
Wu, Xiangyong [2 ]
Affiliations
[1] Beihang Univ, Sch Mech Engn & Automat, Xueyuan Rd 37, Beijing 100191, Peoples R China
[2] Tianjin Inst Surveying & Mapping, Changling Rd, Tianjin 300381, Peoples R China
Keywords
Wearable augmented reality; Sensor-fusion; Markerless; Pose tracking; Scale estimation; ODOMETRY; VISION; SLAM; ORB;
DOI
10.1016/j.compind.2017.06.002
CLC classification
TP39 [Applications of computers];
Discipline codes
081203 ; 0835 ;
Abstract
Wearable augmented reality (WAR) combines a live view of a real scene with computer-generated graphics on resource-limited platforms. One of the crucial technologies for WAR is real-time 6-DoF pose tracking, which facilitates registration of virtual components within a real scene. Artificial markers are typically applied to provide pose tracking for WAR applications; however, these marker-based methods suffer from marker occlusions or large viewpoint changes. Thus, a multi-sensor based tracking approach is applied in this paper, which performs real-time 6-DoF pose tracking with real-time scale estimation for WAR on a consumer smartphone. By combining a wide-angle monocular camera and an inertial sensor, more robust 6-DoF motion tracking is demonstrated through the mutual compensation of the heterogeneous sensors. Moreover, with the help of a depth sensor, the scale initialization of the monocular tracking is addressed, and the initial scale is propagated through the subsequent sensor-fusion process, alleviating the scale drift of traditional monocular tracking approaches. In addition, a sliding-window based Kalman filter framework is used to provide low-jitter pose tracking for WAR. Finally, experiments are carried out to demonstrate the feasibility and robustness of the proposed tracking method for WAR applications. (C) 2017 Elsevier B.V. All rights reserved.
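The camera/IMU fusion pattern the abstract describes can be illustrated with a minimal 1-D Kalman filter toy (an illustrative assumption, not the paper's sliding-window implementation): the inertial sensor drives a high-rate predict step between camera frames, and each camera pose measurement drives a correct step that reins in drift.

```python
# Toy sketch of camera/IMU fusion: a 1-D position/velocity Kalman filter.
# NOT the authors' sliding-window framework; all names and values here are
# illustrative. The IMU supplies high-rate predictions, the camera supplies
# low-rate corrections.

def kf_predict(x, v, p, dt, accel, q):
    """Propagate position x and velocity v with one IMU acceleration sample."""
    x = x + v * dt + 0.5 * accel * dt * dt
    v = v + accel * dt
    p = p + q              # uncertainty grows by process noise q
    return x, v, p

def kf_update(x, v, p, z, r):
    """Correct the predicted position with a camera measurement z (noise r)."""
    k = p / (p + r)        # Kalman gain
    x = x + k * (z - x)    # pull the estimate toward the measurement
    p = (1.0 - k) * p      # uncertainty shrinks after the correction
    return x, v, p

# Several IMU predictions between camera frames, then one camera correction:
x, v, p = 0.0, 1.0, 1.0
for _ in range(5):                                   # 5 IMU samples at 100 Hz
    x, v, p = kf_predict(x, v, p, 0.01, 0.0, 0.001)
x, v, p = kf_update(x, v, p, z=0.06, r=0.5)
```

The same predict/correct split generalizes to the full 6-DoF case, where the state holds pose and velocity and the depth-initialized scale enters the state vector so it can be refined by later corrections.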
Pages: 91-103
Page count: 13
Related papers
50 records in total
  • [1] Multi-modal Force/Vision Sensor Fusion in 6-DOF Pose Tracking
    Alkkiomaki, Olli
    Kyrki, Ville
    Liu, Yong
    Handroos, Heikki
    Kalviainen, Heikki
    ICAR: 2009 14TH INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS, VOLS 1 AND 2, 2009, : 476 - +
  • [2] Robust and real-time pose tracking for augmented reality on mobile devices
    Yang, Xin
    Guo, Jiabin
    Xue, Tangli
    Cheng, Kwang-Ting
    MULTIMEDIA TOOLS AND APPLICATIONS, 2018, 77 (06) : 6607 - 6628
  • [3] Deep Active Contours for Real-time 6-DoF Object Tracking
    Wang, Long
    Yan, Shen
    Zhen, Jianan
    Liu, Yu
    Zhang, Maojun
    Zhang, Guofeng
    Zhou, Xiaowei
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 13988 - 13998
  • [4] Multi-sensor fusion for real-time object tracking
    Verma, Sakshi
    Singh, Vishal K.
    MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (07) : 19563 - 19585
  • [5] Real-Time Vehicles Tracking Based on Mobile Multi-Sensor Fusion
    Plangi, Siim
    Hadachi, Amnir
    Lind, Artjom
    Bensrhair, Abdelaziz
    IEEE SENSORS JOURNAL, 2018, 18 (24) : 10077 - 10084
  • [6] Multi-Sensor Fusion Tracking Algorithm Based on Augmented Reality System
    Wang, Yujie
    IEEE SENSORS JOURNAL, 2021, 21 (22) : 25010 - 25017
  • [7] 6-DOF motion recording method based on multi-sensor information fusion
    Ni, Tao
    Ma, Zhaojian
    Zhang, Hongyan
    Xu, Peng
    Li, Xiaopeng
    Nongye Jixie Xuebao/Transactions of the Chinese Society for Agricultural Machinery, 2013, 44 (06): : 258 - 262
  • [8] Real-time Camera Pose Estimation Based on Planar Object Tracking for Augmented Reality Environment
    Lee, Ahr-Hyun
    Lee, Seok-Han
    Lee, Jae-Young
    Choi, Jong-Soo
    2012 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS (ICCE), 2012, : 516 - 517