Survey and evaluation of monocular visual-inertial SLAM algorithms for augmented reality

Cited by: 4
Authors
Jinyu L. [1 ]
Bangbang Y. [1 ]
Danpeng C. [1 ]
Nan W. [2 ]
Guofeng Z. [1 ]
Hujun B. [1 ]
Affiliations
[1] State Key Lab of CAD&CG, Zhejiang University, Hangzhou
[2] SenseTime Research, Hangzhou
Source
Virtual Reality & Intelligent Hardware
Funding
National Natural Science Foundation of China;
Keywords
Augmented reality; Localization; Mapping; Odometry; Tracking; Visual-inertial SLAM;
DOI
10.1016/j.vrih.2019.07.002
Abstract
Although VSLAM/VISLAM has achieved great success, it is still difficult to quantitatively evaluate the localization results of different kinds of SLAM systems from the perspective of augmented reality, due to the lack of an appropriate benchmark. In practical AR applications, a variety of challenging situations (e.g., fast motion, strong rotation, serious motion blur, dynamic interference) may easily be encountered, since a home user may not move the AR device carefully and the real environment may be quite complex. In addition, for a good AR experience, the frequency of camera tracking loss should be minimized, and recovery from a failure status should be fast and accurate. Existing SLAM datasets/benchmarks generally only provide an evaluation of pose accuracy, and their camera motions are rather simple and do not fit the common cases in mobile AR applications well. With the above motivation, we build a new visual-inertial dataset, as well as a series of evaluation criteria, for AR. We also review the existing monocular VSLAM/VISLAM approaches with detailed analyses and comparisons. In particular, we select 8 representative monocular VSLAM/VISLAM approaches/systems and quantitatively evaluate them on our benchmark. Our dataset, sample code, and corresponding evaluation tools are available at the benchmark website http://www.zjucvg.net/eval-vislam/. © 2019 Beijing Zhongke Journal Publishing Co. Ltd
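For illustration only, the sketch below shows how pose accuracy is commonly measured in this kind of benchmark: the estimated trajectory is rigidly aligned to the ground truth and the RMSE of the positional error is reported. This is a generic absolute-pose-error computation, not the paper's own evaluation tools; the function names and toy data are hypothetical, and the paper's exact criteria are defined on the benchmark website.

```python
# Minimal sketch of an absolute-pose-error (APE) style accuracy metric.
# Assumes trajectories are given as (N, 3) arrays of positions sampled at
# matching timestamps; rotation error and the paper's other criteria
# (robustness, relocalization time, etc.) are not covered here.
import numpy as np

def align_rigid(gt: np.ndarray, est: np.ndarray):
    """Rigid (rotation + translation) alignment of est onto gt, both (N, 3)."""
    mu_gt, mu_est = gt.mean(axis=0), est.mean(axis=0)
    cov = (gt - mu_gt).T @ (est - mu_est) / gt.shape[0]
    U, _, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0  # keep a proper rotation (det = +1)
    R = U @ S @ Vt
    t = mu_gt - R @ mu_est
    return R, t

def absolute_position_rmse(gt: np.ndarray, est: np.ndarray) -> float:
    """RMSE of positional error (meters) after rigid alignment."""
    R, t = align_rigid(gt, est)
    err = gt - (est @ R.T + t)
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))

if __name__ == "__main__":
    # Toy trajectories: ground truth vs. a noisy, shifted estimate.
    rng = np.random.default_rng(0)
    gt = np.cumsum(rng.normal(size=(200, 3)) * 0.05, axis=0)
    est = gt + rng.normal(scale=0.02, size=gt.shape) + np.array([1.0, 0.5, 0.0])
    print(f"APE RMSE: {absolute_position_rmse(gt, est):.3f} m")
```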
Pages: 386-410
Page count: 24
Related papers
50 records in total (10 shown)
  • [1] Haddadi, Seyed Jamal; Castelan, Eugenio B. Evaluation of Monocular Visual-Inertial SLAM: Benchmark and Experiment. 2019 7TH INTERNATIONAL CONFERENCE ON ROBOTICS AND MECHATRONICS (ICROM 2019), 2019: 599-606.
  • [2] Yue, Xiaofeng; Zhang, Wenjuan; Xu, Li; Liu, JiangGuo. A novel visual-inertial Monocular SLAM. MIPPR 2017: AUTOMATIC TARGET RECOGNITION AND NAVIGATION, 2018, 10608.
  • [3] Li, Peiliang; Qin, Tong; Hu, Botao; Zhu, Fengyuan; Shen, Shaojie. Monocular Visual-Inertial State Estimation for Mobile Augmented Reality. PROCEEDINGS OF THE 2017 IEEE INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY (ISMAR), 2017: 11-21.
  • [4] Pan, Xiaokun; Huang, Gan; Zhang, Ziyang; Li, Jinyu; Bao, Hujun; Zhang, Guofeng. Robust Collaborative Visual-Inertial SLAM for Mobile Augmented Reality. IEEE TRANSACTIONS ON VISUALIZATION AND COMPUTER GRAPHICS, 2024, 30(11): 7354-7363.
  • [5] Piao, Jin-Chun; Kim, Shin-Dug. Adaptive Monocular Visual-Inertial SLAM for Real-Time Augmented Reality Applications in Mobile Devices. SENSORS, 2017, 17(11).
  • [6] Guillemard, Richard; Helenon, Francois; Petit, Bruno; Gay-Bellile, Vincent; Carrier, Mathieu. Stationary Detector for Monocular Visual-Inertial SLAM. 2019 INTERNATIONAL CONFERENCE ON INDOOR POSITIONING AND INDOOR NAVIGATION (IPIN), 2019.
  • [7] Mur-Artal, Raul; Tardos, Juan D. Visual-Inertial Monocular SLAM With Map Reuse. IEEE ROBOTICS AND AUTOMATION LETTERS, 2017, 2(2): 796-803.
  • [8] Williem; Ivan, Andre; Seok, Hochang; Lim, Jongwoo; Yoon, Kuk-Jin; Cho, Ikhwan; Park, In Kyu. Visual-Inertial RGB-D SLAM for Mobile Augmented Reality. ADVANCES IN MULTIMEDIA INFORMATION PROCESSING - PCM 2017, PT II, 2018, 10736: 928-938.
  • [9] Li, Ruoying; Lou, Yidong; Song, Weiwei; Wang, Yusheng; Tu, Zhiyong. Experimental Evaluation of Monocular Visual-Inertial SLAM Methods for Freight Railways. IEEE SENSORS JOURNAL, 2023, 23(19): 23282-23293.
  • [10] Cheng, Jun; Zhang, Liyan; Chen, Qihong. An Improved Initialization Method for Monocular Visual-Inertial SLAM. ELECTRONICS, 2021, 10(24).