Survey and evaluation of monocular visual-inertial SLAM algorithms for augmented reality

Cited by: 4
Authors
Li, Jinyu [1]
Yang, Bangbang [1]
Chen, Danpeng [1]
Wang, Nan [2]
Zhang, Guofeng [1]
Bao, Hujun [1]
Affiliations
[1] State Key Lab of CAD&CG, Zhejiang University, Hangzhou
[2] SenseTime Research, Hangzhou
Source
Virtual Reality & Intelligent Hardware
Funding
National Natural Science Foundation of China
Keywords
Augmented reality; Localization; Mapping; Odometry; Tracking; Visual-inertial SLAM;
DOI
10.1016/j.vrih.2019.07.002
Abstract
Although VSLAM/VISLAM has achieved great success, it remains difficult to quantitatively evaluate the localization results of different kinds of SLAM systems from the perspective of augmented reality, owing to the lack of an appropriate benchmark. In practical AR applications, a variety of challenging situations (e.g., fast motion, strong rotation, severe motion blur, and dynamic interference) are easily encountered, since a home user may not move the AR device carefully and the real environment may be quite complex. In addition, for a good AR experience, the frequency of tracking loss should be minimized, and recovery from a failure state should be fast and accurate. Existing SLAM datasets/benchmarks generally evaluate only pose accuracy, and their camera motions are relatively simple and do not fit the common cases in mobile AR applications well. With the above motivation, we build a new visual-inertial dataset, together with a series of evaluation criteria for AR. We also review existing monocular VSLAM/VISLAM approaches with detailed analyses and comparisons. In particular, we select eight representative monocular VSLAM/VISLAM approaches/systems and quantitatively evaluate them on our benchmark. Our dataset, sample code, and the corresponding evaluation tools are available at the benchmark website http://www.zjucvg.net/eval-vislam/. © 2019 Beijing Zhongke Journal Publishing Co. Ltd
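The evaluation criteria described in the abstract center on pose accuracy plus AR-oriented robustness measures (tracking loss and recovery). As a minimal sketch of the pose-accuracy side, the Python snippet below computes the absolute trajectory error (ATE) of an estimated trajectory against ground truth after a least-squares Umeyama alignment. This is an illustration under stated assumptions only: the function names and the NumPy implementation are not taken from the eval-vislam toolkit, whose exact metrics are defined in the paper.

```python
# Illustrative sketch of an absolute-trajectory-error (ATE) metric of the kind
# such benchmarks report. Function names and implementation are assumptions,
# NOT code from the eval-vislam toolkit.
import numpy as np

def umeyama_alignment(src, dst, with_scale=True):
    """Least-squares Sim(3)/SE(3) transform mapping src points onto dst.

    src, dst: (N, 3) arrays of corresponding camera positions.
    Returns (s, R, t): scalar scale, 3x3 rotation, 3-vector translation.
    """
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst
    cov = dst_c.T @ src_c / len(src)                # cross-covariance matrix
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:    # correct for reflections
        S[2, 2] = -1.0
    R = U @ S @ Vt
    s = np.trace(np.diag(D) @ S) / src_c.var(axis=0).sum() if with_scale else 1.0
    t = mu_dst - s * R @ mu_src
    return s, R, t

def ate_rmse(estimated, ground_truth, with_scale=True):
    """RMS absolute trajectory error after trajectory alignment."""
    s, R, t = umeyama_alignment(estimated, ground_truth, with_scale)
    aligned = (s * (R @ estimated.T)).T + t
    return float(np.sqrt(np.mean(np.sum((aligned - ground_truth) ** 2, axis=1))))
```

Note the role of the scale: for visual-inertial systems the IMU makes metric scale observable, so a rigid SE(3) alignment (with_scale=False) is the natural choice, whereas vision-only monocular SLAM has unobservable scale and is conventionally aligned with the full similarity transform.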
Pages: 386 - 410
Number of pages: 24
Related papers
50 records in total
  • [21] Visual-Inertial Curve SLAM
    Meier, Kevin
    Chung, Soon-Jo
    Hutchinson, Seth
    2016 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2016), 2016, : 1238 - 1245
  • [22] Semi-direct monocular visual and visual-inertial SLAM with loop closure detection
    Li, Shao-Peng
    Zhang, Tao
    Gao, Xiang
    Wang, Duo
    Xian, Yong
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2019, 112 : 201 - 210
  • [23] Monocular Visual SLAM for Markerless Tracking Algorithm to Augmented Reality
    Yang, Tingting
    Jia, Shuwen
    Yu, Ying
    Sui, Zhiyong
    INTELLIGENT AUTOMATION AND SOFT COMPUTING, 2023, 35 (02): 1691 - 1704
  • [24] Monocular Visual-Inertial SLAM With IMU-Aided Hybrid Line Matching
    Zha, Gongpu
    Guan, Peiyu
    Cao, Zhiqiang
    Sun, Ting
    Yu, Shijie
    IEEE SENSORS LETTERS, 2024, 8 (09)
  • [25] A Benchmark Comparison of Monocular Visual-Inertial Odometry Algorithms for Flying Robots
    Delmerico, Jeffrey
    Scaramuzza, Davide
    2018 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2018, : 2502 - 2509
  • [26] A mapping of visual SLAM algorithms and their applications in augmented reality
    Mucheroni Covolan, Joao Pedro
    Sementille, Antonio Carlos
    Rodrigues Sanches, Silvio Ricardo
    2020 22ND SYMPOSIUM ON VIRTUAL AND AUGMENTED REALITY (SVR 2020), 2020, : 20 - 29
  • [27] Monocular Visual-Inertial Depth Estimation
    Wofk, Diana
    Ranftl, Rene
    Muller, Matthias
    Koltun, Vladlen
    2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA, 2023, : 6095 - 6101
  • [28] CVI-SLAM-Collaborative Visual-Inertial SLAM
    Karrer, Marco
    Schmuck, Patrik
    Chli, Margarita
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2018, 3 (04): 2762 - 2769
  • [29] An optical flow-line feature based monocular visual-inertial SLAM algorithm
    Xia L.
    Shen R.
    Chi D.
    Cui J.
    Meng Y.
    Zhongguo Guanxing Jishu Xuebao/Journal of Chinese Inertial Technology, 2020, 28 (05): 568 - 575
  • [30] Environment Driven Underwater Camera-IMU Calibration for Monocular Visual-Inertial SLAM
    Gu, Changjun
    Cong, Yang
    Sun, Gan
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 2405 - 2411