Dense Visual-Inertial Navigation System for Mobile Robots

Cited by: 0
Authors:
Omari, Sammy [1 ,2 ]
Bloesch, Michael [1 ]
Gohl, Pascal [1 ,2 ]
Siegwart, Roland [1 ]
Affiliations:
[1] Swiss Fed Inst Technol, Autonomous Syst Lab, Zurich, Switzerland
[2] Skybotix AG, Zurich, Switzerland
Keywords: (none listed)
DOI: not available
Chinese Library Classification: TP [automation technology, computer technology]
Discipline code: 0812
Abstract
Real-time dense mapping and pose estimation are essential for a wide range of navigation tasks in mobile robotic applications. We propose an odometry and mapping system that leverages the full photometric information from a stereo-vision system as well as inertial measurements in a probabilistic framework, while running in real time on a single low-power Intel CPU core. Instead of performing mapping and localization on a set of sparse image features, we use the complete dense image intensity information in our navigation system. By incorporating a probabilistic model of the stereo sensor and the IMU, we can robustly estimate the ego-motion as well as a dense 3D model of the environment in real time. The probabilistic formulation of the joint odometry estimation and mapping process enables efficient rejection of temporal outliers in ego-motion estimation as well as spatial outliers in the mapping process. To underline the versatility of the proposed navigation system, we evaluate it in a set of experiments on a multi-rotor system as well as on a quadrupedal walking robot. We tightly integrate our framework into the stabilization loop of the UAV and into the mapping framework of the walking robot. The dense framework is shown to exhibit good tracking and mapping performance, in terms of both accuracy and robustness, in scenarios with highly dynamic motion patterns, while retaining a relatively small computational footprint. This makes it an ideal candidate for control and navigation tasks in unstructured, GPS-denied environments on a wide range of robotic platforms with power and weight constraints. The proposed framework is released as an open-source ROS package.
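The core of a dense (direct) approach like the one the abstract describes is a photometric residual: every reference pixel is back-projected with its depth, warped into the current frame by the candidate pose, and compared in intensity, with a robust weight suppressing outliers. The sketch below illustrates this idea only; the function name, the nearest-neighbour lookup, and the Huber weighting are illustrative assumptions, not code from the released ROS package.

```python
import numpy as np

def photometric_residual(I_ref, D_ref, I_cur, K, R, t, huber_delta=0.1):
    """Dense photometric residual between a reference and current frame.

    I_ref, I_cur : (h, w) intensity images
    D_ref        : (h, w) dense depth map of the reference frame
    K            : (3, 3) pinhole intrinsics
    R, t         : candidate relative pose (current <- reference)
    Returns Huber-weighted residuals and a validity mask.
    """
    h, w = I_ref.shape
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Back-project every reference pixel to 3D using the dense depth map.
    z = D_ref
    X = np.stack([(u - cx) * z / fx, (v - cy) * z / fy, z], axis=-1)
    # Transform into the current camera frame and project to pixels.
    Xc = X @ R.T + t
    u2 = fx * Xc[..., 0] / Xc[..., 2] + cx
    v2 = fy * Xc[..., 1] / Xc[..., 2] + cy
    # Nearest-neighbour intensity lookup; mask pixels leaving the image.
    ui, vi = np.round(u2).astype(int), np.round(v2).astype(int)
    valid = (ui >= 0) & (ui < w) & (vi >= 0) & (vi < h) & (Xc[..., 2] > 0)
    r = np.zeros_like(I_ref)
    r[valid] = I_ref[valid] - I_cur[vi[valid], ui[valid]]
    # Huber-style weights down-weight large residuals (spatial outliers,
    # occlusions, specularities), echoing the paper's outlier rejection.
    wgt = np.where(np.abs(r) <= huber_delta, 1.0,
                   huber_delta / np.maximum(np.abs(r), 1e-9))
    return r * wgt, valid
```

In a full system this residual would be minimized over the pose jointly with an IMU term (the probabilistic fusion described above); here it serves only to show how dense intensity information, rather than sparse features, enters the estimation.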
Pages: 2634-2640 (7 pages)