Towards Robust Visual-Inertial Odometry with Multiple Non-Overlapping Monocular Cameras

Cited by: 3
Authors
He, Yao [1 ,2 ,3 ]
Yu, Huai [3 ,4 ]
Yang, Wen [3 ]
Scherer, Sebastian [4 ]
Affiliations
[1] The Chinese University of Hong Kong, School of Science and Engineering, Shenzhen, China
[2] Shenzhen Institute of Artificial Intelligence and Robotics for Society, Shenzhen, China
[3] Wuhan University, Electronic Information School, Wuhan 430072, China
[4] Carnegie Mellon University, AirLab, Pittsburgh, PA 15213, USA
Funding
Andrew Mellon Foundation, USA;
Keywords
VERSATILE; SLAM;
DOI
10.1109/IROS47612.2022.9981664
Chinese Library Classification (CLC)
TP [Automation Technology, Computer Technology];
Subject classification code
0812;
Abstract
We present a Visual-Inertial Odometry (VIO) algorithm with multiple non-overlapping monocular cameras, aiming to improve the robustness of the VIO algorithm. An initialization scheme and a tightly-coupled bundle adjustment for multiple non-overlapping monocular cameras are proposed. With more stable features captured by multiple cameras, the VIO can maintain stable state estimation, especially when one of the cameras tracks unstable or limited features. We also address the high CPU usage caused by multiple cameras by proposing a GPU-accelerated frontend. Finally, we use our pedestrian-carried system to evaluate the robustness of the VIO algorithm in several challenging environments. The results show that the multi-camera setup yields significantly higher estimation robustness than a monocular system without increasing the overall CPU usage (CPU usage and computational latency per camera are reduced by 40.4% and 50.6%, respectively). A demo video can be found at https://youtu.be/r7QvPth1m10.
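To make the tightly-coupled formulation described in the abstract concrete, the sketch below (Python/NumPy, not taken from the paper) shows how reprojection residuals from several non-overlapping pinhole cameras can be stacked into a single cost anchored to one shared body pose. The helper names (project_pinhole, multi_cam_reprojection_residuals), the pinhole intrinsics, and the fixed body-to-camera extrinsics are illustrative assumptions, not the authors' implementation. Because every camera's observations constrain the same body state, features from any one camera can keep the estimate well conditioned when another camera tracks few or unstable features.

    import numpy as np

    def project_pinhole(p_cam, fx, fy, cx, cy):
        # Pinhole projection of a 3D point given in camera coordinates.
        x, y, z = p_cam
        return np.array([fx * x / z + cx, fy * y / z + cy])

    def multi_cam_reprojection_residuals(R_wb, p_wb, landmarks_w, observations, cams):
        # R_wb, p_wb : body pose in the world frame (rotation matrix, translation).
        # landmarks_w: dict landmark_id -> 3D point in the world frame.
        # observations: list of (cam_id, landmark_id, measured_uv) tuples.
        # cams: dict cam_id -> {'R_bc', 'p_bc', 'fx', 'fy', 'cx', 'cy'}, where
        #       (R_bc, p_bc) is the camera pose expressed in the body frame.
        residuals = []
        for cam_id, lm_id, uv_meas in observations:
            c = cams[cam_id]
            # World -> body -> camera; all cameras share the single body pose,
            # which is what couples the non-overlapping views in one cost.
            p_b = R_wb.T @ (landmarks_w[lm_id] - p_wb)
            p_c = c['R_bc'].T @ (p_b - c['p_bc'])
            if p_c[2] <= 1e-6:  # point behind (or at) the camera: skip it
                continue
            uv_pred = project_pinhole(p_c, c['fx'], c['fy'], c['cx'], c['cy'])
            residuals.append(uv_pred - np.asarray(uv_meas))
        return np.concatenate(residuals) if residuals else np.zeros(0)

In a full system this residual vector would be combined with IMU preintegration factors and minimized over a sliding window; that machinery is omitted here for brevity.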
Pages
9452 - 9458 (7 pages)