VINS-MKF: A Tightly-Coupled Multi-Keyframe Visual-Inertial Odometry for Accurate and Robust State Estimation

Cited by: 5
|
Authors
Zhang, Chaofan [1 ,2 ]
Liu, Yong [1 ]
Wang, Fan [1 ,2 ]
Xia, Yingwei [1 ]
Zhang, Wen [1 ]
Affiliations
[1] Chinese Acad Sci, Inst Appl Technol, Hefei Inst Phys Sci, Hefei 230031, Anhui, Peoples R China
[2] Univ Sci & Technol China, Grad Sch, Sci Isl Branch, Hefei 230026, Anhui, Peoples R China
Keywords
state estimation; visual odometry; visual inertial fusion; multiple fisheye cameras; tightly coupled; MOTION; SLAM; NAVIGATION; VERSATILE;
DOI
10.3390/s18114036
Chinese Library Classification
O65 [Analytical Chemistry]
Subject Classification Codes
070302; 081704
Abstract
State estimation is crucial for robot autonomy, and visual odometry (VO) has received significant attention in the robotics field because it can provide accurate state estimation. However, the accuracy and robustness of most existing VO methods degrade in complex conditions because of the limited field of view (FOV) of the camera used. In this paper, we present a novel tightly-coupled multi-keyframe visual-inertial odometry, called VINS-MKF, which provides accurate and robust state estimation for robots in indoor environments. We first extend monocular ORB-SLAM (Oriented FAST and Rotated BRIEF Simultaneous Localization and Mapping) to multiple fisheye cameras together with an inertial measurement unit (IMU), so as to provide large-FOV visual-inertial information. Then, a novel VO framework is proposed to ensure efficient state estimation by adopting a GPU (Graphics Processing Unit) based feature extraction method and by separating feature extraction from the tracking thread so that it runs in parallel with the mapping thread. Finally, a nonlinear optimization method is formulated for accurate state estimation; it is multi-keyframe, tightly-coupled and visual-inertial. In addition, an accurate initialization procedure and a novel MultiCol-IMU camera model are incorporated to further improve the performance of VINS-MKF. To the best of our knowledge, this is the first tightly-coupled multi-keyframe visual-inertial odometry that fuses measurements from multiple fisheye cameras and an IMU. The performance of VINS-MKF was validated by extensive experiments on self-collected datasets, and it showed improved accuracy and robustness over the state-of-the-art VINS-Mono.
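The record itself does not reproduce the paper's optimization objective. As a minimal sketch, assuming a VINS-Mono-style sliding-window bundle adjustment extended with a camera index c for the multiple fisheye cameras (the notation below is illustrative, not the authors' exact formulation), the multi-keyframe tightly-coupled estimate of the window states \mathcal{X} would minimize

\min_{\mathcal{X}} \; \left\| r_p - H_p \mathcal{X} \right\|^2 \;+\; \sum_{k} \left\| r_{\mathcal{B}}\!\left(\hat{z}_{b_k b_{k+1}}, \mathcal{X}\right) \right\|^2_{P_{b_k b_{k+1}}} \;+\; \sum_{c} \sum_{(l,j)} \rho\!\left( \left\| r_{\mathcal{C}}\!\left(\hat{z}^{c_j}_{l}, \mathcal{X}\right) \right\|^2_{P^{c_j}_{l}} \right)

where r_p is a marginalization prior, r_{\mathcal{B}} the preintegrated IMU residual between consecutive keyframes k and k+1, r_{\mathcal{C}} the reprojection residual of landmark l observed by fisheye camera c in keyframe j (projected through a multi-camera model such as the MultiCol-IMU model named in the abstract), and \rho a robust loss such as Huber.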
Pages: 28
Related Papers (50 records in total)
  • [21] Qin, Tong; Li, Peiliang; Shen, Shaojie. VINS-Mono: A Robust and Versatile Monocular Visual-Inertial State Estimator. IEEE TRANSACTIONS ON ROBOTICS, 2018, 34(04): 1004-1020.
  • [22] Abeywardena, Dinuka; Dissanayake, Gamini. Tightly-Coupled Model Aided Visual-Inertial Fusion for Quadrotor Micro Air Vehicles. FIELD AND SERVICE ROBOTICS, 2015, 105: 153-166.
  • [23] Fu, Zhumu; Shi, Yongzhe; Si, Pengju; Gao, Song; Yang, Yi. Tightly coupled visual-inertial fusion with image enhancement for robust positioning. MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35(09).
  • [24] Michalczyk, Jan; Jung, Roland; Brommer, Christian; Weiss, Stephan. Multi-State Tightly-Coupled EKF-Based Radar-Inertial Odometry With Persistent Landmarks. 2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2023: 4011-4017.
  • [25] Lin, Shuyue; Zhang, Xuetao; Liu, Yisha; Chen, Yuqing; Zhuang, Yan. Kernel-VIO: An Optimization-based Tightly Coupled Indirect Visual-Inertial Odometry. 2021 27TH INTERNATIONAL CONFERENCE ON MECHATRONICS AND MACHINE VISION IN PRACTICE (M2VIP), 2021.
  • [26] Ji, Xingyu; Yuan, Shenghai; Yin, Pengyu; Xie, Lihua. LIO-GVM: An Accurate, Tightly-Coupled Lidar-Inertial Odometry With Gaussian Voxel Map. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9(03): 2200-2207.
  • [27] Henning, Dorian F.; Choi, Christopher; Schaefer, Simon; Leutenegger, Stefan. BodySLAM++: Fast and Tightly-Coupled Visual-Inertial Camera and Human Motion Tracking. 2023 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2023: 3781-3788.
  • [28] Zhu, Zhongyang; Zhao, Junqiao; Huang, Kai; Tian, Xuebo; Lin, Jiaye; Ye, Chen. LIMOT: A Tightly-Coupled System for LiDAR-Inertial Odometry and Multi-Object Tracking. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9(07): 6600-6607.
  • [29] Liu, Jun; Zhang, Yunzhou; Zhao, Xiaoyu; He, Zhengnan; Liu, Wei; Lv, Xiangren. Fast and Robust LiDAR-Inertial Odometry by Tightly-Coupled Iterated Kalman Smoother and Robocentric Voxels. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2024, 25(10): 14486-14496.
  • [30] Zhang, Hongkai; Du, Liang; Bao, Sheng; Yuan, Jianjun; Ma, Shugen. LVIO-Fusion: Tightly-Coupled LiDAR-Visual-Inertial Odometry and Mapping in Degenerate Environments. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9(04): 3783-3790.