Lightweight hybrid visual-inertial odometry with closed-form zero velocity update

Cited: 0
Authors
QIU Xiaochen [1,2]
ZHANG Hai [1,2]
FU Wenxing [3]
Affiliations
[1] School of Automation Science and Electrical Engineering, Beihang University
[2] Science and Technology on Aircraft Control Laboratory
[3] Science and Technology on Complex System Control and Intelligent Agent Cooperation
Keywords: none listed
DOI: not available
CLC Classification: TP212 [transmitters (transducers), sensors]; TN713 [filtering techniques, filters]
Discipline Code: 080202
Abstract
Visual-Inertial Odometry (VIO) fuses measurements from a camera and an Inertial Measurement Unit (IMU) to achieve performance that is better than using either sensor individually. Hybrid VIO is an extended Kalman filter-based solution which augments features with long tracking length into the state vector of the Multi-State Constraint Kalman Filter (MSCKF). In this paper, a novel hybrid VIO is proposed, which focuses on utilizing low-cost sensors while considering both computational efficiency and positioning precision. The proposed algorithm introduces several novel contributions. Firstly, by deducing an analytical error transition equation, one-dimensional inverse depth parametrization is utilized to parametrize the augmented feature state. This modification is shown to significantly improve computational efficiency and numerical robustness, thereby achieving higher precision. Secondly, to better handle static scenes, a novel closed-form Zero velocity UPdaTe (ZUPT) method is proposed. ZUPT is modeled as a measurement update for the filter, rather than crudely suspending propagation, which has the advantage of correcting the overall state through the correlations in the filter covariance matrix. Furthermore, online spatial and temporal calibration is also incorporated. Experiments are conducted on both a public dataset and real data. The results demonstrate the effectiveness of the proposed solution, showing that its performance is better than the baseline and state-of-the-art algorithms in terms of both efficiency and precision. Related software is open-sourced to benefit the community.
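To make the ZUPT idea concrete, below is a minimal sketch of a zero-velocity pseudo-measurement applied as an ordinary EKF measurement update. The 15-dimensional state layout, the velocity slice, the noise level sigma_v, and the helper zupt_update are illustrative assumptions, not the paper's implementation; the point is that the Kalman gain, acting through the correlations stored in the covariance matrix P, lets a velocity-only measurement correct the entire state instead of merely halting propagation.

    import numpy as np

    # Assumed state layout: [attitude(3), position(3), velocity(3), biases(6)].
    # The 15-dimensional state and this velocity slice are illustrative only.
    VEL = slice(6, 9)

    def zupt_update(x, P, sigma_v=0.01):
        # Zero-velocity pseudo-measurement: z = 0 = v + n, n ~ N(0, R).
        n = x.shape[0]
        H = np.zeros((3, n))
        H[:, VEL] = np.eye(3)            # Jacobian selects the velocity block
        R = (sigma_v ** 2) * np.eye(3)   # pseudo-measurement noise
        r = -x[VEL]                      # residual: 0 - predicted velocity
        S = H @ P @ H.T + R              # innovation covariance (3x3)
        K = np.linalg.solve(S, H @ P).T  # Kalman gain K = P H^T S^-1
        x = x + K @ r                    # correlations in P spread the
        P = (np.eye(n) - K @ H) @ P      # correction to the whole state
        return x, P

    # Usage: a stationary platform whose filter has drifted to a small velocity.
    x = np.zeros(15)
    x[6:9] = [0.02, -0.01, 0.005]
    P = 0.1 * np.eye(15)
    x, P = zupt_update(x, P)  # velocity (and correlated states) pulled toward zero

Because H selects only the velocity block, the innovation covariance is just 3x3, so the update stays cheap even when the augmented feature state is large.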
Pages: 3344-3359
Page count: 16
Related Articles
50 records in total (showing 41-50)
  • [41] Using Vanishing Points to Improve Visual-Inertial Odometry
    Camposeco, Federico
    Pollefeys, Marc
    2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2015, : 5219 - 5225
  • [42] Visual-Inertial Odometry of Smartphone under Manhattan World
    Wang, YuAn
    Chen, Liang
    Wei, Peng
    Lu, XiangChen
    REMOTE SENSING, 2020, 12 (22) : 1 - 27
  • [43] Active Heading Planning for Improving Visual-Inertial Odometry
    Lee, Joohyuk
    Lee, Kyuman
    2024 INTERNATIONAL CONFERENCE ON UNMANNED AIRCRAFT SYSTEMS, ICUAS, 2024, : 1085 - 1092
  • [44] Dense Visual-Inertial Odometry for Tracking of Aggressive Motions
    Ling, Yonggen
    Shen, Shaojie
    2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2015, : 576 - 583
  • [45] A Stereo-Based Visual-Inertial Odometry for SLAM
    Li, Yong
    Lang, ShiBing
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 594 - 598
  • [46] Renormalization for Initialization of Rolling Shutter Visual-Inertial Odometry
    Micusik, Branislav
    Evangelidis, Georgios
    International Journal of Computer Vision, 2021, 129 : 2011 - 2027
  • [47] Control-enabled Observability in Visual-Inertial Odometry
    Bai, He
    Taylor, Clark N.
    2017 INTERNATIONAL CONFERENCE ON UNMANNED AIRCRAFT SYSTEMS (ICUAS'17), 2017, : 822 - 829
  • [48] Continuous-Time Spline Visual-Inertial Odometry
    Mo, Jiawei
    Sattar, Junaed
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 9492 - 9498
  • [49] IMU Preintegration for Visual-Inertial Odometry Pose Estimation
    Liu, Fuchun
    Su, Xuan
    He, Yun
    Luo, Fei
    Gao, Huanli
    2018 37TH CHINESE CONTROL CONFERENCE (CCC), 2018, : 5305 - 5310
  • [50] The TUM VI Benchmark for Evaluating Visual-Inertial Odometry
    Schubert, David
    Goll, Thore
    Demmel, Nikolaus
    Usenko, Vladyslav
    Stueckler, Joerg
    Cremers, Daniel
    2018 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2018, : 1680 - 1687