A ZUPT Aided Initialization Procedure for Tightly-coupled Lidar Inertial Odometry based SLAM System

Cited: 2
Authors
Gui, Linqiu [1 ]
Zeng, Chunnian [2 ]
Dauchert, Samuel [3 ]
Luo, Jie [2 ]
Wang, Xiaofeng [3 ]
Affiliations
[1] Wuhan Univ Technol, Sch Informat Engn, Wuhan, Hubei, Peoples R China
[2] Wuhan Univ Technol, Sch Automat, Wuhan, Hubei, Peoples R China
[3] Univ South Carolina, Dept Elect Engn, Columbia, SC USA
Keywords
LiDAR SLAM; Unmanned platform; Initialization procedure; Zero Velocity Update (ZUPT); ROBUST;
DOI
10.1007/s10846-023-01886-3
CLC Number
TP18 [Artificial Intelligence Theory];
Subject Classification Numbers
081104; 0812; 0835; 1405;
Abstract
Simultaneous localization and mapping (SLAM) is an important research topic for unmanned platforms. SLAM methods tightly coupled with an Inertial Measurement Unit (IMU) have recently received much attention for their excellent performance. However, tightly coupled LiDAR-inertial odometry (LIO) is disturbed by the unknown initial state during system startup, which in severe cases causes startup to fail, especially with low-cost sensors. This paper therefore presents a Zero Velocity Update (ZUPT) aided initialization procedure that estimates the unknown initial states under the assumption that the vehicle always starts from a stationary state. The procedure consists of two phases, a static phase and a dynamic phase, and a Zero Velocity Detector is proposed to distinguish the two phases precisely. The whole system is evaluated on datasets gathered in a campus area with low-cost sensors. The results show that our approach effectively improves robustness and partly improves precision when the system is bootstrapped.
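The abstract hinges on a Zero Velocity Detector that separates the static and dynamic phases. The record does not describe that detector, so the Python sketch below shows one common variance-based form together with the static-phase quantities a ZUPT assumption makes observable (gyroscope bias, roll, pitch); the window length, thresholds, and function names are illustrative assumptions, not the paper's implementation.

    import numpy as np

    # Assumed tuning; the paper's actual values are not given in this record.
    GYRO_MEAN_THRESH = 0.01   # rad/s: mean angular-rate magnitude at rest
    ACC_VAR_THRESH = 0.05     # (m/s^2)^2: specific-force variance at rest

    def is_static(acc: np.ndarray, gyr: np.ndarray) -> bool:
        """Classify an (N, 3) accelerometer/gyroscope window as zero-velocity.

        At rest the angular rate is near zero and the specific-force
        magnitude barely fluctuates around gravity.
        """
        gyro_mag = np.linalg.norm(gyr, axis=1)   # per-sample |omega|
        acc_mag = np.linalg.norm(acc, axis=1)    # per-sample |f|
        return bool(gyro_mag.mean() < GYRO_MEAN_THRESH
                    and acc_mag.var() < ACC_VAR_THRESH)

    def static_phase_estimates(acc: np.ndarray, gyr: np.ndarray):
        """Initial states recoverable while is_static() holds.

        With zero true angular rate the mean gyroscope output is its bias,
        and the mean specific force gives the gravity direction in the
        body frame, hence roll and pitch (yaw remains unobservable).
        """
        gyro_bias = gyr.mean(axis=0)
        f = acc.mean(axis=0)                          # averaged specific force
        roll = np.arctan2(f[1], f[2])                 # accelerometer tilt formulas
        pitch = np.arctan2(-f[0], np.hypot(f[1], f[2]))
        return gyro_bias, roll, pitch

While is_static() holds, zero-velocity pseudo-measurements can also be fed to the LIO estimator to bound velocity and bias drift before the dynamic phase begins; the roll/pitch sign conventions depend on the chosen body frame.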
Pages: 16
Related Papers
50 records in total
  • [21] A tightly-coupled method of lidar-inertial based on complementary filtering
    Liu, Jinyue
    Zheng, Jiashuo
    Jia, Xiaohui
    Li, Tiejun
    Zhang, Wenxue
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34 (10)
  • [22] Development of tightly coupled based lidar-visual-inertial odometry
Kim, K.-W.
    Jung, T.-K.
    Seo, S.-H.
    Jee, G.-I.
    JOURNAL OF INSTITUTE OF CONTROL, ROBOTICS AND SYSTEMS, 2020, 26 (08) : 597 - 603
  • [23] Fast and Robust LiDAR-Inertial Odometry by Tightly-Coupled Iterated Kalman Smoother and Robocentric Voxels
    Liu, Jun
    Zhang, Yunzhou
    Zhao, Xiaoyu
    He, Zhengnan
    Liu, Wei
    Lv, Xiangren
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2024, 25 (10) : 14486 - 14496
  • [24] LIO-LOT: Tightly-Coupled Multi-Object Tracking and LiDAR-Inertial Odometry
    Li, Xingxing
    Yan, Zhuohao
    Feng, Shaoquan
    Xia, Chunxi
    Li, Shengyu
    Zhou, Yuxuan
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2025, 26 (01) : 742 - 756
  • [25] Efficient and Accurate Tightly-Coupled Visual-Lidar SLAM
    Chou, Chih-Chung
    Chou, Cheng-Fu
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (09) : 14509 - 14523
  • [26] LIO-GVM: An Accurate, Tightly-Coupled Lidar-Inertial Odometry With Gaussian Voxel Map
    Ji, Xingyu
    Yuan, Shenghai
    Yin, Pengyu
    Xie, Lihua
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (03) : 2200 - 2207
• [27] LVIO-Fusion: Tightly-Coupled LiDAR-Visual-Inertial Odometry and Mapping in Degenerate Environments
    Zhang, Hongkai
    Du, Liang
    Bao, Sheng
    Yuan, Jianjun
    Ma, Shugen
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (04) : 3783 - 3790
  • [28] LVI-SAM: Tightly-coupled Lidar-Visual-Inertial Odometry via Smoothing and Mapping
    Shan, Tixiao
    Englot, Brendan
    Ratti, Carlo
    Rus, Daniela
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 5692 - 5698
  • [29] RLI-SLAM: Fast Robust Ranging-LiDAR-Inertial Tightly-Coupled Localization and Mapping
    Xin, Rui
    Guo, Ningyan
    Ma, Xingyu
    Liu, Gang
    Feng, Zhiyong
    SENSORS, 2024, 24 (17)
  • [30] GNSS/LiDAR/IMU Fusion Odometry Based on Tightly-Coupled Nonlinear Observer in Orchard
    Sun, Na
    Qiu, Quan
    Li, Tao
    Ru, Mengfei
    Ji, Chao
    Feng, Qingchun
    Zhao, Chunjiang
    REMOTE SENSING, 2024, 16 (16)