Study on stereo visual inertial SLAM combining vehicle dynamics

Cited: 0
Authors
Yu Z. [1,3]
Ju R. [1]
Han Y. [1]
Zhao J. [2]
Affiliations
[1] School of Electronics and Information Engineering, Tongji University, Shanghai
[2] Nanchang Automotive Innovation Institute, Nanchang
Keywords
multi-sensor fusion; pre-integration; simultaneous localization and mapping (SLAM); unmanned ground vehicle localization; vehicle dynamics;
DOI
10.13245/j.hust.220509
Abstract
To address the low accuracy of unmanned ground vehicle visual simultaneous localization and mapping (SLAM) in underground garages, a pre-integration method combining inertial measurement unit (IMU) gyroscope measurements and vehicle dynamics measurements was proposed. Rotation pre-integration was performed at the IMU frequency, and translation pre-integration was performed at the vehicle dynamics measurement frequency. The gyroscope measurements were introduced into the calculation of the translation pre-integration so that it could express non-planar motion. First, the pre-integration equations, Jacobian matrices, and noise state transition equations were derived using Lie algebra and the rotation group. Then, based on this pre-integration, the vehicle dynamics information was incorporated into stereo visual inertial SLAM to improve its accuracy. Experiments in an underground garage show that the proposed method improves the average accuracy of stereo visual inertial ORB-SLAM3 by 32%. © 2022 Huazhong University of Science and Technology. All rights reserved.
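The rotation pre-integration described in the abstract accumulates bias-corrected gyroscope measurements on SO(3) between keyframes, i.e. ΔR = ∏ Exp((ω_k − b_g) Δt). A minimal sketch of that accumulation, assuming a fixed IMU sampling interval and a constant gyroscope bias (the function and variable names here are illustrative, not taken from the paper):

```python
import numpy as np

def exp_so3(phi):
    """Map a rotation vector phi (shape (3,)) to a rotation matrix
    via Rodrigues' formula (the SO(3) exponential map)."""
    theta = np.linalg.norm(phi)
    K = np.array([[0.0, -phi[2], phi[1]],
                  [phi[2], 0.0, -phi[0]],
                  [-phi[1], phi[0], 0.0]])
    if theta < 1e-10:
        # First-order approximation near the identity.
        return np.eye(3) + K
    K_hat = K / theta
    return (np.eye(3) + np.sin(theta) * K_hat
            + (1.0 - np.cos(theta)) * (K_hat @ K_hat))

def preintegrate_rotation(gyro_meas, gyro_bias, dt):
    """Accumulate Delta_R = prod_k Exp((w_k - b_g) * dt) at the IMU rate.

    gyro_meas : (N, 3) array of angular-velocity samples [rad/s]
    gyro_bias : (3,) constant gyroscope bias estimate
    dt        : IMU sampling interval [s]
    """
    dR = np.eye(3)
    for w in gyro_meas:
        dR = dR @ exp_so3((w - gyro_bias) * dt)
    return dR
```

Because each factor is a proper rotation, the accumulated ΔR stays on SO(3) by construction; this is the on-manifold formulation (cf. reference [4] below) that the paper extends with vehicle dynamics measurements for the translation terms.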
Pages: 85-89, 95
Related papers
12 entries
  • [1] ANASTASIOS I M, STERGIOS I R. A multi-state constraint Kalman filter for vision-aided inertial navigation[C], Proc of IEEE International Conference on Robotics and Automation, pp. 10-14, (2007)
  • [2] SUN K, MOHTA K, PFROMMER B. Robust stereo visual inertial odometry for fast autonomous flight[J], IEEE Robotics and Automation Letters, 3, 2, pp. 965-972, (2018)
  • [3] LUPTON T, SUKKARIEH S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions[J], IEEE Transactions on Robotics, 28, 1, pp. 61-76, (2012)
  • [4] FORSTER C, CARLONE L, DELLAERT F. On-manifold pre-integration for real-time visual-inertial odometry[J], IEEE Transactions on Robotics, 33, 1, pp. 1-21, (2017)
  • [5] QIN T, LI P, SHEN S. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J], IEEE Transactions on Robotics, 34, 4, pp. 1004-1020, (2018)
  • [6] MUR-ARTAL R, MONTIEL J M M, TARDOS J D. ORB-SLAM: a versatile and accurate monocular SLAM system[J], IEEE Transactions on Robotics, 31, 5, pp. 1147-1163, (2015)
  • [7] MUR-ARTAL R, TARDOS J D. Visual-inertial monocular SLAM with map reuse[J], IEEE Robotics and Automation Letters, 2, 2, pp. 796-803, (2017)
  • [8] MUR-ARTAL R, TARDOS J D. ORB-SLAM2: an open source SLAM system for monocular, stereo, and RGB-D cameras[J], IEEE Transactions on Robotics, 33, 5, pp. 1-8, (2017)
  • [9] CARLOS C, MONTIEL J M M, TARDOS J D. Inertial-only optimization for visual-inertial initialization[C], Proc of IEEE International Conference on Robotics and Automation, pp. 51-57, (2020)
  • [10] RONG K, XIONG L, MINGYU X. VINS-Vehicle: a tightly-coupled vehicle dynamics extension to visual-inertial state estimator[C], Proc of IEEE Intelligent Transportation Systems Conference, pp. 3593-3600, (2019)