Tightly Coupled SLAM System Based on Multi-Sensor Fusion

Cited by: 0
Authors
Cai Y. [1 ]
Lu Z. [1 ]
Li Y. [1 ]
Chen L. [1 ]
Wang H. [2 ]
Affiliations
[1] Institute of Automotive Engineering, Jiangsu University, Zhenjiang
[2] School of Automotive and Traffic Engineering, Jiangsu University, Zhenjiang
Keywords
Autonomous vehicles; Loop-closure detection; Multi-sensor fusion; SLAM; State estimation
DOI
10.19562/j.chinasae.qcgc.2022.03.006
Abstract
Simultaneous localization and mapping (SLAM) is an essential component of autonomous vehicles. Existing algorithms are mostly based on lidar or visual-inertial odometry alone and do not fully exploit the complementary advantages of multi-modal sensors, leaving them insufficiently robust in featureless scenes. To address this, a tightly coupled multi-sensor SLAM system fusing lidar, camera and inertial measurement unit is proposed in this paper. Firstly, the system improves the lidar point-cloud feature-extraction and plane-fitting schemes, and uses the lidar point cloud to raise the efficiency and accuracy of depth optimization for visual feature points. Secondly, the proposed tightly coupled state-estimation framework adds lidar odometry constraints directly to the visual-inertial system, enhancing stability and accuracy without increasing algorithmic complexity. Finally, a coarse-to-fine visual-lidar coupled loop-closure framework further reduces the long-term cumulative drift of the system. Extensive validation on the open-source KITTI dataset shows that, compared with other commonly used algorithms, the proposed algorithm achieves higher accuracy and better environmental adaptability. In addition, a real-vehicle test on a self-built autonomous-vehicle test platform demonstrates its adaptability to long-duration, large-scale environments. © 2022, Society of Automotive Engineers of China. All rights reserved.
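The abstract does not reproduce the paper's depth-optimization details, but the general idea of using a lidar point cloud to recover depth for visual feature points can be sketched as follows: project lidar points into the image with the camera intrinsics, then assign each 2D feature the depth of the nearest projected point within a pixel radius. This is a minimal illustrative sketch, not the authors' implementation; the function names, intrinsics matrix and search radius are assumptions.

```python
import numpy as np

def project_lidar_to_image(points_cam, K):
    """Project 3D lidar points (already transformed into the camera frame)
    onto the image plane. points_cam: (N, 3); K: 3x3 pinhole intrinsics.
    Returns projected pixel coordinates (M, 2) and their depths (M,)."""
    valid = points_cam[:, 2] > 0.1          # keep points in front of the camera
    pts = points_cam[valid]
    uv_h = (K @ pts.T).T                    # homogeneous pixel coordinates
    uv = uv_h[:, :2] / uv_h[:, 2:3]         # perspective divide
    return uv, pts[:, 2]

def associate_depth(features_uv, lidar_uv, lidar_depth, radius=3.0):
    """For each 2D visual feature, take the depth of the nearest projected
    lidar point within `radius` pixels; NaN when none is close enough."""
    depths = np.full(len(features_uv), np.nan)
    for i, f in enumerate(features_uv):
        d2 = np.sum((lidar_uv - f) ** 2, axis=1)
        j = np.argmin(d2)
        if d2[j] <= radius ** 2:
            depths[i] = lidar_depth[j]
    return depths
```

In the paper's pipeline such lidar-derived depths would seed or constrain the visual feature depths during optimization; a production system would also fit a local plane to several neighboring lidar points rather than copying a single point's depth.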
Pages: 350-361
Page count: 11
Related Papers
23 items in total
  • [1] BAILEY T, DURRANT-WHYTE H., Simultaneous localization and mapping (SLAM): part II, IEEE Robotics & Automation Magazine, 13, 3, pp. 108-117, (2006)
  • [2] WANG H, LI Y, CAI Y, et al., 3D real-time vehicle tracking based on lidar, Automotive Engineering, 43, 7, pp. 1013-1021, (2021)
  • [3] LOU X, WANG H, CAI Y, et al., A research on an algorithm for real-time detection and classification of road obstacle by using 64-line lidar, Automotive Engineering, 41, 7, pp. 779-784, (2019)
  • [4] MUR-ARTAL R, MONTIEL J M M, TARDOS J D., ORB-SLAM: a versatile and accurate monocular SLAM system, IEEE Transactions on Robotics, 31, 5, pp. 1147-1163, (2015)
  • [5] MUR-ARTAL R, TARDOS J D., ORB-SLAM2: an open-source SLAM system for monocular, stereo, and RGB-D cameras, IEEE Transactions on Robotics, 33, 5, pp. 1255-1262, (2017)
  • [6] LEVINSON J, ASKELAND J, BECKER J, et al., Towards fully autonomous driving: systems and algorithms, Proceedings of the 2011 IEEE Intelligent Vehicles Symposium (IV), (2011)
  • [7] ZHANG J, SINGH S., Low-drift and real-time lidar odometry and mapping, Autonomous Robots, 41, 2, pp. 401-416, (2017)
  • [8] LI X, LI J, ZHU M, et al., Research on positioning fusion and verification algorithm based on UKF, Automotive Engineering, 43, 6, pp. 825-832, (2021)
  • [9] YE H, CHEN Y, LIU M., Tightly coupled 3D lidar inertial odometry and mapping, 2019 International Conference on Robotics and Automation (ICRA), pp. 3144-3150, (2019)
  • [10] MOURIKIS A I, ROUMELIOTIS S I., A multi-state constraint Kalman filter for vision-aided inertial navigation, Proceedings 2007 IEEE International Conference on Robotics and Automation, pp. 3565-3572, (2007)