Solving method of lidar odometry based on IMU

Cited by: 0
Authors
Jia X. [1]
Xu W. [1]
Liu J. [1]
Li T. [1]
Affiliations
[1] School of Mechanical Engineering, Hebei University of Technology, Tianjin
Keywords
Information fusion; Lidar odometry; Motion solution; Point cloud processing
DOI
10.19650/j.cnki.cjsi.J2007167
Abstract
In the simultaneous localization and mapping (SLAM) problem, the accuracy of the odometry solution plays a vital role in the subsequent mapping. An inertial measurement unit (IMU) can provide valuable assistance to the odometry in SLAM. Taking into account the motion characteristics of a planar mobile robot and the characteristics of indoor environments, a lidar odometry solution method based on loose coupling with an IMU is proposed to achieve accurate positioning in the odometry stage. In the first stage, the point cloud is processed in real time during robot motion: ground points are segmented and key points are extracted. In the second stage, the IMU information is introduced into a Kalman filter to provide a pose prior for inter-frame matching. In the third stage, after the filter outputs the pose estimate, a nonlinear optimization method is used to match the point cloud frames and solve for the odometry motion. Experimental results show that the proposed method has good stability and accuracy in laser point cloud processing and motion solving, and the offset error can be controlled within 0.4%. This method provides a strong data guarantee for subsequent mapping. © 2021, Science Press. All rights reserved.
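The abstract gives no implementation details, so the following Python sketch is only a hypothetical illustration of the three stages it names (ground segmentation and key-point extraction, an IMU-driven Kalman-filter pose prior, and nonlinear scan matching). All function names, thresholds, and noise parameters below are assumptions for illustration, not the authors' method:

    # Minimal sketch (not the paper's implementation) of a loosely coupled
    # IMU + lidar odometry pipeline for a planar robot.
    import numpy as np

    def segment_ground(points, height_thresh=-0.3):
        # Stage 1 (assumed criterion): points below a fixed height are treated as ground.
        ground = points[:, 2] < height_thresh
        return points[ground], points[~ground]

    def extract_keypoints(points, k=200):
        # Stage 1 (placeholder): keep the k points farthest from the sensor as "key points".
        d = np.linalg.norm(points, axis=1)
        return points[np.argsort(d)[-k:]]

    class ImuPosePrior:
        # Stage 2: planar (x, y, yaw) Kalman filter; IMU yaw rate and forward speed
        # drive the prediction, scan matching supplies the measurement update.
        def __init__(self):
            self.x = np.zeros(3)          # state [x, y, yaw]
            self.P = np.eye(3) * 1e-2     # state covariance
            self.Q = np.eye(3) * 1e-3     # process noise (assumed)
            self.R = np.eye(3) * 1e-2     # measurement noise (assumed)

        def predict(self, v, yaw_rate, dt):
            yaw = self.x[2]
            self.x += np.array([v * np.cos(yaw) * dt, v * np.sin(yaw) * dt, yaw_rate * dt])
            self.P += self.Q

        def update(self, pose_meas):
            K = self.P @ np.linalg.inv(self.P + self.R)   # Kalman gain
            self.x = self.x + K @ (pose_meas - self.x)
            self.P = (np.eye(3) - K) @ self.P

    def match_scans(src, dst, prior, iters=20):
        # Stage 3 (sketch): Gauss-Newton point-to-point alignment of 2D key points,
        # initialised with the filter's pose prior [tx, ty, yaw].
        x = np.array(prior, dtype=float)
        for _ in range(iters):
            c, s = np.cos(x[2]), np.sin(x[2])
            R = np.array([[c, -s], [s, c]])
            pred = src @ R.T + x[:2]
            # brute-force nearest-neighbour correspondences
            idx = np.argmin(((pred[:, None] - dst[None]) ** 2).sum(-1), axis=1)
            r = (pred - dst[idx]).ravel()                 # residuals [x0, y0, x1, y1, ...]
            J = np.zeros((2 * len(src), 3))               # Jacobian w.r.t. [tx, ty, yaw]
            J[0::2, 0] = 1.0
            J[1::2, 1] = 1.0
            J[0::2, 2] = -s * src[:, 0] - c * src[:, 1]
            J[1::2, 2] =  c * src[:, 0] - s * src[:, 1]
            x -= np.linalg.solve(J.T @ J + 1e-6 * np.eye(3), J.T @ r)
        return x

In such a loosely coupled design, the filter's predicted pose seeds the scan matcher, and the matcher's refined pose is fed back as the filter's measurement update; the specific filter model, feature extraction, and optimization used in the paper may differ.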
Pages: 39-48
Number of pages: 9