A GPS-Laser-IMU Fusion Mapping Algorithm Based on Iterated Kalman Filter

Cited by: 0
Authors
Cong M. [1 ]
Wen X. [1 ]
Wang M. [1 ]
Liu D. [1 ]
Affiliations
[1] School of Mechanical Engineering, Dalian University of Technology, Dalian, Liaoning, China
Funding
National Natural Science Foundation of China
Keywords
gravity optimization; iterated Kalman filter; laser SLAM (simultaneous localization and mapping); multi-sensor fusion
DOI
10.12141/j.issn.1000-565X.220667
Abstract
3D laser SLAM (simultaneous localization and mapping) in large-scale outdoor scenes remains a challenging problem in robot navigation and environment sensing. Because GPS signals are unstable in some environments and laser SLAM accumulates error over time, traditional algorithms perform poorly in large-scale scenes. To address the error accumulation of 3D laser SLAM in large-scale outdoor scenes, this paper proposed a GPS-Laser-IMU fusion mapping algorithm based on an iterated Kalman filter, which improves mapping accuracy and robustness. The algorithm used the IMU to predict the robot state, while laser and GPS data served as observations to update it; the measurement equations and the related Jacobian matrices were derived. Integrating the absolute position information of GPS into the odometry resolves the error accumulation that arises during long-time operation. In feature-sparse environments, the algorithm may collapse due to insufficient constraints, and introducing GPS improves the robustness of the system. Additionally, gravity plays a crucial role in the prediction of the robot state. Although gravity is a three-dimensional vector, its magnitude remains constant within a given region, so gravity was treated as a two-degree-of-freedom vector. Transforming gravity optimization into optimization on SO(3) avoids over-parameterization, thereby improving precision. The algorithm's performance was compared with that of other algorithms in outdoor environments, and its robustness and accuracy were verified in large-scale scenes. The results show that the root mean square error of the proposed algorithm is 0.089 m, which is 54% lower than that of the other algorithms. © 2024 South China University of Technology. All rights reserved.
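The two-degree-of-freedom gravity treatment described in the abstract can be illustrated with a short sketch. The idea (common in manifold-based filters of this kind) is to perturb gravity only within the plane orthogonal to it, applying the correction as a small rotation via the SO(3) exponential map so that the magnitude stays fixed. This is a minimal illustration, not the paper's implementation: the function names, the basis construction, and the assumed gravity magnitude of 9.81 m/s² are all choices made here for the example.

```python
import numpy as np

GRAVITY_NORM = 9.81  # assumed local gravity magnitude (m/s^2)

def skew(v):
    """Skew-symmetric (cross-product) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def so3_exp(phi):
    """Rodrigues' formula: rotation matrix for a rotation vector phi."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3) + skew(phi)  # first-order approximation near zero
    axis = phi / theta
    K = skew(axis)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def tangent_basis(g):
    """3x2 orthonormal basis of the plane orthogonal to g:
    the 2-DOF tangent space in which gravity is corrected."""
    n = g / np.linalg.norm(g)
    # any vector not parallel to n, then orthogonalize by cross products
    t = np.array([1.0, 0.0, 0.0]) if abs(n[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    b1 = np.cross(n, t)
    b1 /= np.linalg.norm(b1)
    b2 = np.cross(n, b1)
    return np.stack([b1, b2], axis=1)

def retract_gravity(g, delta):
    """Apply a 2-DOF correction delta (shape (2,)) to gravity g by rotating
    g with exp([B @ delta]_x); being a rotation, this preserves |g| exactly,
    avoiding the over-parameterization of estimating g as a free 3-vector."""
    B = tangent_basis(g)
    return so3_exp(B @ delta) @ g

g = np.array([0.0, 0.0, -GRAVITY_NORM])
g_corrected = retract_gravity(g, np.array([0.01, -0.02]))
```

In a filter update, `delta` would be the 2-dimensional gravity sub-block of the estimated error state; a naive additive update on a 3-vector would instead drift the gravity norm, which is exactly the degeneracy this parameterization removes.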
Pages: 75-83
Page count: 8