Laser 3D tightly coupled mapping method based on visual information

Cited by: 3
|
Authors
Liu, Sixing [1 ]
Chai, Yan [1 ]
Yuan, Rui [1 ]
Miao, Hong [1 ]
Affiliations
[1] Yangzhou Univ, Sch Mech Engn, Yangzhou, Peoples R China
Keywords
Simultaneous positioning and mapping; Multi-sensor fusion; Motion estimation; Loopback detection; MODEL; SLAM;
DOI
10.1108/IR-02-2023-0016
Chinese Library Classification
T [Industrial Technology];
Discipline Classification Code
08;
Abstract
Purpose: Simultaneous localization and mapping (SLAM), as a state estimation problem, is a prerequisite for autonomous vehicle motion in unknown environments. Existing algorithms rely on laser or visual odometry alone; however, lidar has a limited sensing range and yields relatively few data features, while cameras are vulnerable to external conditions, so localization and map building cannot be performed stably and accurately with a single sensor. This paper aims to propose a tightly coupled laser 3D map-building method that incorporates visual information, using laser point cloud data and image data to complement each other and improve the overall performance of the algorithm.
Design/methodology/approach: At the front end, visual feature points are first matched, and mismatched point pairs are removed with a bidirectional random sample consensus (RANSAC) algorithm. The laser point cloud is then used to obtain depth information for these features, and both types of feature points are fed into the pose estimation module for a tightly coupled local bundle adjustment solved with a heuristic simulated annealing algorithm. Finally, a visual bag-of-words model is fused with the laser point cloud information, and a similarity threshold is established to construct a loop-closure framework that further reduces the system's cumulative drift error over time.
Findings: Experiments on publicly available data sets show that the proposed method matches the ground-truth trajectory well. Across various scenes, maps can be constructed with high accuracy and robustness by using the complementary laser and vision sensors. The method was also verified in a real environment on an autonomous mobile acquisition platform; the system running the method operated reliably over long periods and adapted to multiple scene types.
Originality/value: A tightly coupled multi-sensor fusion method is proposed that combines laser and vision information for an optimal pose solution. A bidirectional RANSAC algorithm removes visual mismatched point pairs. Further, oriented FAST and rotated BRIEF (ORB) feature points are used to build a bag-of-words model and construct a real-time loop-closure framework that reduces error accumulation. Experimental validation shows that the accuracy and robustness of single-sensor SLAM algorithms can be improved.
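The abstract's front-end step, removing mismatched visual feature pairs by matching in both directions, can be illustrated with a minimal mutual-consistency (cross-check) sketch. This is an assumption-laden toy, not the paper's implementation: the descriptors are hypothetical 2-D integer vectors, the function names are invented, and the paper's full bidirectional RANSAC would add a geometric verification stage on top of this filtering.

```python
# Toy sketch of bidirectional (cross-check) match filtering, in the
# spirit of the paper's front end. All names and data are illustrative.

def sqdist(a, b):
    """Squared Euclidean distance between two descriptor vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def nearest(query, pool):
    """Index of the descriptor in `pool` closest to `query`."""
    return min(range(len(pool)), key=lambda j: sqdist(query, pool[j]))

def bidirectional_matches(desc_a, desc_b):
    """Keep only pairs (i, j) where the a->b and b->a matches agree.

    This mutual-consistency test discards many mismatched pairs before
    any geometric (RANSAC-style) verification is applied.
    """
    fwd = {i: nearest(d, desc_b) for i, d in enumerate(desc_a)}
    bwd = {j: nearest(d, desc_a) for j, d in enumerate(desc_b)}
    return [(i, j) for i, j in fwd.items() if bwd[j] == i]

desc_a = [(0, 0), (10, 10), (20, 20)]
desc_b = [(21, 19), (1, 1), (9, 11)]
print(bidirectional_matches(desc_a, desc_b))  # → [(0, 1), (1, 2), (2, 0)]
```

In a real pipeline, the surviving pairs would then be passed to a RANSAC model fit (e.g. a fundamental matrix or homography) for outlier rejection.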
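The loop-closure screening the abstract describes, comparing bag-of-words representations against a threshold, can likewise be sketched in miniature. Here frames are hypothetical visual-word histograms (e.g. over ORB words) compared by cosine similarity; the histogram values, function names, and the 0.8 threshold are illustrative assumptions, not the paper's actual parameters, and the paper additionally fuses laser point cloud information into this check.

```python
# Toy sketch of bag-of-words loop-candidate screening. The histograms,
# names and threshold are assumed for illustration only.
import math

def cosine(u, v):
    """Cosine similarity between two bag-of-words histograms."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

def loop_candidates(current, history, threshold=0.8):
    """Indices of past frames whose histogram is similar enough to
    the current frame to be considered loop-closure candidates."""
    return [i for i, h in enumerate(history)
            if cosine(current, h) >= threshold]

history = [[5, 0, 1, 0], [0, 4, 0, 3], [4, 1, 1, 0]]
current = [5, 1, 1, 0]
print(loop_candidates(current, history))  # → [0, 2]
```

Accepted candidates would then be verified geometrically before the loop constraint is added to the pose graph, which is what keeps the cumulative drift error bounded over time.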
Pages: 917-929 (13 pages)
Related papers (50 records)
  • [1] Tightly Coupled 3D Lidar Inertial Odometry and Mapping
    Ye, Haoyang
    Chen, Yuying
    Liu, Ming
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 3144 - 3150
  • [2] Globally Consistent and Tightly Coupled 3D LiDAR Inertial Mapping
    Koide, Kenji
    Yokozuka, Masashi
    Oishi, Shuji
    Banno, Atsuhiko
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2022), 2022, : 5622 - 5628
  • [3] A Visual-attention-based 3D Mapping Method for Mobile Robots
    Guo B.
    Dai H.
    Li Z.
    Science Press, 43: 1248 - 1256
  • [4] Tightly coupled 3D indoor SLAM based on multi-sensor
    Li, Chunlei
    Chen, Jiupeng
    San, Hongjun
    Li, Yueyang
    Peng, Zhen
    Yi Qi Yi Biao Xue Bao/Chinese Journal of Scientific Instrument, 2024, 45 (07): 121 - 131
  • [5] Visual positioning method based on line laser 3D measurement system
    Luo, Zai
    Zhao, Hongnan
    Jiang, Wensong
    Cai, Zeliang
    Yang, Li
    TENTH INTERNATIONAL SYMPOSIUM ON PRECISION MECHANICAL MEASUREMENTS, 2021, 12059
  • [6] Loop Detection and Correction of 3D Laser-Based SLAM with Visual Information
    Zhu, Zulun
    Yang, Shaowu
    Dai, Huadong
    Li, Fu
    PROCEEDINGS OF THE 31ST INTERNATIONAL CONFERENCE ON COMPUTER ANIMATION AND SOCIAL AGENTS (CASA 2016), 2015, : 53 - 58
  • [7] Visual and LiDAR-based for The Mobile 3D Mapping
    Wu, Qiao
    Sun, Kai
    Zhang, Wenjun
    Huang, Chaobing
    Wu, Xiaochun
    2016 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2016, : 1522 - 1527
  • [8] Laser walkabout mapping in 3D
    [Anonymous]
    FOREST PRODUCTS JOURNAL, 1996, 46 (06) : 16 - 16
  • [9] A tightly coupled SLAM method for precise urban mapping
    Sun X.
    Guan H.
    Su Y.
    Xu G.
    Guo Q.
    Cehui Xuebao/Acta Geodaetica et Cartographica Sinica, 2021, 50 (11): 1585 - 1593
  • [10] Tightly Coupled 3D Lidar Inertial SLAM for Ground Robot
    Li, Daosheng
    Sun, Bo
    Liu, Ruyu
    Xue, Ruilei
    ELECTRONICS, 2023, 12 (07)