Feature-Based SLAM: Why Simultaneous Localisation and Mapping?

Cited by: 0
Authors
Zhao, Liang [1 ]
Mao, Zhehua [1 ]
Huang, Shoudong [1 ]
Affiliations
[1] Univ Technol Sydney, Ctr Autonomous Syst, Fac Engn & Informat Technol, Sydney, NSW, Australia
Keywords
EFFICIENT; ALGORITHM; SYNC
DOI
Not available
CLC number
TP [Automation technology, computer technology]
Subject classification code
0812
Abstract
In this paper, we first prove an interesting result for point-feature-based SLAM: "When the covariance matrices of the feature observation errors are isotropic, the robot poses and feature positions obtained in each Gauss-Newton iteration (when solving a reformulated least-squares-optimisation-based SLAM) are independent of the feature positions in the previous step." That is, even if we reset the feature positions to different random values before each iteration, the results after the iteration never change. Building on this finding, we propose an algorithm that solves for the robot poses only ("localisation") and show that it generates exactly the same robot poses in each iteration as the Gauss-Newton method applied to the full SLAM problem. The optimal feature positions can then be recovered using a closed-form formula once the optimal robot poses are obtained. Similarly, when the covariance matrices of the odometry translation errors are also isotropic, we prove that the SLAM results are independent of both the feature positions and the robot positions. Thus, we obtain a "rotation-only algorithm" that generates the same robot rotations as the full SLAM. Again, the optimal robot positions and the optimal feature positions can be computed from the obtained optimal robot rotations using closed-form formulas. We use multiple 2D and 3D SLAM datasets to demonstrate these findings. A video showing the interesting convergence results can be found at https://youtu.be/j1T8toyGtDE . We expect that the findings in this paper will help SLAM researchers to further understand the special structure of the SLAM problem and to develop more efficient and reliable SLAM algorithms.
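As an illustration of the kind of closed-form feature recovery the abstract refers to (a minimal sketch, not the paper's implementation): assuming the standard point-feature observation model z_ij = R_i^T (f_j - p_i) + noise with isotropic observation covariance, and with the robot poses (R_i, p_i) fixed, minimising the observation residuals over a feature position f_j reduces to averaging the observations mapped into the world frame, because rotations preserve norms. Function and variable names below are illustrative, not taken from the paper.

```python
import numpy as np

def recover_feature_positions(poses, observations):
    """Closed-form feature recovery given fixed robot poses (sketch).

    Assumes z_ij = R_i^T (f_j - p_i) + noise with isotropic covariance.
    Since R_i is orthogonal,
        sum_i ||R_i^T (f_j - p_i) - z_ij||^2 = sum_i ||f_j - (p_i + R_i z_ij)||^2,
    so the minimising f_j is the mean of the observations mapped into the
    world frame.

    poses:        dict {i: (R_i, p_i)} with R_i a 2x2/3x3 rotation matrix
    observations: dict {j: [(i, z_ij), ...]} local-frame observations of feature j
    returns:      dict {j: f_j} recovered feature positions
    """
    features = {}
    for j, obs in observations.items():
        # Map each local observation into the world frame, then average.
        world_points = [poses[i][1] + poses[i][0] @ np.asarray(z) for i, z in obs]
        features[j] = np.mean(world_points, axis=0)
    return features

# Hypothetical example: one 2D feature at [1, 2] seen from two poses
# (identity rotation at the origin, and a 90-degree turn at [2, 0]).
R0, p0 = np.eye(2), np.array([0.0, 0.0])
R1, p1 = np.array([[0.0, -1.0], [1.0, 0.0]]), np.array([2.0, 0.0])
poses = {0: (R0, p0), 1: (R1, p1)}
obs = {0: [(0, np.array([1.0, 2.0])), (1, np.array([2.0, 1.0]))]}
print(recover_feature_positions(poses, obs))  # {0: array([1., 2.])}
```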
Pages: 10
Related papers
50 in total
  • [1] Feature-based SLAM for Dense Mapping
    Li, Ping
    Ke, Zhongming
    [J]. 2019 INTERNATIONAL CONFERENCE ON ADVANCED MECHATRONIC SYSTEMS (ICAMECHS), 2019, : 372 - 377
  • [2] Feature based mapping procedure with application on Simultaneous Localization and Mapping (SLAM)
    Luca, Razvan
    Troester, Fritz
    Gall, Robert
    Simion, Carmen
    [J]. ROBOTICS AND AUTOMATION SYSTEMS, 2010, 166-167 : 265 - +
  • [3] Feature-based visual simultaneous localization and mapping: a survey
    Azzam, Rana
    Taha, Tarek
    Huang, Shoudong
    Zweiri, Yahya
    [J]. SN APPLIED SCIENCES, 2020, 2 (02)
  • [4] Feature-based visual simultaneous localization and mapping: a survey
    Rana Azzam
    Tarek Taha
    Shoudong Huang
    Yahya Zweiri
    [J]. SN Applied Sciences, 2020, 2
  • [5] SIMULTANEOUS LOCALIZATION AND MAPPING: A FEATURE-BASED PROBABILISTIC APPROACH
    Skrzypczynski, Piotr
    [J]. INTERNATIONAL JOURNAL OF APPLIED MATHEMATICS AND COMPUTER SCIENCE, 2009, 19 (04) : 575 - 588
  • [6] Dense Mapping from Feature-Based Monocular SLAM Based on Depth Prediction
    Duan, Yongli
    Zhang, Jing
    Yang, Lingyu
    [J]. 2018 IEEE CSAA GUIDANCE, NAVIGATION AND CONTROL CONFERENCE (CGNCC), 2018,
  • [7] TTT SLAM: A feature-based bathymetric SLAM framework
    Zhang, Qianyi
    Kim, Jinwhan
    [J]. OCEAN ENGINEERING, 2024, 294
  • [8] Semantic Optimization of Feature-Based SLAM
    Li, Peng
    Yin, Lili
    Gao, Jiali
    Sun, Yuezhongyi
    [J]. MATHEMATICAL PROBLEMS IN ENGINEERING, 2021, 2021
  • [9] SLAM++: Simultaneous Localisation and Mapping at the Level of Objects
    Salas-Moreno, Renato F.
    Newcombe, Richard A.
    Strasdat, Hauke
    Kelly, Paul H. J.
    Davison, Andrew J.
    [J]. 2013 IEEE CONFERENCE ON COMPUTER VISION AND PATTERN RECOGNITION (CVPR), 2013, : 1352 - 1359
  • [10] FAST INITIALIZATION FOR FEATURE-BASED MONOCULAR SLAM
    Zhang, Shaobo
    Liu, Sheng
    Zhang, Jianhua
    Wang, Zhenhua
    Wang, Xiaoyan
    [J]. 2017 24TH IEEE INTERNATIONAL CONFERENCE ON IMAGE PROCESSING (ICIP), 2017, : 2119 - 2123