RLD-SLAM: A Robust Lightweight VI-SLAM for Dynamic Environments Leveraging Semantics and Motion Information

Cited by: 13
Authors
Zheng, Zengrui [1 ,2 ]
Lin, Shifeng [1 ,2 ]
Yang, Chenguang [3 ]
Affiliations
[1] South China Univ Technol, Sch Automat Sci & Engn, Key Lab Autonomous Syst & Networked Control, Minist Educ, Guangzhou 510640, Peoples R China
[2] South China Univ Technol, GuangDong Engn Technol Res Ctr Control Intelligent, Sch Automat Sci & Engn, Guangzhou, Peoples R China
[3] Univ Liverpool, Dept Comp Sci, Liverpool L69 3BX, England
Keywords
Mobile robot; multisensor fusion; robot state estimation; simultaneous localization and mapping (SLAM); TRACKING;
DOI
10.1109/TIE.2024.3363744
CLC number (Chinese Library Classification)
TP [Automation Technology, Computer Technology];
Discipline classification code
0812;
Abstract
Existing mainstream dynamic simultaneous localization and mapping (SLAM) methods can be categorized into image segmentation-based and object detection-based approaches. The former achieve high accuracy but carry a heavy computational burden, while the latter run faster but with lower accuracy. In this article, we propose RLD-SLAM, a robust lightweight visual-inertial SLAM for dynamic environments that leverages semantic and motion information. Our approach combines object detection and Bayesian filtering to maintain high accuracy while quickly acquiring static feature points. In addition, to address the difficulty that semantic-based dynamic SLAM faces in highly dynamic scenes, RLD-SLAM leverages motion information from the inertial measurement unit (IMU) to assist in tracking dynamic objects and maximizes the use of static features in the environment. We conduct experiments with the proposed method on indoor and outdoor datasets and on an unmanned ground vehicle. The experimental results demonstrate that our method surpasses current state-of-the-art algorithms, particularly in highly dynamic environments.
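The abstract describes the combination of object detection and Bayesian filtering only at a high level. As a minimal illustrative sketch of how such a combination could separate static from dynamic feature points, the Python snippet below maintains a per-feature probability of being dynamic and updates it each frame according to whether the feature falls inside a potentially dynamic detection box. The class list, likelihood values, and threshold (POTENTIALLY_DYNAMIC, P_OBS_GIVEN_DYNAMIC, P_OBS_GIVEN_STATIC, STATIC_THRESHOLD) are assumptions for illustration and are not taken from the paper.

```python
# Illustrative sketch only: a per-feature Bayesian update of the probability that a
# tracked feature point is dynamic, driven by object-detection bounding boxes.
# All constants below are assumed values, not parameters from RLD-SLAM.

POTENTIALLY_DYNAMIC = {"person", "car", "bicycle"}  # assumed class list
P_OBS_GIVEN_DYNAMIC = 0.9   # assumed likelihood of lying in a dynamic box if truly dynamic
P_OBS_GIVEN_STATIC = 0.2    # assumed likelihood of lying in a dynamic box if actually static
STATIC_THRESHOLD = 0.5      # features below this posterior are treated as static

def in_dynamic_box(point, detections):
    """Return True if the 2-D feature lies inside any potentially dynamic bounding box."""
    x, y = point
    return any(
        cls in POTENTIALLY_DYNAMIC and x1 <= x <= x2 and y1 <= y <= y2
        for cls, (x1, y1, x2, y2) in detections
    )

def update_dynamic_probability(prior, observed_in_box):
    """One Bayesian filtering step for P(feature is dynamic)."""
    if observed_in_box:
        num = P_OBS_GIVEN_DYNAMIC * prior
        den = num + P_OBS_GIVEN_STATIC * (1.0 - prior)
    else:
        num = (1.0 - P_OBS_GIVEN_DYNAMIC) * prior
        den = num + (1.0 - P_OBS_GIVEN_STATIC) * (1.0 - prior)
    return num / den

def select_static_features(features, priors, detections):
    """Keep only features whose posterior dynamic probability stays below the threshold."""
    posteriors, static_features = [], []
    for point, prior in zip(features, priors):
        p = update_dynamic_probability(prior, in_dynamic_box(point, detections))
        posteriors.append(p)
        if p < STATIC_THRESHOLD:
            static_features.append(point)
    return static_features, posteriors
```

Under this kind of scheme, a feature briefly covered by a detection box is not immediately discarded; its dynamic probability rises only gradually, which is one plausible way a Bayesian filter can keep accuracy high while still acquiring static points quickly.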
Pages: 14328-14338
Number of pages: 11
Related papers
49 records in total
  • [41] CD-SLAM: A Real-Time Stereo Visual-Inertial SLAM for Complex Dynamic Environments With Semantic and Geometric Information
    Wen, Shuhuan
    Tao, Sheng
    Liu, Xin
    Babiarz, Artur
    Yu, F. Richard
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2024, 73: 1-8
  • [42] A Multi-Strategy Visual SLAM System for Motion Blur Handling in Indoor Dynamic Environments
    Huai, Shuo
    Cao, Long
    Zhou, Yang
    Guo, Zhiyang
    Gai, Jingyao
    SENSORS, 2025, 25 (06)
  • [43] A Visual SLAM Robust against Dynamic Objects Based on Hybrid Semantic-Geometry Information
    Miao, Sheng
    Liu, Xiaoxiong
    Wei, Dazheng
    Li, Changze
    ISPRS INTERNATIONAL JOURNAL OF GEO-INFORMATION, 2021, 10 (10)
  • [44] A robust RGB-D SLAM based on multiple geometric features and semantic segmentation in dynamic environments
    Kuang, Benfa
    Yuan, Jie
    Liu, Qiang
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2023, 34 (01)
  • [45] Robust RGB-D SLAM in highly dynamic environments based on probability observations and clustering optimization
    Liu, Hailin
    Tian, Liangfang
    Du, Qiliang
    Xu, Wenjie
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (03)
  • [46] COEB-SLAM: A Robust VSLAM in Dynamic Environments Combined Object Detection, Epipolar Geometry Constraint, and Blur Filtering
    Min, Feiyan
    Wu, Zibin
    Li, Deping
    Wang, Gao
    Liu, Ning
    IEEE SENSORS JOURNAL, 2023, 23 (21): 26279-26291
  • [47] Real-time motion removal based on point correlations for RGB-D SLAM in indoor dynamic environments
    Wang, Kesai
    Yao, Xifan
    Ma, Nanfeng
    Jing, Xuan
    NEURAL COMPUTING & APPLICATIONS, 2023, 35 (12): 8707-8722
  • [48] IDMF-VINS: Improving Visual-Inertial SLAM for Complex Dynamic Environments With Motion Consistency and Feature Filtering
    Peng, Xuanzhi
    Tong, Pengfei
    Yang, Xuerong
    Wang, Chen
    Zou, An-Min
    IEEE SENSORS JOURNAL, 2025, 25 (04): 6995-7005