Real-time motion removal based on point correlations for RGB-D SLAM in indoor dynamic environments

Cited by: 6
Authors
Wang, Kesai [1 ]
Yao, Xifan [1 ]
Ma, Nanfeng [1 ]
Jing, Xuan [1 ]
Affiliations
[1] South China Univ Technol, Sch Mech & Automot Engn, Guangzhou 510641, Guangdong, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2023, Vol. 35, Issue 12
Funding
National Natural Science Foundation of China
Keywords
Simultaneous localization and mapping (SLAM); Dynamic environments; Ego-motion; Pose estimation; Robust
DOI
10.1007/s00521-022-07879-x
CLC Classification Number
TP18 [Artificial Intelligence Theory]
Subject Classification Codes
081104; 0812; 0835; 1405
Abstract
Traditional visual simultaneous localization and mapping (SLAM) systems rely heavily on the static-world assumption, which makes them prone to tracking failure in dynamic environments. In this paper, we propose a real-time dynamic visual SLAM system (RTDSLAM) based on ORB-SLAM3 to achieve accurate camera pose estimation in indoor dynamic environments. We regard the static objects in the environment as a single virtual rigid body and add two motion removal modules that handle dynamic feature points without the aid of the camera's ego-motion. The geometry-based motion removal module exploits point correlations and the structural invariance of a rigid body to detect sparse dynamic feature points between two keyframes, and clustering of the depth images helps recover the complete dynamic regions. Meanwhile, the template-based motion removal module uses template matching to rapidly track known moving objects between ordinary frames. Dynamic feature points located on moving objects are removed, and only static feature points are retained for pose estimation. We evaluate our system on the public TUM and Bonn datasets, and comparison with state-of-the-art dynamic visual SLAM systems shows our advantages in both runtime and pose estimation accuracy. In addition, tests in real-world scenes demonstrate the effectiveness of our system in dynamic environments.
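The geometry-based module's core idea, that Euclidean distances between static points are preserved under rigid camera motion while a moving point violates them, can be illustrated with a minimal sketch. The function name, tolerance, and majority-voting scheme below are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def flag_dynamic_points(pts_a, pts_b, dist_tol=0.05, vote_ratio=0.5):
    """Flag matched 3D feature points as dynamic via rigid-body
    distance invariance (a sketch of the idea, not the paper's method).

    pts_a, pts_b: (N, 3) arrays of matched 3D points observed in two
    keyframes. Distances between static points are preserved by rigid
    camera motion, so a point whose distances to most other points
    change is likely to lie on a moving object.
    """
    # Pairwise distance matrices within each keyframe.
    da = np.linalg.norm(pts_a[:, None, :] - pts_a[None, :, :], axis=-1)
    db = np.linalg.norm(pts_b[:, None, :] - pts_b[None, :, :], axis=-1)
    # A pair "votes" against both its endpoints if its distance changed.
    changed = np.abs(da - db) > dist_tol
    votes = changed.sum(axis=1) / (len(pts_a) - 1)
    return votes > vote_ratio  # True -> flagged dynamic
```

In this toy setting, translating one point between the two frames changes its distance to every other point, so it collects a majority of violation votes, while each static point only disagrees with the single moving one.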
Pages: 8707 - 8722 (16 pages)
Related Papers (50 total)
  • [1] Real-time motion removal based on point correlations for RGB-D SLAM in indoor dynamic environments
    Kesai Wang
    Xifan Yao
    Nanfeng Ma
    Xuan Jing
    Neural Computing and Applications, 2023, 35 : 8707 - 8722
  • [2] RGB-D SLAM in Dynamic Environments Using Point Correlations
    Dai, Weichen
    Zhang, Yu
    Li, Ping
    Fang, Zheng
    Scherer, Sebastian
    IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, 2022, 44 (01) : 373 - 389
  • [3] Motion removal for reliable RGB-D SLAM in dynamic environments
    Sun, Yuxiang
    Liu, Ming
    Meng, Max Q. -H.
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2018, 108 : 115 - 128
  • [4] Towards Real-time Semantic RGB-D SLAM in Dynamic Environments
    Ji, Tete
    Wang, Chen
    Xie, Lihua
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2021), 2021, : 11175 - 11181
  • [5] Strong-SLAM: real-time RGB-D visual SLAM in dynamic environments based on StrongSORT
    Huang, Wei
    Zou, Chunlong
    Yun, Juntong
    Jiang, Du
    Huang, Li
    Liu, Ying
    Jiang, Guo Zhang
    Xie, Yuanmin
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2024, 35 (12)
  • [6] Improving RGB-D SLAM in dynamic environments: A motion removal approach
    Sun, Yuxiang
    Liu, Ming
    Meng, Max Q. -H.
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2017, 89 : 110 - 122
  • [7] Real-time SLAM algorithm based on RGB-D data
    Fu, Mengyin
    Lü, Xianwei
    Liu, Tong
    Yang, Yi
    Li, Xinghe
    Li, Yu
    Jiqiren/Robot, 2015, 37 (06): 683 - 692
  • [8] Accurate RGB-D SLAM in dynamic environments based on dynamic visual feature removal
    Liu, Chenxin
    Qin, Jiahu
    Wang, Shuai
    Yu, Lei
    Wang, Yaonan
    SCIENCE CHINA-INFORMATION SCIENCES, 2022, 65 (10) : 256 - 269