Real-time motion removal based on point correlations for RGB-D SLAM in indoor dynamic environments

Cited: 6
Authors
Wang, Kesai [1 ]
Yao, Xifan [1 ]
Ma, Nanfeng [1 ]
Jing, Xuan [1 ]
Affiliations
[1] South China Univ Technol, Sch Mech & Automot Engn, Guangzhou 510641, Guangdong, Peoples R China
Source
NEURAL COMPUTING & APPLICATIONS | 2023, Vol. 35, Iss. 12
Funding
National Natural Science Foundation of China
关键词
Simultaneous localization and mapping (SLAM); Dynamic environments; Ego-motion; Pose estimation; Robust
DOI
10.1007/s00521-022-07879-x
CLC Number
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Traditional visual simultaneous localization and mapping (SLAM) systems rely heavily on the static-world assumption, which makes them prone to tracking failure in dynamic environments. In this paper, we propose a real-time dynamic visual SLAM system (RTDSLAM) based on ORB-SLAM3 to achieve accurate camera pose estimation in indoor dynamic environments. We regard the static objects in the environment as a single complete virtual rigid body and add two motion removal modules that handle dynamic feature points without the aid of the camera's ego-motion. The geometry-based motion removal module exploits point correlations and the structural invariance of a rigid body to detect sparse dynamic feature points between two keyframes, and clustering of the depth image helps recover the complete dynamic regions. Meanwhile, the template-based motion removal module uses template matching to rapidly track known moving objects between ordinary frames. Dynamic feature points located on moving objects are removed, and only static feature points are retained for pose estimation. We evaluate our system on the public TUM and Bonn datasets; comparison with state-of-the-art dynamic visual SLAM systems shows advantages in both runtime and pose estimation accuracy. In addition, tests in real-world scenes demonstrate the effectiveness of our system in dynamic environments.
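The abstract names two mechanisms: a rigid-body structural-invariance test on pairwise point distances between keyframes, and template matching to re-find known moving objects between ordinary frames. Below is a minimal, illustrative Python sketch of both ideas, not the authors' implementation: the function names, the thresholds tol, ratio, and score_thresh, and the voting rule are all assumptions introduced here for clarity.

```python
import numpy as np
import cv2

def dynamic_point_mask(pts_prev, pts_curr, tol=0.05, ratio=0.6):
    """Flag matched 3D feature points that violate rigid-body invariance.

    pts_prev, pts_curr : (N, 3) arrays of matched 3D points (metres),
        back-projected from two keyframes using the RGB-D depth.
    tol   : allowed change in pairwise distance, in metres (hypothetical).
    ratio : a point is flagged dynamic if more than this fraction of its
        pairwise distances to the other points changed by more than tol.

    Intuition: if the static scene is one virtual rigid body, the distance
    between any two static points is preserved across frames; points on
    moving objects break many of these point correlations.
    """
    d_prev = np.linalg.norm(pts_prev[:, None, :] - pts_prev[None, :, :], axis=-1)
    d_curr = np.linalg.norm(pts_curr[:, None, :] - pts_curr[None, :, :], axis=-1)
    changed = np.abs(d_curr - d_prev) > tol       # (N, N) violation matrix
    np.fill_diagonal(changed, False)              # ignore self-distances
    frac = changed.sum(axis=1) / max(len(pts_prev) - 1, 1)
    return frac > ratio                           # True = likely dynamic

def track_known_object(frame_gray, template_gray, score_thresh=0.8):
    """Quickly re-locate a known moving object in an ordinary frame via
    normalised cross-correlation template matching (OpenCV)."""
    res = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(res)    # best-match score/location
    if score < score_thresh:
        return None                               # object lost or occluded
    h, w = template_gray.shape
    return (*top_left, w, h)                      # bounding box (x, y, w, h)
```

In a pipeline following the abstract's outline, points flagged by dynamic_point_mask would seed a depth-image clustering step that grows the sparse detections into complete dynamic regions, while track_known_object maintains those regions between keyframes; only the remaining static features would be passed to pose estimation.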
Pages: 8707-8722
Number of pages: 16
Related Papers (50 records)
  • [21] RGB-D Sensor Based Real-time 6DoF-SLAM
    Chen, Hsi-Yuan
    Lin, Chyi-Yeu
    2014 INTERNATIONAL CONFERENCE ON ADVANCED ROBOTICS AND INTELLIGENT SYSTEMS (ARIS 2014), 2014, : 61 - 65
  • [22] GPU-Based Real-Time RGB-D 3D SLAM
    Lee, Donghwa
    Kim, Hyongjin
    Myung, Hyun
2012 9TH INTERNATIONAL CONFERENCE ON UBIQUITOUS ROBOTS AND AMBIENT INTELLIGENCE (URAI), 2012, : 46 - 48
  • [23] Dense Piecewise Planar RGB-D SLAM for Indoor Environments
    Le, Phi-Hung
    Kosecka, Jana
    2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2017, : 4944 - 4949
  • [24] Robust Real-time RGB-D Visual Odometry in Dynamic Environments via Rigid Motion Model
    Lee, Sangil
    Son, Clark Youngdong
    Kim, H. Jin
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 6891 - 6898
  • [25] Real-time 3-D Mapping for Indoor Environments using RGB-D Cameras
    Chen, Liang-Chia
    Nguyen Van Thai
    ADVANCED MANUFACTURING FOCUSING ON MULTI-DISCIPLINARY TECHNOLOGIES, 2012, 579 : 435 - 444
  • [26] RGB-D Object SLAM Using Quadrics for Indoor Environments
    Liao, Ziwei
    Wang, Wei
    Qi, Xianyu
    Zhang, Xiaoyu
    SENSORS, 2020, 20 (18) : 1 - 34
  • [27] Ground Enhanced RGB-D SLAM for Dynamic Environments
    Guo, Ruibin
    Liu, Xinghua
    2021 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (IEEE-ROBIO 2021), 2021, : 1171 - 1177
  • [28] Robust and Efficient RGB-D SLAM in Dynamic Environments
    Yang, Xin
    Yuan, Zikang
    Zhu, Dongfu
    Chi, Cheng
    Li, Kun
    Liao, Chunyuan
    IEEE TRANSACTIONS ON MULTIMEDIA, 2021, 23 : 4208 - 4219
  • [29] Semantic Segmentation based Dense RGB-D SLAM in Dynamic Environments
    Zhang, Jianbo
    Liu, Yanjie
    Chen, Junguo
    Ma, Liulong
    Jin, Dong
    Chen, Jiao
    2019 3RD INTERNATIONAL CONFERENCE ON ARTIFICIAL INTELLIGENCE, AUTOMATION AND CONTROL TECHNOLOGIES (AIACT 2019), 2019, 1267
  • [30] RGB-D SLAM Algorithm in Indoor Dynamic Environments Based on Gridding Segmentation and Dual Map Coupling
    Ai Q.
    Wang W.
    Liu G.
Jiqiren/Robot, 2022, 44 (04) : 431 - 442