Robust RGB-D visual odometry based on edges and points

Cited by: 17
Authors
Yao, Erliang [1 ]
Zhang, Hexin [1 ]
Xu, Hui [1 ]
Song, Haitao [1 ]
Zhang, Guoliang [2 ]
Affiliations
[1] High Tech Inst Xian, Dept Control Engn, Xian, Shaanxi, Peoples R China
[2] Chengdu Univ Informat Technol, Coll Controlling Engn, Chengdu, Sichuan, Peoples R China
Keywords
Localization; Visual odometry; Dynamic environments; Edge alignment; Bundle adjustment; SLAM;
DOI
10.1016/j.robot.2018.06.009
Chinese Library Classification
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Localization in unknown environments is a fundamental requirement for robots, and ego-motion estimation based on visual information is an active research topic. However, most visual odometry (VO) or visual Simultaneous Localization and Mapping (vSLAM) approaches assume static environments. To achieve robust and precise localization in dynamic environments, we propose a novel VO based on edges and points for RGB-D cameras. In contrast to dense motion segmentation, sparse edge alignment with distance transform (DT) errors is adopted to detect the states of image areas. Features in dynamic areas are ignored in ego-motion estimation with reprojection errors. Meanwhile, static weights calculated from DT errors are incorporated into pose estimation. Furthermore, local bundle adjustment is utilized to improve the consistency of the local map and the camera localization. The proposed approach runs in real time. Experiments are conducted on the challenging sequences of the TUM RGB-D dataset. The results demonstrate that the proposed robust VO achieves more accurate and more stable localization than the state-of-the-art robust VO or SLAM approaches in dynamic environments. (C) 2018 Elsevier B.V. All rights reserved.
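The abstract's central mechanism is down-weighting features in dynamic image areas using DT errors during pose estimation. The minimal Python sketch below illustrates how such weights could enter a reprojection-error cost; the Cauchy-style weighting function, the `sigma` parameter, and the helper names `static_weight` and `weighted_reprojection_cost` are assumptions for illustration, since the abstract does not give the exact formulation.

```python
import numpy as np

# Hedged sketch: the abstract states that static weights computed from
# distance-transform (DT) errors are added to pose estimation, but does not
# give the weighting function. The Cauchy-style form and `sigma` below are
# assumptions for illustration only.
def static_weight(dt_error, sigma=2.0):
    """Map a DT error (pixels) to a weight in (0, 1]; larger errors get smaller weights."""
    return 1.0 / (1.0 + (dt_error / sigma) ** 2)

def weighted_reprojection_cost(points_3d, obs_2d, dt_errors, K, R, t):
    """Sum of static-weighted squared reprojection errors for one frame.

    points_3d : (N, 3) map points in world coordinates
    obs_2d    : (N, 2) measured pixel locations of the matched features
    dt_errors : (N,)   DT errors of the image areas the features fall in
    K         : (3, 3) camera intrinsics
    R, t      : world-to-camera rotation (3, 3) and translation (3,)
    """
    p_cam = (R @ points_3d.T).T + t    # transform points into the camera frame
    uv_h = (K @ p_cam.T).T             # project with the pinhole model
    uv = uv_h[:, :2] / uv_h[:, 2:3]
    residuals = np.linalg.norm(uv - obs_2d, axis=1)
    w = static_weight(dt_errors)       # down-weight likely dynamic areas
    return float(np.sum(w * residuals ** 2))
```

In a Gauss-Newton or Levenberg-Marquardt loop, this cost (or its weighted residual vector) would be minimized over R and t, so that edges and points whose DT errors indicate motion contribute little to the estimated pose.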
Pages: 209-220
Page count: 12
Related Papers
50 items in total
  • [31] An RGB-D SLAM Algorithm Based on Semi-direct Visual Odometry. Gu X.; Yang M.; Zhang Y.; Liu K. Jiqiren/Robot, 2020, 42(01): 39-48
  • [32] A Fast Visual Odometry and Mapping System for RGB-D Cameras. Silva, Bruno M. F.; Goncalves, Luiz M. G. 2014 2nd Brazilian Robotics Symposium (SBR) / 11th Latin American Robotics Symposium (LARS) / 6th Robocontrol Workshop on Applied Robotics and Automation, 2014: 55-60
  • [33] Robust RGB-D Visual Odometry Based on the Line Intersection Structure Feature in Low-Textured Scenes. Li, Xianlong; Zhang, Chongyang. Proceedings of 2018 5th IEEE International Conference on Cloud Computing and Intelligence Systems (CCIS), 2018: 390-394
  • [34] Fast Visual Odometry and Mapping from RGB-D Data. Dryanovski, Ivan; Valenti, Roberto G.; Xiao, Jizhong. 2013 IEEE International Conference on Robotics and Automation (ICRA), 2013: 2305-2310
  • [35] A Novel Hybrid Visual Odometry Using an RGB-D Camera. Wang, Huiguo; Wu, Xinyu; Chen, Zhiheng; He, Yong. Proceedings 2018 33rd Youth Academic Annual Conference of Chinese Association of Automation (YAC), 2018: 47-51
  • [36] Sparse Edge Visual Odometry using an RGB-D Camera. Hsu, Jhih-Lei; Lin, Huei-Yung. 2017 11th Asian Control Conference (ASCC), 2017: 964-969
  • [37] Visual Odometry using RGB-D Camera on Ceiling Vision. Wang, Han; Mou, Wei; Suratno, Hendra; Seet, Gerald; Li, Maohai; Lau, M. W. S.; Wang, Danwei. 2012 IEEE International Conference on Robotics and Biomimetics (ROBIO 2012), 2012
  • [38] Bi-direction Direct RGB-D Visual Odometry. Cai, Jiyuan; Luo, Lingkun; Hu, Shiqiang. Applied Artificial Intelligence, 2020, 34(14): 1137-1158
  • [39] Dense RGB-D visual odometry using inverse depth. Gutierrez-Gomez, Daniel; Mayol-Cuevas, Walterio; Guerrero, J. J. Robotics and Autonomous Systems, 2016, 75: 571-583
  • [40] Robust depth-verified RGB-D visual odometry with structural regularities for indoor environments. Xing, Jing; Zhong, Qixue; Liu, Jian. Measurement Science and Technology, 2024, 35(03)