Robust Stereo Visual Odometry Using Improved RANSAC-Based Methods for Mobile Robot Localization

Cited by: 13
|
Authors
Liu, Yanqing [1 ,2 ]
Gu, Yuzhang [1 ,2 ]
Li, Jiamao [1 ,2 ]
Zhang, Xiaolin [1 ,2 ]
Affiliations
[1] Chinese Acad Sci, Biovis Syst Lab, Shanghai Inst Microsyst & Informat Technol, Shanghai 200050, Peoples R China
[2] Univ Chinese Acad Sci, Beijing 100049, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
visual odometry; stereovision; robust estimation; motion estimation; RANSAC; SLAM;
DOI
10.3390/s17102339
Chinese Library Classification (CLC) Number
O65 [Analytical Chemistry];
Subject Classification Code
070302; 081704;
Abstract
In this paper, we present a novel approach to stereo visual odometry with robust motion estimation that is faster and more accurate than standard RANSAC (Random Sample Consensus). Our method improves RANSAC in three respects: first, hypotheses are generated preferentially by sampling the input feature points in order of feature age and similarity; second, hypotheses are evaluated with the SPRT (Sequential Probability Ratio Test), so bad hypotheses are discarded quickly without verifying all data points; third, we aggregate the three best hypotheses to obtain the final estimate instead of selecting only the best one. The first two aspects speed up RANSAC by generating good hypotheses and discarding bad ones early, respectively; the third improves the accuracy of motion estimation. Our method was evaluated on the KITTI (Karlsruhe Institute of Technology and Toyota Technological Institute) and New Tsukuba datasets. Experimental results show that the proposed method outperforms RANSAC in both speed and accuracy.
Pages: 18
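The abstract describes three modifications to RANSAC but gives no pseudocode. The following is a minimal sketch of how such a prioritized, SPRT-preempted, hypothesis-aggregating RANSAC loop could look for 3-D/3-D motion estimation, assuming triangulated stereo feature pairs `src`/`dst`, a per-feature `priority` weight standing in for the paper's age/similarity ordering, and placeholder SPRT parameters (`eps`, `delta`, `sprt_A`). Refitting on the union of the top-k inlier sets is one plausible reading of "aggregate the three best hypotheses", not necessarily the authors' exact rule.

```python
import numpy as np


def estimate_rigid_transform(src, dst):
    """Closed-form least-squares rigid transform (R, t) from paired 3-D points (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    return R, dst_c - R @ src_c


def point_error(R, t, p, q):
    """Euclidean residual of one correspondence under hypothesis (R, t)."""
    return np.linalg.norm(R @ p + t - q)


def improved_ransac(src, dst, priority, n_hypotheses=200, thresh=0.05,
                    eps=0.2, delta=0.05, sprt_A=100.0, top_k=3, seed=0):
    rng = np.random.default_rng(seed)
    n = len(src)
    weights = priority / priority.sum()          # (1) priority-guided minimal-set sampling
    scored = []                                  # (inlier_count, inlier_mask) of surviving hypotheses
    for _ in range(n_hypotheses):
        idx = rng.choice(n, size=3, replace=False, p=weights)
        R, t = estimate_rigid_transform(src[idx], dst[idx])
        # (2) SPRT evaluation: abort as soon as the likelihood ratio flags a bad model.
        lam, inliers, rejected = 1.0, np.zeros(n, dtype=bool), False
        for j in rng.permutation(n):
            if point_error(R, t, src[j], dst[j]) < thresh:
                inliers[j] = True
                lam *= delta / eps                    # point consistent with the model
            else:
                lam *= (1.0 - delta) / (1.0 - eps)    # point inconsistent with the model
            if lam > sprt_A:                          # likelihood ratio exceeds the rejection bound
                rejected = True
                break
        if not rejected:
            scored.append((int(inliers.sum()), inliers))
    if not scored:
        raise RuntimeError("all hypotheses rejected; relax thresh or sprt_A")
    # (3) Aggregate the top-k hypotheses: refit on the union of their inlier sets.
    scored.sort(key=lambda s: s[0], reverse=True)
    union = np.zeros(n, dtype=bool)
    for _, mask in scored[:top_k]:
        union |= mask
    return estimate_rigid_transform(src[union], dst[union])
```

A call might look like `R, t = improved_ransac(pts_prev, pts_curr, track_ages)`, where `pts_prev` and `pts_curr` are N×3 arrays of points triangulated from consecutive stereo frames and `track_ages` is a positive weight per feature; all three names are illustrative, not from the paper.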
Related Papers
50 records in total
  • [21] Stereo Visual Odometry and Semantics based Localization of Aerial Robots in Indoor Environments
    Bavle, Hriday
    Manthe, Stephan
    de la Puente, Paloma
    Rodriguez-Ramos, Alejandro
    Sampedro, Carlos
    Campoy, Pascual
    2018 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2018, : 1018 - 1023
  • [22] A fast RANSAC-based registration algorithm for accurate localization in unknown environments using LIDAR measurements
    Fontanelli, Daniele
    Ricciato, Luigi
    Soatto, Stefano
    2007 IEEE INTERNATIONAL CONFERENCE ON AUTOMATION SCIENCE AND ENGINEERING, VOLS 1-3, 2007, : 963 - 968
  • [23] Patch-based Stereo Direct Visual Odometry Robust to Illumination Changes
    Jung, Jae Hyung
    Heo, Sejong
    Park, Chan Gook
    INTERNATIONAL JOURNAL OF CONTROL AUTOMATION AND SYSTEMS, 2019, 17 (03) : 743 - 751
  • [24] Patch-based Stereo Direct Visual Odometry Robust to Illumination Changes
    Jae Hyung Jung
    Sejong Heo
    Chan Gook Park
    International Journal of Control, Automation and Systems, 2019, 17 : 743 - 751
  • [25] Localization of mobile robot using visual system
    Mikulova, Zuzana
    Duchon, Frantisek
    Dekan, Martin
    Babinec, Andrej
INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS, 2017, 14 (05): 1 - 11
  • [26] A Comprehensive Study of Deep Learning Visual Odometry for Mobile Robot Localization in Indoor Environments
    Hu, Lingxiang
    Li, Xiangjun
    Bonardi, Fabien
    Oufroukh, Naima Ait
    Ali, Sofiane Ahmed
    2024 WRC SYMPOSIUM ON ADVANCED ROBOTICS AND AUTOMATION, WRC SARA, 2024, : 286 - 291
  • [27] Rover localization from long stereo image sequences using visual odometry based on bundle adjustment
    Wan, Wenhui
    Liu, Zhaoqin
    Di, Kaichang
    REMOTE SENSING OF THE ENVIRONMENT: THE 17TH CHINA CONFERENCE ON REMOTE SENSING, 2011, 8203
  • [28] Precise localization of the mobile wheeled robot using sensor fusion of odometry, visual artificial landmarks and inertial sensors
    Nemec, Dusan
    Simak, Vojtech
    Janota, Ales
    Hrubos, Marian
    Bubenikova, Emilia
    ROBOTICS AND AUTONOMOUS SYSTEMS, 2019, 112 : 168 - 177
  • [29] Real-Time Stereo Visual Odometry Based on an Improved KLT Method
    Guo, Guangzhi
    Dai, Zuoxiao
    Dai, Yuanfeng
    APPLIED SCIENCES-BASEL, 2022, 12 (23):
  • [30] Robust Monocular Visual Odometry using Optical Flows for Mobile Robots
    Li Haifeng
    Hu Zunhe
    Chen Xinwei
    PROCEEDINGS OF THE 35TH CHINESE CONTROL CONFERENCE 2016, 2016, : 6003 - 6007