An asymmetric real-time dense visual localisation and mapping system

Cited by: 0
Authors
Comport, Andrew I. [1 ]
Meilland, Maxime [2 ]
Rives, Patrick [2 ]
Affiliations
[1] UNSA, CNRS I3S, 2000 Route Lucioles, BP 121, Sophia Antipolis, France
[2] INRIA Sophia Antipolis Mediterranee, Sophia Antipolis, France
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
This paper describes a dense tracking system, in both monocular and multi-camera configurations, that performs in real time (45 Hz). The proposed approach combines a prior dense photometric model with online visual odometry, which enables handling dynamic changes in the scene. In particular, it is shown how the technique takes large illumination variations into account and thereby improves direct tracking techniques, which are highly prone to illumination change. This is achieved by exploiting the relative advantages of model-based tracking and visual odometry. In direct model-based tracking, photometric models are usually acquired under significantly greater lighting differences than those observed from the current camera view; model-based approaches, however, avoid drift. Incremental visual odometry, on the other hand, sees relatively little lighting variation between frames but accumulates drift. To solve this problem, a hybrid approach is proposed that simultaneously minimises drift via a 3D model whilst using locally consistent illumination to correct large photometric differences. Direct 6-DOF tracking is performed by an accurate method that iteratively minimises dense image measurements using non-linear optimisation. A stereo technique for automatically acquiring the 3D photometric model has also been optimised for the purposes of this paper. Real experiments are shown on complex 3D scenes for a hand-held camera undergoing fast 3D movement and various illumination changes, including daylight, artificial lights, significant shadows, non-Lambertian reflections, occlusions and saturations.
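The direct, iterative minimisation of dense photometric error mentioned in the abstract can be sketched in miniature. The toy below is not the paper's implementation: it estimates a single 1-D sub-pixel shift between two intensity profiles by Gauss-Newton on the dense photometric residual, whereas the paper optimises a full 6-DOF pose over images and a 3D photometric model. The function names (`sample`, `align`) and all parameters are illustrative assumptions.

```python
import numpy as np

def sample(profile, x):
    """Linearly interpolate a 1-D intensity profile at real-valued coords x."""
    return np.interp(x, np.arange(len(profile)), profile)

def align(I_ref, I_cur, t0=0.0, iters=20):
    """Estimate the shift t that best warps I_cur onto I_ref by
    Gauss-Newton on the dense photometric error (direct alignment)."""
    xs = np.arange(10, len(I_ref) - 10, dtype=float)  # interior pixels only
    t = t0
    g = np.gradient(I_cur)                            # image gradient (per pixel)
    for _ in range(iters):
        r = sample(I_cur, xs + t) - sample(I_ref, xs) # photometric residual
        J = sample(g, xs + t)                         # Jacobian dr/dt
        dt = -np.sum(J * r) / np.sum(J * J)           # Gauss-Newton step
        t += dt
        if abs(dt) < 1e-8:
            break
    return t

# Synthetic smooth profile and a copy shifted by 2.3 pixels.
x = np.linspace(0, 4 * np.pi, 200)
I_ref = np.sin(x) + 0.5 * np.sin(2.7 * x)
true_shift = 2.3
I_cur = sample(I_ref, np.arange(len(I_ref), dtype=float) - true_shift)

est = align(I_ref, I_cur)  # est converges close to the true 2.3-pixel shift
```

In the paper's setting, the scalar `t` is replaced by a 6-DOF pose, `sample` by a projective warp of the 3D photometric model, and the residual additionally accounts for illumination differences; the Gauss-Newton structure of the update is the shared idea.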
Pages: 4
Related papers (50 total)
  • [1] Dense visual mapping of large scale environments for real-time localisation
    Meilland, Maxime
    Comport, Andrew Ian
    Rives, Patrick
    2011 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS, 2011, : 4242 - 4248
  • [2] Real-time visual workspace localisation and mapping for a wearable robot
    Davison, AJ
    Mayol, WW
    Murray, DW
    SECOND IEEE AND ACM INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY, PROCEEDINGS, 2003, : 315 - 316
  • [3] Real-time Omnidirectional Visual SLAM with Semi-Dense Mapping
    Wang, Senbo
    Yue, Jiguang
    Dong, Yanchao
    Shen, Runjie
    Zhang, Xinyu
    2018 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2018, : 695 - 700
  • [4] RTSDM: A Real-Time Semantic Dense Mapping System for UAVs
    Li, Zhiteng
    Zhao, Jiannan
    Zhou, Xiang
    Wei, Shengxian
    Li, Pei
    Shuang, Feng
    MACHINES, 2022, 10 (04)
  • [5] Robust Real-Time Visual Odometry for Dense RGB-D Mapping
    Whelan, Thomas
    Johannsson, Hordur
    Kaess, Michael
    Leonard, John J.
    McDonald, John
    2013 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2013, : 5724 - 5731
  • [6] Real-time simultaneous localisation and mapping with a single camera
    Davison, AJ
    NINTH IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION, VOLS I AND II, PROCEEDINGS, 2003, : 1403 - 1410
  • [7] Real-time localisation and mapping with wearable active vision
    Davison, AJ
    Mayol, WW
    Murray, DW
    SECOND IEEE AND ACM INTERNATIONAL SYMPOSIUM ON MIXED AND AUGMENTED REALITY, PROCEEDINGS, 2003, : 18 - 27
  • [8] DTAM: Dense Tracking and Mapping in Real-Time
    Newcombe, Richard A.
    Lovegrove, Steven J.
    Davison, Andrew J.
    2011 IEEE INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV), 2011, : 2320 - 2327
  • [9] Real-time Scalable Dense Surfel Mapping
    Wang, Kaixuan
    Gao, Fei
    Shen, Shaojie
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 6919 - 6925
  • [10] Rapid-Mapping: LiDAR-Visual Implicit Neural Representations for Real-Time Dense Mapping
    Zhang, Hanwen
    Zou, Yujie
    Yan, Zhewen
    Cheng, Hui
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (09): : 8154 - 8161