Cross-Modal Semidense 6-DOF Tracking of an Event Camera in Challenging Conditions

Cited by: 2
Authors
Zuo, Yi-Fan [1]
Xu, Wanting [2,3]
Wang, Xia [1]
Wang, Yifu [2,3]
Kneip, Laurent [2,3]
Affiliations
[1] Beijing Inst Technol, Sch Opt & Photon, Key Lab Optoelect Imaging Technol & Syst, Minist Educ, Beijing 100081, Peoples R China
[2] ShanghaiTech Univ, Mobile Percept Lab, Shanghai 201210, Peoples R China
[3] ShanghaiTech Univ, Shanghai Engn Res Ctr Intelligent Vis & Imaging, Shanghai 201210, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
Cameras; Simultaneous localization and mapping; Visualization; Sensors; Location awareness; Lighting; Tracking; Event camera; neuromorphic sensing; semidense; tracking; visual localization; VISUAL ODOMETRY; SLAM;
DOI
10.1109/TRO.2024.3355370
Chinese Library Classification
TP24 [Robotics];
Subject Classification Codes
080202; 1405;
Abstract
Vision-based localization is a cost-effective and, thus, attractive solution for many intelligent mobile platforms. However, its accuracy and especially its robustness still suffer under low illumination, illumination changes, and aggressive motion. Event-based cameras are bio-inspired visual sensors that perform well in high-dynamic-range conditions and have high temporal resolution, and thus provide an interesting alternative in such challenging scenarios. While purely event-based solutions currently do not yet produce satisfying mapping results, the present work demonstrates the feasibility of purely event-based tracking if an alternative sensor is permitted for mapping. The method relies on geometric 3-D-2-D registration of semidense maps and events, and achieves highly reliable and accurate cross-modal tracking results. Practically relevant scenarios are given by depth-camera-supported tracking or map-based localization with a semidense map prior created by a regular image-based visual SLAM or structure-from-motion system. Conventional edge-based 3-D-2-D alignment is extended by a novel polarity-aware registration that makes use of signed time-surface maps obtained from the event stream. Furthermore, we introduce a novel culling strategy for occluded points. Both modifications increase the speed of the tracker and its robustness against occlusions and large viewpoint variations. The approach is validated on many real datasets covering the abovementioned challenging conditions, and compared against similar solutions realized with regular cameras.
Pages: 1600-1616
Number of pages: 17
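
Note on the abstract above: the polarity-aware registration relies on signed time-surface maps built from the event stream. Below is a minimal, self-contained sketch of one common way such a map can be computed, assuming the signed time surface is the polarity of the most recent event at each pixel weighted by an exponential decay of that event's age; the exact formulation, decay constant, and event format used in the paper may differ.

import numpy as np

def signed_time_surface(events, height, width, t_ref, tau=0.03):
    """Polarity-signed time-surface map evaluated at reference time t_ref.

    events : iterable of (x, y, t, p) tuples, p in {-1, +1}, t in seconds
    tau    : exponential decay constant in seconds (illustrative value)
    """
    last_t = np.full((height, width), -np.inf)  # timestamp of the latest event per pixel
    last_p = np.zeros((height, width))          # polarity of that latest event
    for x, y, t, p in events:
        if t <= t_ref and t >= last_t[y, x]:
            last_t[y, x] = t
            last_p[y, x] = p
    # Recent events decay towards 1, old events towards 0; the sign carries polarity.
    decay = np.exp(-(t_ref - last_t) / tau)     # exp(-inf) = 0 for pixels that never fired
    return last_p * decay

# Tiny synthetic stream: (x, y, t, polarity)
ev = [(2, 3, 0.010, +1), (2, 3, 0.020, -1), (5, 1, 0.025, +1)]
S = signed_time_surface(ev, height=8, width=8, t_ref=0.030)
print(S[3, 2], S[1, 5])  # approx -0.72 (recent OFF event) and +0.85 (recent ON event)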