Multi-Event-Camera Depth Estimation and Outlier Rejection by Refocused Events Fusion

Citations: 15
Authors
Ghosh, Suman [1 ]
Gallego, Guillermo [1 ,2 ,3 ]
Affiliations
[1] Tech Univ Berlin, Dept Elect Engn & Comp Sci, D-10623 Berlin, Germany
[2] Einstein Ctr Digital Future, D-10117 Berlin, Germany
[3] Sci Intelligence Excellence Cluster, D-10587 Berlin, Germany
Keywords
event cameras; neuromorphic processing; robotics; spatial AI; stereo depth estimation; CONTRAST MAXIMIZATION; VISUAL ODOMETRY; STEREO VISION; DATASET; MOTION; SPACE
DOI
10.1002/aisy.202200221
Chinese Library Classification: TP [Automation & Computer Technology]
Discipline Code: 0812
Abstract
Event cameras are bio-inspired sensors that offer advantages over traditional cameras. They operate asynchronously, sampling the scene at microsecond resolution and producing a stream of brightness changes. This unconventional output has sparked novel computer vision methods to unlock the camera's potential. Here, the problem of event-based stereo 3D reconstruction for SLAM is considered. Most event-based stereo methods attempt to exploit the high temporal resolution of the camera and the simultaneity of events across cameras to establish matches and estimate depth. By contrast, this work investigates how to estimate depth without explicit data association by fusing disparity space images (DSIs) originating in efficient monocular methods. Fusion theory is developed and applied to design multi-camera 3D reconstruction algorithms that produce state-of-the-art results, as confirmed by comparisons with four baseline methods and tests on a variety of available datasets.
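The abstract's core idea, fusing per-camera DSIs rather than matching individual events, can be illustrated with a hedged sketch. This is not the authors' implementation; the harmonic-mean fusion rule and the toy volumes below are illustrative assumptions: a DSI is taken as a voxel grid over (pixel, depth plane) accumulating back-projected event rays, and fusion rewards voxels supported by both cameras, which suppresses single-camera outliers.

```python
import numpy as np

# Hedged sketch (assumed representation, not the paper's code):
# a DSI is an (H, W, D) ray-count volume; depth is read out as the
# best-supported plane per pixel, here via argmax along the depth axis.

def fuse_dsis_harmonic(dsi_a, dsi_b, eps=1e-9):
    """Per-voxel harmonic mean of two equally shaped ray-count volumes.

    A voxel scores high only if BOTH cameras voted for it, so spurious
    single-camera maxima (outliers) are rejected after fusion.
    """
    return 2.0 * dsi_a * dsi_b / (dsi_a + dsi_b + eps)

def depth_from_dsi(dsi, depth_values):
    """Pick, per pixel, the depth plane with the highest ray count."""
    best_plane = np.argmax(dsi, axis=2)   # (H, W) plane indices
    return depth_values[best_plane]       # (H, W) metric depth map

# Toy example: 4x4 DSIs over 3 depth planes (0.5 m, 1.0 m, 2.0 m).
# Each camera has a spurious peak on a different plane, but both
# agree on plane 1; only the fused volume recovers it.
H, W, D = 4, 4, 3
depth_values = np.array([0.5, 1.0, 2.0])
dsi_a = np.zeros((H, W, D)); dsi_a[..., 1] = 5.0; dsi_a[..., 0] = 6.0
dsi_b = np.zeros((H, W, D)); dsi_b[..., 1] = 5.0; dsi_b[..., 2] = 6.0
fused = fuse_dsis_harmonic(dsi_a, dsi_b)
depth_map = depth_from_dsi(fused, depth_values)  # 1.0 m everywhere
```

Each camera alone would pick its own spurious plane (0.5 m for the first, 2.0 m for the second), while the fused volume zeroes out voxels lacking cross-camera support, which is the outlier-rejection effect the title refers to.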
Pages: 21