Asynchronous event feature generation and tracking based on gradient descriptor for event cameras

Cited by: 4
Authors
Li, Ruoxiang [1]
Shi, Dianxi [2,3]
Zhang, Yongjun [2]
Li, Ruihao [2,3]
Wang, Mingkun [1]
Affiliations
[1] Natl Univ Def Technol, Changsha, Peoples R China
[2] Natl Innovat Inst Def Technol NIIDT, Artificial Intelligence Res Ctr AIRC, Beijing 100166, Peoples R China
[3] Tianjin Artificial Intelligence Innovat Ctr TAIIC, Tianjin, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Robotics; event camera; feature descriptor; feature tracking; SLAM; RGB-D SLAM; MOTION REMOVAL; ROBUST;
DOI
10.1177/17298814211027028
CLC Classification
TP24 [Robotics]
Discipline Codes
080202; 1405
Abstract
Recently, the event camera has become a popular and promising vision sensor in research on simultaneous localization and mapping and computer vision owing to its advantages: low latency, high dynamic range, and high temporal resolution. As a basic component of feature-based SLAM systems, feature tracking with event cameras remains an open question. In this article, we present a novel asynchronous event feature generation and tracking algorithm that operates directly on event streams to fully exploit the natural asynchronism of event cameras. The proposed algorithm consists of an event-corner detection unit, a descriptor construction unit, and an event feature tracking unit. The event-corner detection unit employs a fast, asynchronous corner detector to extract event-corners from event streams. For the descriptor construction unit, we propose a novel asynchronous gradient descriptor inspired by the scale-invariant feature transform (SIFT) descriptor, which enables quantitative measurement of similarity between event feature pairs. The construction of the gradient descriptor can be decomposed into three stages: speed-invariant time surface maintenance and extraction, principal orientation calculation, and descriptor generation. The event feature tracking unit combines the constructed gradient descriptor with an event feature matching method to achieve asynchronous feature tracking. We implement the proposed algorithm in C++ and evaluate it on a public event dataset. The experimental results show that our proposed method improves tracking accuracy and real-time performance compared with the state-of-the-art asynchronous event-corner tracker, without compromising feature tracking lifetime.
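The three descriptor stages named in the abstract (speed-invariant time surface, principal orientation, descriptor generation) can be illustrated with a minimal sketch. The rank-based normalization, patch size, and bin count below are illustrative assumptions, not the paper's exact formulation (the authors' implementation is in C++ and is not reproduced here):

```python
import numpy as np

def update_time_surface(ts, x, y, t):
    """Record the latest event timestamp at each pixel (a 'time surface')."""
    ts[y, x] = t

def speed_invariant_patch(ts, cx, cy, r=4):
    """Rank-normalize a (2r+1)x(2r+1) patch of the time surface so the result
    depends only on the order of events, not on absolute motion speed
    (one common speed-invariant formulation; an assumption here)."""
    patch = ts[cy - r:cy + r + 1, cx - r:cx + r + 1]
    ranks = patch.ravel().argsort().argsort()      # rank of each timestamp
    return ranks.reshape(patch.shape) / (patch.size - 1.0)

def gradient_descriptor(patch, n_bins=8):
    """SIFT-inspired descriptor: a histogram of gradient orientations,
    weighted by gradient magnitude and rotated by the principal orientation."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)
    # Principal orientation: magnitude-weighted mean gradient angle.
    principal = np.arctan2((mag * np.sin(ang)).sum(),
                           (mag * np.cos(ang)).sum())
    ang = (ang - principal) % (2 * np.pi)          # rotation invariance
    hist, _ = np.histogram(ang, bins=n_bins, range=(0, 2 * np.pi), weights=mag)
    norm = np.linalg.norm(hist)
    return hist / norm if norm > 0 else hist

# Toy usage: feed a few events, then describe a patch around a corner pixel.
ts = np.zeros((32, 32))
for i, (x, y) in enumerate([(10, 10), (11, 10), (11, 11), (12, 11)]):
    update_time_surface(ts, x, y, t=0.001 * i)
d = gradient_descriptor(speed_invariant_patch(ts, 11, 10))
print(d.shape)  # (8,) — descriptor length equals the number of orientation bins
```

Matching such descriptors between candidate event-corner pairs (e.g. by Euclidean distance) is what makes a quantitative similarity measure, and hence asynchronous tracking, possible.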
Pages: 13
Related Papers (showing 10 of 50)
[1] Ni Z, Ieng S-H, Posch C, Regnier S, Benosman R. Visual Tracking Using Neuromorphic Asynchronous Event-Based Cameras. Neural Computation, 2015, 27(4): 925-953.
[2] Xu H, Yu S, Jin S, Sun R, Chen G, Sun L. Enhancing robustness in asynchronous feature tracking for event cameras through fusing frame steams. Complex & Intelligent Systems, 2024, 10(5): 6885-6899.
[3] Alzugaray I, Chli M. Asynchronous Corner Detection and Tracking for Event Cameras in Real Time. IEEE Robotics and Automation Letters, 2018, 3(4): 3177-3184.
[4] Messikommer N, Fang C, Gehrig M, Scaramuzza D. Data-driven Feature Tracking for Event Cameras. 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2023: 5642-5651.
[5] Alzugaray I, Chli M. Asynchronous Multi-Hypothesis Tracking of Features with Event Cameras. 2019 International Conference on 3D Vision (3DV), 2019: 269-278.
[6] Fillatre L. Bayes Classification for Asynchronous Event-Based Cameras. 2015 23rd European Signal Processing Conference (EUSIPCO), 2015: 824-828.
[7] Wang Z, Molloy T, Van Goor P, Mahony R. Asynchronous Blob Tracker for Event Cameras. IEEE Transactions on Robotics, 2024, 40: 4750-4767.
[8] Dietsche A, Cioffi G, Hidalgo-Carrio J, Scaramuzza D. Powerline Tracking with Event Cameras. 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021: 6990-6997.
[9] Li Z, Liu Y, Zhou F, Li X. Intensity/Inertial Integration-Aided Feature Tracking on Event Cameras. Remote Sensing, 2022, 14(8).
[10] Wang Z, Ng Y, Scheerlinck C, Mahony R. An Asynchronous Kalman Filter for Hybrid Event Cameras. 2021 IEEE/CVF International Conference on Computer Vision (ICCV), 2021: 438-447.