Multi-Modality Sensing and Data Fusion for Multi-Vehicle Detection

Cited by: 30
Authors
Roy, Debashri [1 ]
Li, Yuanyuan [1 ]
Jian, Tong [1 ]
Tian, Peng [1 ]
Chowdhury, Kaushik [1 ]
Ioannidis, Stratis [1 ]
Affiliations
[1] Northeastern University, Department of Electrical and Computer Engineering, Boston, MA 02115, USA
Funding
U.S. National Science Foundation;
Keywords
Vehicle detection; tracking; multimodal data; fusion; latent embeddings; image; seismic; acoustic; radar; CHALLENGES; TRACKING;
DOI
10.1109/TMM.2022.3145663
CLC (Chinese Library Classification)
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
With the recent surge in autonomous vehicles, the need for accurate vehicle detection and tracking is more critical than ever. Detecting vehicles from visual sensors alone fails in non-line-of-sight (NLOS) settings; this can be compensated by including other modalities in a multi-domain sensing environment. We propose several deep-learning-based frameworks for fusing different modalities (image, radar, acoustic, seismic) by exploiting complementary latent embeddings, incorporating multiple state-of-the-art fusion strategies. Our proposed fusion frameworks considerably outperform unimodal detection. Moreover, fusion between image and non-image modalities improves vehicle tracking and detection under NLOS conditions. We validate our models on the real-world multimodal ESCAPE dataset, showing a 33.16% improvement in vehicle detection by fusion (over visual inference alone) on test scenarios with 30-42% NLOS conditions. To demonstrate how well our framework generalizes, we also validate our models on the multimodal NuScenes dataset, showing ~22% improvement over competing methods.
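A minimal sketch of the latent-embedding fusion idea described in the abstract (PyTorch; the per-modality feature sizes, encoder layers, and the two-class "vehicle present" head are illustrative assumptions, not the authors' exact architecture):

import torch
import torch.nn as nn

class LateFusionDetector(nn.Module):
    """Concatenation-based fusion of per-modality latent embeddings (illustrative sketch)."""
    def __init__(self, embed_dim=128, num_classes=2):
        super().__init__()
        # One encoder per modality maps its features to a latent embedding of shared size.
        self.image_enc    = nn.Sequential(nn.Linear(2048, embed_dim), nn.ReLU())
        self.radar_enc    = nn.Sequential(nn.Linear(256, embed_dim), nn.ReLU())
        self.acoustic_enc = nn.Sequential(nn.Linear(128, embed_dim), nn.ReLU())
        self.seismic_enc  = nn.Sequential(nn.Linear(128, embed_dim), nn.ReLU())
        # Fusion head: concatenate the four embeddings and classify.
        self.head = nn.Sequential(
            nn.Linear(4 * embed_dim, embed_dim), nn.ReLU(),
            nn.Linear(embed_dim, num_classes))

    def forward(self, image, radar, acoustic, seismic):
        z = torch.cat([self.image_enc(image),
                       self.radar_enc(radar),
                       self.acoustic_enc(acoustic),
                       self.seismic_enc(seismic)], dim=-1)
        return self.head(z)

# Usage with random stand-in features for a batch of 4 samples.
model = LateFusionDetector()
logits = model(torch.randn(4, 2048), torch.randn(4, 256),
               torch.randn(4, 128), torch.randn(4, 128))  # logits.shape == (4, 2)

In such a scheme, a degraded branch (e.g., an occluded camera view in an NLOS scenario) can be down-weighted or zeroed before concatenation, which is the intuition behind fusing image with non-image modalities.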
Pages: 2280-2295
Page count: 16
Related Papers
50 records in total
  • [31] Graph-Based Progressive Fusion Network for Multi-Modality Vehicle Re-Identification
    He, Qiaolin
    Lu, Zefeng
    Wang, Zihan
    Hu, Haifeng
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2023, 24 (11) : 12431 - 12447
  • [32] Multi-vehicle Detection and Tracking Based on Kalman Filter and Data Association
    Guo, Lie
    Ge, Pingshu
    He, Danni
    Wang, Dongxing
    INTELLIGENT ROBOTICS AND APPLICATIONS, ICIRA 2019, PT V, 2019, 11744 : 439 - 449
  • [33] A Multi-Modality Fusion and Gated Multi-Filter U-Net for Water Area Segmentation in Remote Sensing
    Wang, Rongfang
    Zhang, Chenchen
    Chen, Chao
    Hao, Hongxia
    Li, Weibin
    Jiao, Licheng
    REMOTE SENSING, 2024, 16 (02)
  • [34] Echoes Beyond Points: Unleashing the Power of Raw Radar Data in Multi-modality Fusion
    Liu, Yang
    Wang, Feng
    Wang, Naiyan
    Zhang, Zhaoxiang
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 36 (NEURIPS 2023), 2023
  • [35] Vehicle Localization by using a Multi-modality Top Down Approach
    Aynaud, Claude
    Bernay-Angeletti, Coralie
    Chapuis, Roland
    Aufrere, Romuald
    Debain, Christophe
    2014 13TH INTERNATIONAL CONFERENCE ON CONTROL AUTOMATION ROBOTICS & VISION (ICARCV), 2014, : 1415 - 1420
  • [36] DDFM: Denoising Diffusion Model for Multi-Modality Image Fusion
    Zhao, Zixiang
    Bai, Haowen
    Zhu, Yuanzhi
    Zhang, Jiangshe
    Xu, Shuang
    Zhang, Yulun
    Zhang, Kai
    Meng, Deyu
    Timofte, Radu
    Van Gool, Luc
    2023 IEEE/CVF INTERNATIONAL CONFERENCE ON COMPUTER VISION (ICCV 2023), 2023, : 8048 - 8059
  • [37] Multi-modality of polysomnography signals' fusion for automatic sleep scoring
    Yan, Rui
    Zhang, Chi
    Spruyt, Karen
    Wei, Lai
    Wang, Zhiqiang
    Tian, Lili
    Li, Xueqiao
    Ristaniemi, Tapani
    Zhang, Jihui
    Cong, Fengyu
    BIOMEDICAL SIGNAL PROCESSING AND CONTROL, 2019, 49 : 14 - 23
  • [38] Fast saliency-aware multi-modality image fusion
    Han, Jungong
    Pauwels, Eric J.
    de Zeeuw, Paul
    NEUROCOMPUTING, 2013, 111 : 70 - 80
  • [39] Multi-modality image fusion for image-guided neurosurgery
    Haller, JW
    Ryken, T
    Madsen, M
    Edwards, A
    Bolinger, L
    Vannier, MW
    CARS '99: COMPUTER ASSISTED RADIOLOGY AND SURGERY, 1999, 1191 : 681 - 685
  • [40] Lymphatic flow mapping utilizing multi-modality image fusion
    Vicic, M
    Thorstad, W
    Low, D
    Deasy, J
    MEDICAL PHYSICS, 2004, 31 (06) : 1900 - 1900