Sensor fusion-based visual target tracking for autonomous vehicles

Cited by: 0
Authors
Jia Z. [1 ]
Balasuriya A. [2 ]
Challa S. [3 ]
Institutions
[1] United Technologies Research Center, No. 3239 Shenjiang Road, Pudong, Shanghai 201206
[2] Department of Mechanical Engineering, MIT, Cambridge
[3] Faculty of Engineering, University of Technology, Sydney
Keywords
Autonomous vehicles; Extended Kalman filter; Image clustering and segmentation; Motion; Optical flow; Sensor data fusion; Stereo; Template matching; Vision;
DOI
10.1007/s10015-007-0499-8
Abstract
In this article, a data-fusion-based algorithm is proposed to identify and track moving objects for autonomous vehicle navigation. This is a challenging problem because both the target and the cameras are moving. Here, the optical-flow vector field, color features, and stereo-pair disparities are used as visual features, while the vehicle's motion-sensor data are used to determine the cameras' motion. We propose a data fusion algorithm that integrates the information obtained from these different visual cues with the vehicle's motion-sensor data for target tracking. The fusion algorithm estimates the velocity and position of the target in 3D world coordinates. We then present a detailed description of the three-dimensional (3D) target-tracking algorithm, which uses an extended Kalman filter. Experimental results on several natural image sequences are presented to demonstrate the performance of the proposed scheme. © 2008 International Symposium on Artificial Life and Robotics (ISAROB).
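The abstract's core idea — fusing pixel-level visual measurements with a motion model in an extended Kalman filter to recover 3D target position and velocity — can be illustrated with a minimal sketch. This is not the paper's implementation: the constant-velocity motion model, pinhole-camera measurement, and all numeric values (time step, focal length, noise covariances) are illustrative assumptions.

```python
import numpy as np

# Minimal EKF sketch for 3D target tracking from image measurements.
# State x = [X, Y, Z, vX, vY, vZ] in world/camera coordinates.
dt, f = 0.1, 500.0                     # time step [s], focal length [px] (assumed)
F = np.eye(6)
F[0, 3] = F[1, 4] = F[2, 5] = dt       # constant-velocity transition model
Q = 1e-3 * np.eye(6)                   # process-noise covariance (assumed)
R = 4.0 * np.eye(2)                    # pixel measurement noise (assumed)

def h(x):
    """Pinhole projection of the 3D position onto the image plane."""
    X, Y, Z = x[:3]
    return np.array([f * X / Z, f * Y / Z])

def H_jac(x):
    """Jacobian of h with respect to the state, evaluated at x (2x6)."""
    X, Y, Z = x[:3]
    H = np.zeros((2, 6))
    H[0, 0] = f / Z; H[0, 2] = -f * X / Z**2
    H[1, 1] = f / Z; H[1, 2] = -f * Y / Z**2
    return H

def ekf_step(x, P, z):
    # Predict with the motion model
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the pixel measurement z = (u, v)
    H = H_jac(x)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - h(x))
    P = (np.eye(6) - K @ H) @ P
    return x, P

# One filter step on a synthetic target 10 m ahead, moving at 1 m/s in X
x = np.array([1.0, 0.5, 10.0, 1.0, 0.0, 0.0])
P = np.eye(6)
x, P = ekf_step(x, P, z=np.array([55.0, 25.0]))
```

In the paper's setting, the measurement stage would additionally fuse optical flow, color-segmentation, and stereo-disparity cues, and the camera motion from the vehicle's motion sensors would enter the prediction stage; the nonlinear projection above is what makes the *extended* (linearized) Kalman filter necessary.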
Pages: 317–328
Page count: 11
Related Papers
50 records
  • [1] Sensor fusion-based visual target tracking for autonomous vehicles with the out-of-sequence measurements solution
    Jia, Zhen
    Balasuriya, Arjuna
    Challa, Subhash
    [J]. ROBOTICS AND AUTONOMOUS SYSTEMS, 2008, 56 (02) : 157 - 176
  • [2] Sensor fusion based 3D target visual tracking for autonomous vehicles with IMM
    Jia, Z
    Balasuriya, A
    Challa, S
    [J]. 2005 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), VOLS 1-4, 2005, : 1829 - 1834
  • [3] A Sensor Fusion-Based GNSS Spoofing Attack Detection Framework for Autonomous Vehicles
    Dasgupta, Sagar
    Rahman, Mizanur
    Islam, Mhafuzul
    Chowdhury, Mashrur
    [J]. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2022, 23 (12) : 23559 - 23572
  • [4] Sensor Fusion-Based Localization Framework for Autonomous Vehicles in Rural Forested Environments
    Matute, Jose
    Rodriguez-Arozamena, Mario
    Perez, Joshue
    Karimoddini, Ali
    [J]. 2023 IEEE 26TH INTERNATIONAL CONFERENCE ON INTELLIGENT TRANSPORTATION SYSTEMS, ITSC, 2023, : 1007 - 1013
  • [5] Sensor and Decision Fusion-Based Intrusion Detection and Mitigation Approach for Connected Autonomous Vehicles
    Moradi, Milad
    Kordestani, Mojtaba
    Jalali, Mahsa
    Rezamand, Milad
    Mousavi, Mehdi
    Chaibakhsh, Ali
    Saif, Mehrdad
    [J]. IEEE SENSORS JOURNAL, 2024, 24 (13) : 20908 - 20919
  • [6] Camera/LiDAR Sensor Fusion-based Autonomous Navigation
    Yusefi, Abdullah
    Durdu, Akif
    Toy, Ibrahim
    [J]. 2024 23RD INTERNATIONAL SYMPOSIUM INFOTEH-JAHORINA, INFOTEH, 2024,
  • [7] Semantic Fusion-based Pedestrian Detection for Supporting Autonomous Vehicles
    Sha, Mingzhi
    Boukerche, Azzedine
    [J]. 2020 IEEE SYMPOSIUM ON COMPUTERS AND COMMUNICATIONS (ISCC), 2020, : 618 - 623
  • [8] Fusion-based multi-target tracking and localization for intelligent visual surveillance systems
    Rababaah, Haroun
    Shirkhodaie, Amir
    [J]. SENSORS, AND COMMAND, CONTROL, COMMUNICATIONS, AND INTELLIGENCE (C3I) TECHNOLOGIES FOR HOMELAND SECURITY AND HOMELAND DEFENSE VII, 2008, 6943
  • [9] Data Fusion-Based Multi-Object Tracking for Unconstrained Visual Sensor Networks
    Jiang, Xiaoyan
    Fang, Zhijun
    Xiong, Neal N.
    Gao, Yongbin
    Huang, Bo
    Zhang, Juan
    Yu, Lei
    Harrington, Patrick
    [J]. IEEE ACCESS, 2018, 6 : 13716 - 13728
  • [10] Vision based autonomous vehicles target visual tracking with multiple dynamics models
    Jia, Z
    Balasuriya, A
    Challa, S
    [J]. 2005 IEEE NETWORKING, SENSING AND CONTROL PROCEEDINGS, 2005, : 1081 - 1086