4D Radar-Camera Sensor Fusion for Robust Vehicle Pose Estimation in Foggy Environments

Cited by: 0
Authors
Yang, Seunghoon [1 ]
Choi, Minseong [2 ]
Han, Seungho [1 ]
Choi, Keun-Ha [1 ]
Kim, Kyung-Soo [1 ]
Affiliations
[1] Korea Advanced Institute of Science and Technology, Department of Mechanical Engineering, Daejeon 34141, Republic of Korea
[2] Korea Advanced Institute of Science and Technology, Division of Future Vehicle, Daejeon 34141, Republic of Korea
DOI: Not available
Abstract
The integration of cameras and millimeter-wave radar into sensor fusion algorithms is essential to ensure robustness and cost-effectiveness for vehicle pose estimation. Because traditional radar offers only low resolution, several studies have investigated 4D imaging radar, which provides range, Doppler, azimuth, and elevation information at high resolution. This paper presents a method for robustly estimating vehicle pose through 4D radar and camera fusion, exploiting the complementary characteristics of the two sensors. Leveraging the single-view geometry of the detected vehicle bounding box, the lateral position is derived from the camera images, and the yaw rate is calculated through feature matching between consecutive images. The high-resolution 4D radar data are used to estimate the heading angle and forward velocity of the target vehicle from the position and Doppler velocity information. Finally, an extended Kalman filter (EKF) fuses the physical quantities obtained by each sensor, yielding more robust pose estimates. To validate the proposed method, experiments were conducted in foggy environments, including straight and curved driving scenarios. The experimental results indicate that the performance of the camera-based method degrades due to frame loss in visually challenging scenarios such as fog, whereas the proposed method exhibits superior performance and enhanced robustness. © 2024 IEEE.
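The abstract describes a three-part pipeline: camera-derived lateral position and yaw rate, radar-derived heading angle and forward velocity from point positions and Doppler, and an EKF that fuses the two. As a rough illustration only, and not the authors' implementation, the Python/NumPy sketch below assumes a least-squares fit of the target's planar velocity from per-point Doppler readings and a small EKF with a constant-velocity, constant-yaw-rate motion model; the state ordering, noise values, and measurement matrices are all placeholder assumptions.

```python
# Illustrative sketch only -- not the paper's code. Assumes a 2D target, a
# least-squares Doppler fit for the radar branch, and a small EKF with a
# constant-velocity / constant-yaw-rate motion model (placeholder choices).
import numpy as np

def velocity_from_doppler(azimuths, doppler):
    """Fit the target's planar velocity (vx, vy) from the radial (Doppler)
    speeds of its radar points: v_r = vx*cos(az) + vy*sin(az)."""
    A = np.column_stack([np.cos(azimuths), np.sin(azimuths)])
    v, *_ = np.linalg.lstsq(A, doppler, rcond=None)
    return np.linalg.norm(v), np.arctan2(v[1], v[0])   # forward speed, heading

class SimpleEKF:
    """Minimal EKF, state = [x, y, yaw, speed, yaw_rate]; angle wrapping and
    tuned covariances are omitted for brevity."""
    def __init__(self):
        self.x = np.zeros(5)
        self.P = np.eye(5)

    def predict(self, dt, q=0.1):
        px, py, yaw, v, w = self.x
        self.x = np.array([px + v*np.cos(yaw)*dt,
                           py + v*np.sin(yaw)*dt,
                           yaw + w*dt, v, w])
        F = np.eye(5)                                   # Jacobian of the motion model
        F[0, 2], F[0, 3] = -v*np.sin(yaw)*dt, np.cos(yaw)*dt
        F[1, 2], F[1, 3] =  v*np.cos(yaw)*dt, np.sin(yaw)*dt
        F[2, 4] = dt
        self.P = F @ self.P @ F.T + q*np.eye(5)

    def update(self, z, H, R):
        innov = z - H @ self.x                          # measurement residual
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)             # Kalman gain
        self.x = self.x + K @ innov
        self.P = (np.eye(5) - K @ H) @ self.P

# Hypothetical usage: the camera branch supplies lateral position and yaw rate,
# the radar branch supplies heading and forward speed (rows select those states).
H_cam = np.array([[0., 1., 0., 0., 0.],
                  [0., 0., 0., 0., 1.]])
H_rad = np.array([[0., 0., 1., 0., 0.],
                  [0., 0., 0., 1., 0.]])

ekf = SimpleEKF()
ekf.predict(dt=0.05)
az = np.array([-0.20, -0.05, 0.10, 0.25])              # azimuths of radar points (rad)
dop = np.array([7.60, 7.95, 7.90, 7.55])               # their Doppler speeds (m/s)
speed, heading = velocity_from_doppler(az, dop)
ekf.update(np.array([heading, speed]), H_rad, R=np.diag([0.05, 0.20]))
ekf.update(np.array([0.8, 0.02]), H_cam, R=np.diag([0.10, 0.05]))  # lateral pos, yaw rate
```

In the paper the camera quantities come from single-view bounding-box geometry and inter-frame feature matching; the constants and covariances above are only placeholders to make the fusion step concrete.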
Pages: 16178 - 16188
Related Papers
50 records in total
  • [1] 4D Radar-Camera Sensor Fusion for Robust Vehicle Pose Estimation in Foggy Environments
    Yang, Seunghoon
    Choi, Minseong
    Han, Seungho
    Choi, Keun-Ha
    Kim, Kyung-Soo
    IEEE ACCESS, 2024, 12 : 16178 - 16188
  • [2] Robust Multiobject Tracking Using Mmwave Radar-Camera Sensor Fusion
    Sengupta, Arindam
    Cheng, Lei
    Cao, Siyang
    IEEE SENSORS LETTERS, 2022, 6 (10)
  • [3] Integrated Sensor Fusion Based on 4D MIMO Radar and Camera: A Solution for Connected Vehicle Applications
    Lei, Ming
    Yang, Daning
    Weng, Xiaoming
    IEEE VEHICULAR TECHNOLOGY MAGAZINE, 2022, 17 (04): 38 - 46
  • [4] Improving Radar-Camera Fusion Network for Distance Estimation
    Samuktha, V.
    Shukla, Hershita
    Kumar, Nitish
    Tejasri, N.
    Reddy, D. Santhosh
    Rajalakshmi, P.
    2024 16TH INTERNATIONAL CONFERENCE ON COMPUTER AND AUTOMATION ENGINEERING, ICCAE 2024, 2024, : 23 - 29
  • [5] Visibility estimation in foggy conditions by in-vehicle camera and radar
    Mori, Kenji
    Kato, Terutoshi
    Takahashi, Tomokazu
    Ide, Ichiro
    Murase, Hiroshi
    Miyahara, Takayuki
    Tamatsu, Yukimasa
    ICICIC 2006: FIRST INTERNATIONAL CONFERENCE ON INNOVATIVE COMPUTING, INFORMATION AND CONTROL, VOL 2, PROCEEDINGS, 2006, : 548 - +
  • [6] WaterScenes: A Multi-Task 4D Radar-Camera Fusion Dataset and Benchmarks for Autonomous Driving on Water Surfaces
    Yao, Shanliang
    Guan, Runwei
    Wu, Zhaodong
    Ni, Yi
    Huang, Zile
    Liu, Ryan Wen
    Yue, Yong
    Ding, Weiping
    Lim, Eng Gee
    Seo, Hyungjoon
    Man, Ka Lok
    Ma, Jieming
    Zhu, Xiaohui
    Yue, Yutao
    IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2024, 25 (11) : 16584 - 16598
  • [7] Camera and Radar Sensor Fusion for Robust Vehicle Localization via Vehicle Part Localization
    Kang, Daejun
    Kum, Dongsuk
    IEEE ACCESS, 2020, 8 : 75223 - 75236
  • [8] Radar-Camera Fusion Network for Depth Estimation in Structured Driving Scenes
    Li, Shuguang
    Yan, Jiafu
    Chen, Haoran
    Zheng, Ke
    SENSORS, 2023, 23 (17)
  • [9] Automatic Radar-Camera Dataset Generation for Sensor-Fusion Applications
    Sengupta, Arindam
    Yoshizawa, Atsushi
    Cao, Siyang
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2022, 7 (02) : 2875 - 2882
  • [10] Sensor Fusion for Vehicle Tracking with Camera and Radar Sensor
    Kim, Kyeong-Eun
    Lee, Chang-Joo
    Pae, Dong-Sung
    Lim, Myo-Taeg
    2017 17TH INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS), 2017, : 1075 - 1077