Static Multitarget-Based Autocalibration of RGB Cameras, 3-D Radar, and 3-D Lidar Sensors

Cited: 3
Authors
Agrawal, Shiva [1 ]
Bhanderi, Savankumar [1 ]
Doycheva, Kristina [2 ]
Elger, Gordon [3 ,4 ]
Affiliations
[1] Tech Hsch Ingolstadt, Inst Innovat Mobil, D-85049 Ingolstadt, Germany
[2] Fraunhofer Inst Transportat & Infrastruct Syst IV, Appl Ctr Connected Mobil & Infrastruct, D-85051 Ingolstadt, Germany
[3] Tech Hsch Ingolstadt, D-85049 Ingolstadt, Germany
[4] Fraunhofer IVI, Appl Ctr Connected Mobil & Infrastruct, D-85051 Ingolstadt, Germany
Keywords
Autonomous vehicles; camera; feature extraction; intelligent roadside infrastructure; light detection and ranging (lidar); radio detection and ranging (radar); sensor calibration;
DOI
10.1109/JSEN.2023.3300957
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronic and communication technology];
Discipline classification codes
0808; 0809;
Abstract
For environmental perception, autonomous vehicles and intelligent roadside infrastructure systems contain multiple sensors, that is, radio detection and ranging (radar), light detection and ranging (lidar), and camera sensors, to detect, classify, and track multiple road users. Data from the sensors are fused to enhance the perception quality of the system, because each sensor has different strengths and weaknesses, for example, in resolution, distance measurement, and dependency on weather conditions. For data fusion, the data must be transformed from the different sensor coordinate frames into a common coordinate frame. This process is referred to as multisensor calibration and is a challenging task that is mostly performed manually. This article introduces a new method for autocalibrating 3-D radar, 3-D lidar, and red-green-blue (RGB) mono-camera sensors using a static multitarget-based system. The proposed method can be used with sensors operating at different frame rates and without time synchronization. Furthermore, the described static multitarget system is cost-effective, easy to build, and applicable to short- and long-distance calibration. Experimental results on multiple sets of measurements show maximum projection errors, expressed as root mean square error (RMSE), of (u, v) = (2.4, 1.8) pixels for lidar-to-camera calibration, (u, v) = (2.2, 3.0) pixels for 3-D radar-to-camera calibration, and (x, y, z) = (2.6, 2.7, 14.0) centimeters for 3-D radar-to-lidar calibration.
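The abstract's central operation, transforming points from one sensor's coordinate frame into another and measuring the projection error in pixels, can be sketched as follows. This is an illustrative sketch only, not the paper's actual calibration pipeline: the rotation R, translation t, camera intrinsics K, and target positions are all placeholder values chosen for demonstration, and the (u, v) RMSE here follows the standard pinhole-camera formulation rather than any specifics of the article.

```python
import numpy as np

# Placeholder extrinsic calibration (rotation R, translation t) mapping
# lidar coordinates into the camera frame, and pinhole intrinsics K.
# These values are illustrative only.
R = np.eye(3)                       # lidar -> camera rotation
t = np.array([0.1, -0.05, 0.2])     # lidar -> camera translation (meters)
K = np.array([[1000.0,    0.0, 640.0],
              [   0.0, 1000.0, 360.0],
              [   0.0,    0.0,   1.0]])

def project_to_image(points_lidar):
    """Rigidly transform 3-D lidar points into the camera frame,
    then project them to pixel coordinates (u, v)."""
    pts_cam = points_lidar @ R.T + t      # rigid transform to camera frame
    uvw = pts_cam @ K.T                   # apply pinhole intrinsics
    return uvw[:, :2] / uvw[:, 2:3]       # perspective division -> (u, v)

def rmse_uv(projected, detected):
    """Per-axis RMSE of the projection error in pixels."""
    err = projected - detected
    return np.sqrt(np.mean(err**2, axis=0))

# Toy example: three calibration-target centers seen by both sensors.
lidar_targets = np.array([[ 1.0,  0.2, 10.0],
                          [-0.5,  0.1, 15.0],
                          [ 0.3, -0.2, 20.0]])
uv_pred = project_to_image(lidar_targets)
# Simulated pixel detections, offset from the projections by a few pixels.
uv_meas = uv_pred + np.array([[1.0, -1.0], [-2.0, 1.5], [0.5, 0.5]])
print(rmse_uv(uv_pred, uv_meas))  # per-axis RMSE in pixels
```

A real lidar-to-camera calibration would estimate R and t from matched target detections (e.g., by minimizing this RMSE); the sketch only shows the forward projection and the error metric used to judge the result.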
Pages: 21493-21505 (13 pages)