Obstacle detection based on depth fusion of lidar and radar in challenging conditions

Cited by: 10
Authors
Xie, Guotao [1 ]
Zhang, Jing [1 ]
Tang, Junfeng [1 ]
Zhao, Hongfei [2 ]
Sun, Ning [1 ]
Hu, Manjiang [3 ]
Affiliations
[1] Hunan Univ, Changsha, Peoples R China
[2] 31605 Troops, Nanjing, Peoples R China
[3] Hunan Univ, Coll Mech & Vehicle Engn, Changsha, Peoples R China
Keywords
Obstacle detection; Challenging conditions; Depth fusion
DOI
10.1108/IR-12-2020-0271
CLC number
T [Industrial technology]
Subject classification code
08
Abstract
Purpose: For the industrial application of intelligent and connected vehicles (ICVs), the robustness and accuracy of environmental perception are critical in challenging conditions. However, perception accuracy is closely tied to the performance of the sensors configured on the vehicle. To further enhance sensor performance and thereby improve perception accuracy, this paper aims to introduce an obstacle detection method based on the depth fusion of lidar and radar in challenging conditions, which can reduce the false-detection rate caused by sensor misdetections.
Design/methodology/approach: First, a multi-layer self-calibration method is proposed based on the spatial and temporal relationships between the sensors. Next, a depth fusion model is proposed to improve obstacle detection performance in challenging conditions. Finally, tests are carried out in challenging conditions, including a straight unstructured road, an unstructured road with a rough surface, and an unstructured road with heavy dust or mist.
Findings: The experimental tests in challenging conditions demonstrate that, compared with using a single sensor, the depth fusion model can filter out radar false alarms and the dust or mist point clouds received by the lidar, so object detection accuracy is also improved under challenging conditions.
Originality/value: The multi-layer self-calibration method improves calibration accuracy and reduces the workload of manual calibration. The depth fusion model based on lidar and radar achieves high precision by filtering out radar false alarms and the dust or mist point clouds received by the lidar, which improves ICVs' performance in challenging conditions.
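The abstract does not give implementation details of the depth fusion model, but its core idea, cross-validating lidar clusters against radar returns so that radar false alarms without lidar support and lidar dust or mist clusters without radar support are both discarded, can be illustrated with a minimal sketch. The function name fuse_detections, the gate_radius and min_support_points parameters, and the 2-D bird's-eye-view representation below are illustrative assumptions and not the paper's actual method; the sketch also assumes both sensors are already calibrated into a common vehicle frame and time-aligned, which is the role of the paper's multi-layer self-calibration step.

```python
import numpy as np


def fuse_detections(lidar_clusters, radar_targets,
                    gate_radius=1.5, min_support_points=10):
    """Cross-validate lidar clusters against radar targets in a shared
    bird's-eye-view frame (assumed already calibrated and time-aligned).

    lidar_clusters: list of (N_i, 2) arrays of x-y points, one per cluster
    radar_targets:  (M, 2) array of radar target positions
    Returns a list of fused obstacle centroids (x, y).
    """
    radar_targets = np.asarray(radar_targets, dtype=float).reshape(-1, 2)
    fused = []
    if radar_targets.size == 0:
        return fused
    for cluster in lidar_clusters:
        cluster = np.asarray(cluster, dtype=float)
        centroid = cluster.mean(axis=0)
        # Radar corroboration: at least one radar target falls inside the
        # gate around the cluster centroid. A lidar cluster caused by dust
        # or mist typically has no matching radar return, so it fails this
        # check and is discarded.
        dists = np.linalg.norm(radar_targets - centroid, axis=1)
        has_radar_support = bool(np.any(dists < gate_radius))
        # Density check: genuine obstacles return many lidar points, whereas
        # an isolated radar false alarm has no dense lidar cluster nearby
        # and therefore never produces a fused detection.
        if has_radar_support and len(cluster) >= min_support_points:
            fused.append(tuple(centroid))
    return fused


# Hypothetical example: one dense cluster backed by a radar return is kept;
# a sparse dust-like cluster and an unsupported radar target are rejected.
rng = np.random.default_rng(0)
obstacle = rng.normal([10.0, 0.0], 0.3, size=(40, 2))  # real obstacle
dust = rng.normal([6.0, 3.0], 0.5, size=(5, 2))        # dust-like cluster
radar = np.array([[10.1, 0.2], [25.0, -4.0]])          # second is a false alarm
print(fuse_detections([obstacle, dust], radar))        # one centroid near (10, 0)
```

The gate-and-support thresholds are placeholders; in practice they would depend on the sensors' resolution, mounting geometry and the calibration accuracy reported by the self-calibration step.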
Pages: 792-802
Number of pages: 11