Environment recognition based on multi-sensor fusion for autonomous driving vehicles

Cited by: 12
Authors
Weon I.-S. [1 ]
Lee S.-G. [2 ]
Affiliations
[1] Department of Mechanical Engineering, Graduate School, Kyung Hee University
[2] Department of Mechanical Engineering, Kyung Hee University
Source
Journal of Institute of Control, Robotics and Systems | 2019 / Vol. 25 / No. 02
Keywords
Autonomous driving; Deep learning; Environment recognition; Sensor fusion; Unmanned vehicle
DOI
10.5302/J.ICROS.2019.18.0128
Abstract
Unmanned driving of an autonomous vehicle requires high reliability and excellent recognition performance for the road environment and driving situation. Since a single sensor cannot recognize various driving conditions precisely, a recognition system relying on only one sensor is unsuitable for autonomous driving due to the resulting uncertainty of recognition. In this study, we developed an autonomous vehicle using sensor fusion of radar, LIDAR, and vision data that are coordinate-corrected by GPS and IMU. Deep learning combined with sensor fusion improves the recognition rate of stationary objects in the driving environment, such as lanes, signs, and crosswalks, and accurately recognizes dynamic objects such as vehicles and pedestrians. Through real road tests, the unmanned autonomous driving technology developed in this research was verified to meet the reliability and stability requirements of the NHTSA Level 3 autonomy standard. © ICROS 2019.
Pages: 125-131
Page count: 6
Related papers
50 records in total
  • [21] An expandable multi-sensor data-fusion concept for autonomous driving in Urban environments
    Effertz, J.
    Journal of Aerospace Computing, Information and Communication, 2007, 4(12): 1108-1116
  • [22] Environmental perception and multi-sensor data fusion for off-road autonomous vehicles
    Xiang, ZY
    Özgüner, Ü
    2005 IEEE Intelligent Transportation Systems Conference (ITSC), 2005: 584-589
  • [23] Multi-sensor data fusion approach for terrain aided navigation of Autonomous Underwater Vehicles
    Kalyan, B
    Balasuriya, AP
    Oceans '04 MTS/IEEE Techno-Ocean '04, Conference Proceedings, Vols. 1-4, 2004: 2013-2018
  • [24] Feature Map Transformation for Multi-sensor Fusion in Object Detection Networks for Autonomous Driving
    Schröder, Enrico
    Braun, Sascha
    Mählisch, Mirko
    Vitay, Julien
    Hamker, Fred
    Advances in Computer Vision, Vol. 2, 2020, 944: 118-131
  • [25] Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles
    Jahromi, Babak Shahian
    Tulabandhula, Theja
    Cetin, Sabri
    Sensors, 2019, 19(20)
  • [26] Multi-Sensor Fusion for Navigation and Mapping in Autonomous Vehicles: Accurate Localization in Urban Environments
    Li Qingqing
    Queralta, Jorge Pena
    Tuan Nguyen Gia
    Zhuo Zou
    Westerlund, Tomi
    Unmanned Systems, 2020, 8(03): 229-237
  • [27] A Multi-Sensor Simulation Environment for Autonomous Cars
    Song, Rui
    Horridge, Paul
    Pemberton, Simon
    Wetherall, Jon
    Maskell, Simon
    Ralph, Jason
    2019 22nd International Conference on Information Fusion (FUSION 2019), 2019
  • [28] Research on object tracking and recognition based on multi-sensor fusion
    Chen Ying
    Sun Jian-fen
    Lei Liang
    Proceedings of the 2007 Chinese Control and Decision Conference, 2007: 245-248
  • [29] OpenCalib: A multi-sensor calibration toolbox for autonomous driving
    Yan, Guohang
    Liu, Zhuochun
    Wang, Chengjie
    Shi, Chunlei
    Wei, Pengjin
    Cai, Xinyu
    Ma, Tao
    Liu, Zhizheng
    Zhong, Zebin
    Liu, Yuqian
    Zhao, Ming
    Ma, Zheng
    Li, Yikang
    Software Impacts, 2022, 14
  • [30] Multi-sensor target recognition fusion based on fuzzy theory
    Han Feng
    Yang WanHai
    Yuan XiaoGuang
    ICEMI 2007: Proceedings of 2007 8th International Conference on Electronic Measurement & Instruments, Vol. IV, 2007: 64-68