Real-time localization measure and perception detection using multi-sensor fusion for Automated Guided Vehicles

Cited by: 0
Authors
Song, Di [1]
Tian, Guang-Mao [1]
Liu, Jiaqi [1]
Affiliations
[1] Harbin Univ Sci & Technol, Sch Automat, Harbin 150080, Peoples R China
Keywords
multi-sensor fusion; fully convolutional neural network; Kalman filter; Automated Guided Vehicles; EXTENDED KALMAN; LASER SCANNER; ODOMETRY;
DOI: not available
Chinese Library Classification (CLC): TP [automation technology, computer technology]
Discipline classification code: 0812
Abstract
Automated Guided Vehicles (AGVs) need to localize themselves reliably and perceive their environment accurately to perform their tasks efficiently. To that end, they rely on noisy sensor measurements that can yield erroneous estimates if used directly. To mitigate this issue, measurements from different kinds of sensors are generally combined. This paper presents a hybrid multi-sensor fusion pipeline that handles asynchronous measurements from a camera, LiDAR, odometry, and Inertial Measurement Units (IMUs). The hybrid fusion algorithm consists of two parts that run in parallel: the first combines camera and LiDAR data through a Fully Convolutional Neural Network (FCNx) architecture for classification and segmentation, while the second detects objects and tracks states by fusing odometry and IMU measurements with a Kalman filter. The developed algorithm was tested on an open-source multi-sensor navigation dataset and in real-time experiments with an AGV. Sensor fusion produced a smaller deviation from the actual trajectory than using a laser scanner alone. Furthermore, in every experiment, sensor fusion reduced the localization error during periods when the laser was unavailable, although the amount of improvement depended on the duration of the outage and on the motion characteristics.
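For illustration only, the following minimal Python sketch shows the general idea behind the odometry/IMU branch described in the abstract: a linear Kalman filter whose predict step propagates a constant-velocity state and whose update step absorbs asynchronous position (odometry) and velocity (IMU-derived) measurements. The state vector, measurement models, noise covariances, and sensor period are assumptions chosen for clarity; they are not taken from the paper.

# Minimal sketch (not the authors' code): a linear Kalman filter fusing
# noisy wheel-odometry position fixes with IMU-derived velocity, roughly
# in the spirit of the odometry/IMU branch described in the abstract.
# State: [x, y, vx, vy]; all noise covariances below are illustrative guesses.
import numpy as np

dt = 0.05                                  # assumed sensor period (s)
F = np.array([[1, 0, dt, 0],               # constant-velocity motion model
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]], dtype=float)
Q = 0.01 * np.eye(4)                       # process noise (tuning guess)
H_odo = np.array([[1, 0, 0, 0],            # odometry observes position
                  [0, 1, 0, 0]], dtype=float)
H_imu = np.array([[0, 0, 1, 0],            # integrated IMU observes velocity
                  [0, 0, 0, 1]], dtype=float)
R_odo = 0.05 * np.eye(2)                   # odometry measurement noise (guess)
R_imu = 0.10 * np.eye(2)                   # IMU measurement noise (guess)

x = np.zeros(4)                            # initial state estimate
P = np.eye(4)                              # initial covariance

def predict():
    """Propagate the state and covariance one time step."""
    global x, P
    x = F @ x
    P = F @ P @ F.T + Q

def update(z, H, R):
    """Standard Kalman update for one asynchronous measurement z."""
    global x, P
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P

# Measurements arrive asynchronously; each one triggers predict + update.
predict()
update(np.array([0.12, 0.03]), H_odo, R_odo)   # odometry position fix
predict()
update(np.array([2.0, 0.1]), H_imu, R_imu)     # IMU-derived velocity
print(x)                                       # fused state estimate

In practice each sensor would carry its own timestamp and the prediction horizon dt would be recomputed per measurement; the fixed dt here keeps the sketch short.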
Pages: 3219-3224
Number of pages: 6
Related papers (50 in total)
  • [1] Real-Time Hybrid Multi-Sensor Fusion Framework for Perception in Autonomous Vehicles
    Jahromi, Babak Shahian
    Tulabandhula, Theja
    Cetin, Sabri
    [J]. SENSORS, 2019, 19 (20)
  • [2] Real-Time Vehicles Tracking Based on Mobile Multi-Sensor Fusion
    Plangi, Siim
    Hadachi, Amnir
    Lind, Artjom
    Bensrhair, Abdelaziz
    [J]. IEEE SENSORS JOURNAL, 2018, 18 (24) : 10077 - 10084
  • [3] A Real-Time Multi-Sensor Fusion Platform for Automated Driving Application Development
    Bijlsma, Tjerk
    Kwakkernaat, Maurice
    Mnatsakanyan, Mari
    [J]. PROCEEDINGS 2015 IEEE INTERNATIONAL CONFERENCE ON INDUSTRIAL INFORMATICS (INDIN), 2015, : 1372 - 1377
  • [4] Multi-sensor fusion for real-time object tracking
    Verma, Sakshi
    Singh, Vishal K. K.
    [J]. MULTIMEDIA TOOLS AND APPLICATIONS, 2024, 83 (07) : 19563 - 19585
  • [5] Multi-sensor fusion for real-time object tracking
    Sakshi Verma
    Vishal K. Singh
    [J]. Multimedia Tools and Applications, 2024, 83 : 19563 - 19585
  • [6] Indoor localization for pedestrians with real-time capability using multi-sensor smartphones
    Ehrlich, Catia Real
    Blankenbach, Joerg
    [J]. GEO-SPATIAL INFORMATION SCIENCE, 2019, 22 (02) : 73 - 88
  • [7] A Real-Time Map Refinement Method Using a Multi-Sensor Localization Framework
    Delobel, Laurent
    Aufrere, Romuald
    Debain, Christophe
    Chapuis, Roland
    Chateau, Thierry
    [J]. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2019, 20 (05) : 1644 - 1658
  • [8] Multi-Sensor Data Fusion for Real-Time Surface Quality Control in Automated Machining Systems
    Garcia Plaza, E.
    Nunez Lopez, P. J.
    Beamud Gonzalez, E. M.
    [J]. SENSORS, 2018, 18 (12)
  • [9] Real-Time Sensor Anomaly Detection and Identification in Automated Vehicles
    van Wyk, Franco
    Wang, Yiyang
    Khojandi, Anahita
    Masoud, Neda
    [J]. IEEE TRANSACTIONS ON INTELLIGENT TRANSPORTATION SYSTEMS, 2020, 21 (03) : 1264 - 1276
  • [10] Real-time multi-sensor based vehicle detection using MINACE filters
    Topiwala, Pankaj
    Nehemiah, Avinash
    [J]. OPTICAL PATTERN RECOGNITION XVIII, 2007, 6574