Environment-Aware Sensor Fusion for Obstacle Detection

Cited by: 0
Authors
Romero, Adrian Rechy [1 ,2 ]
Borges, Paulo Vinicius Koerich [1 ]
Elfes, Alberto [1 ]
Pfrunder, Andreas [1 ,2 ]
Affiliations
[1] CSIRO, Autonomous Syst Lab, Canberra, ACT, Australia
[2] Swiss Fed Inst Technol, Autonomous Syst Lab, Zurich, Switzerland
Keywords
VISUAL TEACH; NAVIGATION; REPEAT;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Reliably detecting obstacles and identifying traversable areas is a key challenge in mobile robotics. For redundancy, information from multiple sensors is often fused. In this work we discuss how prior knowledge of the environment can improve the quality of sensor fusion, thereby increasing the performance of an obstacle detection module. We define a methodology to quantify the performance of obstacle detection sensors and algorithms. This information is used for environment-aware sensor fusion, where the fusion parameters depend on the past performance of each sensor in different parts of an operation site. The method is suitable for vehicles that operate in a known area, as is the case in many practical scenarios (warehouses, factories, mines, etc.). The system is "trained" by manually driving the robot through a suitable trajectory along the operational areas of a site. The performance of a sensor configuration is then measured based on the similarity between the manually driven trajectory and the trajectory that the path planner generates after detecting obstacles. Experiments are performed on an autonomous ground robot equipped with 2D laser sensors and a monocular camera with road detection capabilities. The results show an improvement in obstacle detection performance in comparison with a "naive" sensor fusion, illustrating the applicability of the method.
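The core idea in the abstract is that fusion parameters depend on each sensor's past performance in different parts of the operation site. A minimal sketch of that idea is shown below; the region names, performance scores, and the normalized-weight combination scheme are illustrative assumptions, not taken from the paper (which derives performance from trajectory similarity during a manually driven training run).

```python
import numpy as np

# Hypothetical per-region performance scores obtained during the "training" drive.
# Higher score = the sensor was more reliable in that part of the site.
region_performance = {
    "loading_dock": {"laser": 0.9, "camera": 0.4},
    "open_yard":    {"laser": 0.6, "camera": 0.8},
}

def fusion_weights(region, scores=region_performance):
    """Normalize per-sensor performance in a region into fusion weights."""
    s = scores[region]
    total = sum(s.values())
    return {sensor: v / total for sensor, v in s.items()}

def fuse_obstacle_probability(region, laser_prob, camera_prob):
    """Weighted combination of per-cell obstacle probabilities (assumed scheme)."""
    w = fusion_weights(region)
    return w["laser"] * np.asarray(laser_prob) + w["camera"] * np.asarray(camera_prob)

# Example: in the loading dock the laser dominates the fused estimate,
# while in the open yard the camera-based road detection carries more weight.
print(fuse_obstacle_probability("loading_dock", [0.2, 0.9], [0.1, 0.3]))
print(fuse_obstacle_probability("open_yard",    [0.2, 0.9], [0.1, 0.3]))
```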
Pages: 114-121
Number of pages: 8