A Multi-Modal Mapping Unit for Autonomous Exploration and Mapping of Underground Tunnels

Cited by: 0
Authors
Mascarich, Frank [1 ]
Khattak, Shehryar [1 ]
Papachristos, Christos [1 ]
Alexis, Kostas [1 ]
Affiliations
[1] Univ Nevada, Autonomous Robots Lab, Reno, NV 89557 USA
Keywords
DOI: not available
Chinese Library Classification (CLC): V [Aeronautics, Astronautics]
Discipline Code: 08; 0825
Abstract
The ability to perform autonomous exploration and mapping using aerial robots in GPS-denied, dark, visually degraded environments is critical in multiple applications, including inspection, exploration, search and rescue in tunnel environments, monitoring of mines, and mapping of underground voids. This paper presents the design of a multi-modal mapping unit tailored to such missions. When combined with our previously developed receding-horizon volumetric exploration path planner, the unit enables robotic autonomy in such environments without any prior knowledge of the environment. The multi-modal mapping unit detailed in this work tightly synchronizes visible-light cameras with inertial sensors, as well as with LEDs that flash only while the cameras' shutters are open. When used in conjunction with a visual-inertial odometry pipeline, the multi-modal mapping unit enables reliable robot navigation in darkness. With the further support of multiple miniature Time-of-Flight 3D depth sensors, dense and accurate maps are derived. The proposed system was evaluated in field experiments involving the exploration and mapping of a railroad tunnel in conditions of darkness. Results show that a consistent and dense map of this challenging, visually degraded environment was achieved.
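The abstract's central hardware idea, pulsing the LEDs in lockstep with the camera shutter and timestamping both against the inertial sensors, can be modeled with a short sketch. This is a minimal illustration of such a trigger scheme, not the authors' implementation; the function and type names below are assumptions:

```python
# Hedged sketch (not the paper's firmware): a software model of a trigger
# scheme in which LEDs flash only while the camera shutter is open.
# All names here are illustrative assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class TriggerEvent:
    t_us: int           # event timestamp in microseconds
    shutter_open: bool  # camera shutter state after this event
    led_on: bool        # LED state after this event

def generate_triggers(frame_period_us: int,
                      exposure_us: int,
                      n_frames: int) -> List[TriggerEvent]:
    """Emit one shutter-open/LED-on event and one shutter-close/LED-off
    event per frame, so the LED is lit exactly during the exposure window.
    The shared timestamps are what a visual-inertial odometry pipeline
    would use to associate each image with its inertial measurements."""
    events: List[TriggerEvent] = []
    for k in range(n_frames):
        t0 = k * frame_period_us
        events.append(TriggerEvent(t0, True, True))                  # open + flash
        events.append(TriggerEvent(t0 + exposure_us, False, False))  # close + dark
    return events

# Example: 20 Hz frames with a 5 ms exposure -> the LED is lit only 10%
# of the time, which keeps power draw and heat low while still fully
# illuminating every captured frame.
events = generate_triggers(frame_period_us=50_000, exposure_us=5_000, n_frames=3)
```

Gating the LEDs on the exposure window rather than driving them continuously is what makes high-intensity illumination practical on a small aerial robot: the duty cycle, and hence the average power, scales with exposure time over frame period.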
Pages: 7