A Multi-Modal Mapping Unit for Autonomous Exploration and Mapping of Underground Tunnels

Cited by: 0
Authors
Mascarich, Frank [1 ]
Khattak, Shehryar [1 ]
Papachristos, Christos [1 ]
Alexis, Kostas [1 ]
Affiliations
[1] Univ Nevada, Autonomous Robots Lab, Reno, NV 89557 USA
Keywords: (none listed)
DOI: Not available
Chinese Library Classification (CLC): V [Aeronautics, Astronautics]
Discipline Classification Codes: 08; 0825
Abstract
The ability of aerial robots to autonomously explore and map GPS-denied, dark, visually degraded environments is critical in multiple applications, including inspection, exploration, search and rescue in tunnel environments, the monitoring of mines, and the mapping of underground voids. This paper presents the design of a multi-modal mapping unit tailored to such missions. When combined with our previous work on the receding horizon volumetric exploration path planner, the unit enables robotic autonomy in such environments without any prior knowledge of the environment. The multi-modal mapping unit detailed in this work tightly synchronizes visible-light cameras with inertial sensors, as well as with LEDs that flash only when the cameras' shutters are open. When used in conjunction with a visual-inertial odometry pipeline, the multi-modal mapping unit enables reliable robot navigation in darkness. With the further support of multiple miniature Time-of-Flight 3D depth sensors, dense and accurate maps are derived. The proposed system was evaluated in field experiments involving the exploration and mapping of a railroad tunnel in conditions of darkness. The results show that consistent and dense mapping of such challenging degraded visual environments was achieved.
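To make the synchronization scheme concrete, the C sketch below simulates the kind of shared trigger schedule the abstract describes: the cameras and the IMU are driven from one clock, and each LED pulse is placed entirely inside the corresponding exposure window so the scene is lit only while a shutter is open. All rates, durations, and names here are illustrative assumptions, not the authors' published firmware or parameters.

/*
 * Hypothetical sketch of the camera/LED/IMU trigger timing described in
 * the abstract: cameras and IMU share a common clock, and the LED pulse
 * is contained inside the camera exposure window so the LEDs flash only
 * while the shutter is open.  All rates and durations are assumptions.
 */
#include <stdio.h>

#define IMU_RATE_HZ    200   /* assumed IMU sample rate             */
#define CAM_RATE_HZ     20   /* assumed camera frame rate           */
#define EXPOSURE_US   5000   /* assumed exposure time per frame     */
#define LED_PULSE_US  4000   /* LED pulse fits inside the exposure  */

int main(void)
{
    const long imu_period_us = 1000000L / IMU_RATE_HZ;
    const long cam_period_us = 1000000L / CAM_RATE_HZ;

    /* Emit one second of the shared trigger schedule. */
    for (long t = 0; t < 1000000L; t += imu_period_us) {
        printf("%8ld us  IMU sample\n", t);

        if (t % cam_period_us == 0) {
            /* Camera trigger: shutter opens now for EXPOSURE_US.     */
            printf("%8ld us  CAM trigger (exposure %d us)\n",
                   t, EXPOSURE_US);

            /* Center the LED pulse inside the exposure window so the
             * LEDs are lit only while the shutter is open.           */
            long led_on  = t + (EXPOSURE_US - LED_PULSE_US) / 2;
            long led_off = led_on + LED_PULSE_US;
            printf("%8ld us  LED on\n%8ld us  LED off\n",
                   led_on, led_off);
        }
    }
    return 0;
}

Centering the pulse within the exposure keeps the scene lit for most of the integration time while leaving margin for small trigger jitter on either side.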
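In the same hedged spirit, the following sketch illustrates how depth returns from the Time-of-Flight sensors could be accumulated into a volumetric map given an odometry pose. The yaw-only pose, the fabricated points, and the 0.1 m voxel size are assumptions for demonstration; the abstract does not specify the actual mapping pipeline.

/* Minimal illustration of volumetric mapping from ToF depth returns:
 * each sensor-frame 3D point is transformed into the world frame via
 * the current odometry pose and accumulated into a coarse voxel grid.
 * Pose, points, and resolution are made-up values for demonstration. */
#include <math.h>
#include <stdio.h>

#define VOXEL_SIZE 0.1   /* assumed map resolution in meters */
#define GRID_DIM   64    /* assumed grid extent per axis     */

static int hits[GRID_DIM][GRID_DIM][GRID_DIM];

/* Insert one sensor-frame point given robot yaw and position. */
static void insert_point(double yaw, double tx, double ty, double tz,
                         double px, double py, double pz)
{
    /* Rotate about z (yaw-only pose for brevity), then translate. */
    double wx = cos(yaw) * px - sin(yaw) * py + tx;
    double wy = sin(yaw) * px + cos(yaw) * py + ty;
    double wz = pz + tz;

    int ix = (int)floor(wx / VOXEL_SIZE) + GRID_DIM / 2;
    int iy = (int)floor(wy / VOXEL_SIZE) + GRID_DIM / 2;
    int iz = (int)floor(wz / VOXEL_SIZE) + GRID_DIM / 2;
    if (ix >= 0 && ix < GRID_DIM && iy >= 0 && iy < GRID_DIM &&
        iz >= 0 && iz < GRID_DIM)
        hits[ix][iy][iz]++;   /* count returns per voxel */
}

int main(void)
{
    /* A few fabricated depth returns seen from a fabricated pose. */
    insert_point(0.3, 1.0, 0.5, 0.2,  2.0, 0.1, 0.0);
    insert_point(0.3, 1.0, 0.5, 0.2,  2.0, 0.2, 0.1);
    insert_point(0.3, 1.0, 0.5, 0.2,  2.1, 0.1, 0.0);

    /* Report occupied voxels (any voxel with at least one return). */
    for (int x = 0; x < GRID_DIM; x++)
        for (int y = 0; y < GRID_DIM; y++)
            for (int z = 0; z < GRID_DIM; z++)
                if (hits[x][y][z] > 0)
                    printf("voxel (%d,%d,%d): %d hits\n",
                           x, y, z, hits[x][y][z]);
    return 0;
}

A real pipeline would use a full SE(3) pose and a probabilistic occupancy update in place of the yaw-only transform and raw hit counts, but the binning of posed depth returns into voxels is the core idea.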
Pages: 7
Related Papers (50 in total)
  • [21] Strategies for Multi-Modal Scene Exploration
    Bohg, Jeannette
    Johnson-Roberson, Matthew
    Bjorkman, Marten
    Kragic, Danica
    IEEE/RSJ 2010 INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS 2010), 2010, : 4509 - 4515
  • [22] Autonomous Exploration and Mapping of Unknown Environments
    Owens, Jason
    Osteen, Phil
    Fields, MaryAnne
    UNMANNED SYSTEMS TECHNOLOGY XIV, 2012, 8387
  • [23] Orientation-Aware Multi-Modal Learning for Road Intersection Identification and Mapping
    He, Qibin
    Xiao, Zhongyang
    Huang, Ze
    Yuan, Hongyuan
    Sun, Li
    2024 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA 2024), 2024, : 16185 - 16191
  • [24] Multi-Modal Detection and Mapping of Static and Dynamic Obstacles in Agriculture for Process Evaluation
    Korthals, Timo
    Kragh, Mikkel
    Christiansen, Peter
    Karstoft, Henrik
    Jorgensen, Rasmus N.
    Ruckert, Ulrich
    FRONTIERS IN ROBOTICS AND AI, 2018, 5
  • [25] Autonomous exploration and mapping in an unknown environment
    Ip, YL
    Rad, AB
    Wong, YK
    PROCEEDINGS OF THE 2004 INTERNATIONAL CONFERENCE ON MACHINE LEARNING AND CYBERNETICS, VOLS 1-7, 2004, : 4194 - 4199
  • [26] An integrated software solution for multi-modal mapping of morphological and functional ocular data
    Troeger, E.
    Sliesoraityte, I.
    Issa, P. Charbel
    Scholl, H. P. N.
    Zrenner, E.
    Wilke, R.
    2010 ANNUAL INTERNATIONAL CONFERENCE OF THE IEEE ENGINEERING IN MEDICINE AND BIOLOGY SOCIETY (EMBC), 2010, : 6280 - 6283
  • [27] Multi-Modal Intraoperative Electrophysiological Mapping Of The Anterior Nucleus Of The Thalamus In Refractory Epilepsy
    Imbach, L. L.
    Baumann, C.
    Oertel, M.
    Stieglitz, L.
    EPILEPSIA, 2019, 60 : 101 - 101
  • [28] Exploration and Mapping with Autonomous Robot Teams
    Olson, Edwin
    Strom, Johannes
Goeddel, Rob
    Morton, Ryan
Ranganathan, Pradeep
    Richardson, Andrew
    COMMUNICATIONS OF THE ACM, 2013, 56 (03) : 62 - 70
  • [29] Autonomous Exploration and Mapping of Flooded Sinkholes
    Fairfield, Nathaniel
    Kantor, George
    Jonak, Dominic
    Wettergreen, David
    INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH, 2010, 29 (06): : 748 - 774
  • [30] A Multi-modal Sensor Array for Safe Human-Robot Interaction and Mapping
    Abah, Colette
    Orekhov, Andrew L.
    Johnston, Garrison L. H.
    Yin, Peng
    Choset, Howie
    Simaan, Nabil
    2019 INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2019, : 3768 - 3774