The MADMAX data set for visual-inertial rover navigation on Mars

Cited by: 21
Authors
Meyer, Lukas [1 ]
Smisek, Michal [1 ]
Villacampa, Alejandro Fontan [1 ]
Maza, Laura Oliva [1 ]
Medina, Daniel [2 ]
Schuster, Martin J. [1 ]
Steidle, Florian [1 ]
Vayugundla, Mallikarjuna [1 ]
Mueller, Marcus G. [1 ]
Rebele, Bernhard [3 ]
Wedler, Armin [1 ]
Triebel, Rudolph [1 ]
Affiliations
[1] German Aerosp Ctr DLR, Inst Robot & Mechatron, Dept Percept & Cognit, Muenchener Str 20, D-82234 Wessling, Germany
[2] German Aerosp Ctr DLR, Inst Commun & Nav, Dept Naut Syst, Neustrelitz, Germany
[3] German Aerosp Ctr DLR, Inst Robot & Mechatron, Anal & Control Adv Robot Syst, Wessling, Germany
Funding
EU Horizon 2020;
Keywords
exploration; extreme environments; navigation; planetary robotics; SLAM;
DOI
10.1002/rob.22016
Chinese Library Classification
TP24 [Robotics];
Discipline Classification Codes
080202; 1405;
Abstract
Planetary rovers increasingly rely on vision-based components for autonomous navigation and mapping. Developing and testing these components requires representative optical conditions, which can be achieved either by field testing at planetary analog sites on Earth or by using prerecorded data sets from such locations. However, representative data are scarce, and field testing at planetary analog sites requires a substantial financial investment and logistical overhead and entails the risk of damaging complex robotic systems. To address these issues, we use our compact, human-portable DLR Sensor Unit for Planetary Exploration Rovers (SUPER) in the Moroccan desert to demonstrate resource-efficient field testing, and we make the resulting Morocco-Acquired data set of Mars-Analog eXploration (MADMAX) publicly accessible. The data set consists of 36 navigation experiments, captured at eight Mars analog sites with widely varying environmental conditions. Its longest trajectory covers 1.5 km, and the combined trajectory length is 9.2 km. The data set contains time-stamped recordings from monochrome stereo cameras, a color camera, omnidirectional cameras in stereo configuration, and an inertial measurement unit. Additionally, we provide ground truth in position and orientation together with the associated uncertainties, obtained by a real-time kinematic (RTK) based algorithm that fuses global navigation satellite system (GNSS) data from two body-mounted antennas. Finally, we run two state-of-the-art navigation algorithms, ORB-SLAM2 and VINS-Mono, on our data to evaluate their accuracy and to provide a baseline that can serve as an accuracy and robustness reference for other navigation algorithms. The data set can be accessed at .
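The baseline evaluation described in the abstract, comparing estimated trajectories from ORB-SLAM2 or VINS-Mono against the RTK ground truth, is typically based on the absolute trajectory error (ATE) after rigid alignment. The sketch below is illustrative only and is not taken from the paper: it assumes time-associated Nx3 position arrays and uses a standard Umeyama-style least-squares alignment (rotation and translation, no scale) before computing the RMSE.

```python
import numpy as np

def rigid_alignment(est, gt):
    """Least-squares rigid alignment (rotation R, translation t) of
    estimated positions to ground truth, following Umeyama (1991).
    est, gt: (N, 3) arrays of time-associated positions."""
    mu_e, mu_g = est.mean(axis=0), gt.mean(axis=0)
    # Cross-covariance of centered point sets.
    cov = (gt - mu_g).T @ (est - mu_e) / est.shape[0]
    U, _, Vt = np.linalg.svd(cov)
    # Correct for a possible reflection in the SVD solution.
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    return R, t

def ate_rmse(est, gt):
    """RMSE of the absolute trajectory error after rigid alignment."""
    R, t = rigid_alignment(est, gt)
    aligned = est @ R.T + t
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))
```

The function names and the choice of a pure rigid (scale-free) alignment are assumptions for illustration; monocular pipelines such as VINS-Mono are often evaluated with an additional scale factor in the alignment.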
Pages: 833-853
Page count: 21