The MADMAX data set for visual-inertial rover navigation on Mars

Cited by: 21
Authors
Meyer, Lukas [1 ]
Smisek, Michal [1 ]
Villacampa, Alejandro Fontan [1 ]
Maza, Laura Oliva [1 ]
Medina, Daniel [2 ]
Schuster, Martin J. [1 ]
Steidle, Florian [1 ]
Vayugundla, Mallikarjuna [1 ]
Mueller, Marcus G. [1 ]
Rebele, Bernhard [3 ]
Wedler, Armin [1 ]
Triebel, Rudolph [1 ]
Affiliations
[1] German Aerospace Center (DLR), Institute of Robotics and Mechatronics, Department of Perception and Cognition, Münchener Str. 20, 82234 Wessling, Germany
[2] German Aerospace Center (DLR), Institute of Communications and Navigation, Department of Nautical Systems, Neustrelitz, Germany
[3] German Aerospace Center (DLR), Institute of Robotics and Mechatronics, Analysis and Control of Advanced Robotic Systems, Wessling, Germany
Funding
EU Horizon 2020
Keywords
exploration; extreme environments; navigation; planetary robotics; SLAM;
DOI
10.1002/rob.22016
Chinese Library Classification
TP24 [Robotics]
Discipline codes
080202; 1405
Abstract
Planetary rovers increasingly rely on vision-based components for autonomous navigation and mapping. Developing and testing these components requires representative optical conditions, which can be achieved either by field testing at planetary analog sites on Earth or by using prerecorded data sets from such locations. However, representative data are scarce, and field testing at planetary analog sites demands substantial financial investment and logistical overhead while risking damage to complex robotic systems. To address these issues, we deploy our compact, human-portable DLR Sensor Unit for Planetary Exploration Rovers (SUPER) in the Moroccan desert to demonstrate resource-efficient field testing, and we make the resulting Morocco-Acquired data set of Mars-Analog eXploration (MADMAX) publicly accessible. The data set consists of 36 navigation experiments captured at eight Mars analog sites with widely varying environmental conditions. Its longest trajectory covers 1.5 km, and the combined trajectory length is 9.2 km. The data set contains time-stamped recordings from monochrome stereo cameras, a color camera, omnidirectional cameras in stereo configuration, and an inertial measurement unit. Additionally, we provide ground truth in position and orientation, together with the associated uncertainties, obtained by a real-time kinematic-based algorithm that fuses the global navigation satellite system data of two body antennas. Finally, we run two state-of-the-art navigation algorithms, ORB-SLAM2 and VINS-Mono, on our data to evaluate their accuracy and to provide a baseline that can serve as an accuracy and robustness reference for other navigation algorithms. The data set can be accessed at .
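Because the abstract reports baseline runs of ORB-SLAM2 and VINS-Mono against the RTK ground truth, it may help to sketch how such a baseline is typically scored: the estimated trajectory is time-associated with the ground truth, rigidly aligned, and summarized by the absolute trajectory error (ATE). The Python sketch below is a minimal illustration under that assumption; the array names, the 20 ms association tolerance, and the commented usage are illustrative assumptions, not part of any MADMAX-specific API.

# Minimal sketch: absolute trajectory error (ATE) against RTK ground truth.
# Assumes est_t/gt_t are sorted 1-D timestamp arrays (seconds) and
# est_xyz/gt_xyz are (N, 3) position arrays already loaded from the data set.
import numpy as np

def associate(est_t, gt_t, max_dt=0.02):
    # Match each estimate to the nearest ground-truth timestamp.
    idx = np.clip(np.searchsorted(gt_t, est_t), 1, len(gt_t) - 1)
    idx[np.abs(gt_t[idx - 1] - est_t) < np.abs(gt_t[idx] - est_t)] -= 1
    ok = np.abs(gt_t[idx] - est_t) <= max_dt  # drop pairs with large time gaps
    return np.nonzero(ok)[0], idx[ok]

def ate_rmse(est_xyz, gt_xyz):
    # Closed-form rigid (SE(3)) alignment (Horn/Umeyama), then residual RMSE.
    mu_e, mu_g = est_xyz.mean(axis=0), gt_xyz.mean(axis=0)
    H = (est_xyz - mu_e).T @ (gt_xyz - mu_g)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = mu_g - R @ mu_e
    residuals = gt_xyz - (est_xyz @ R.T + t)
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))

# e.g., ATE of a (hypothetical) ORB-SLAM2 run on one MADMAX sequence:
# i_est, i_gt = associate(est_t, gt_t)
# print("ATE RMSE [m]:", ate_rmse(est_xyz[i_est], gt_xyz[i_gt]))

For visual-inertial estimators such as VINS-Mono, gravity makes roll and pitch observable, so a 4-DoF alignment (yaw plus translation) is often preferred over the full rigid alignment shown here.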
Pages: 833-853 (21 pages)