The MADMAX data set for visual-inertial rover navigation on Mars

Cited by: 21
Authors
Meyer, Lukas [1 ]
Smisek, Michal [1 ]
Villacampa, Alejandro Fontan [1 ]
Maza, Laura Oliva [1 ]
Medina, Daniel [2 ]
Schuster, Martin J. [1 ]
Steidle, Florian [1 ]
Vayugundla, Mallikarjuna [1 ]
Mueller, Marcus G. [1 ]
Rebele, Bernhard [3 ]
Wedler, Armin [1 ]
Triebel, Rudolph [1 ]
Affiliations
[1] German Aerosp Ctr DLR, Inst Robot & Mechatron, Dept Percept & Cognit, Muenchener Str 20, D-82234 Wessling, Germany
[2] German Aerosp Ctr DLR, Inst Commun & Nav, Dept Naut Syst, Neustrelitz, Germany
[3] German Aerosp Ctr DLR, Inst Robot & Mechatron, Anal & Control Adv Robot Syst, Wessling, Germany
Funding
European Union Horizon 2020;
Keywords
exploration; extreme environments; navigation; planetary robotics; SLAM;
DOI
10.1002/rob.22016
Chinese Library Classification
TP24 [Robotics];
Discipline Codes
080202; 1405;
Abstract
Planetary rovers increasingly rely on vision-based components for autonomous navigation and mapping. Developing and testing these components requires representative optical conditions, which can be achieved by either field testing at planetary analog sites on Earth or using prerecorded data sets from such locations. However, the availability of representative data is scarce and field testing in planetary analog sites requires a substantial financial investment and logistical overhead, and it entails the risk of damaging complex robotic systems. To address these issues, we use our compact human-portable DLR Sensor Unit for Planetary Exploration Rovers (SUPER) in the Moroccan desert to show resource-efficient field testing and make the resulting Morocco-Acquired data set of Mars-Analog eXploration (MADMAX) publicly accessible. The data set consists of 36 different navigation experiments, captured at eight Mars analog sites of widely varying environmental conditions. Its longest trajectory covers 1.5 km and the combined trajectory length is 9.2 km. The data set contains time-stamped recordings from monochrome stereo cameras, a color camera, omnidirectional cameras in stereo configuration, and from an inertial measurement unit. Additionally, we provide the ground truth in position and orientation together with the associated uncertainties, obtained by a real-time kinematic-based algorithm that fuses the global navigation satellite system data of two body antennas. Finally, we run two state-of-the-art navigation algorithms, ORB-SLAM2 and VINS-mono, on our data to evaluate their accuracy and to provide a baseline, which can be used as a performance reference of accuracy and robustness for other navigation algorithms. The data set can be accessed at .
Pages: 833-853
Page count: 21