MARS-LVIG dataset: A multi-sensor aerial robots SLAM dataset for LiDAR-visual-inertial-GNSS fusion

Cited by: 7
Authors
Li, Haotian [1 ]
Zou, Yuying [1 ]
Chen, Nan [1 ]
Lin, Jiarong [1 ]
Liu, Xiyuan [1 ]
Xu, Wei [1 ]
Zheng, Chunran [1 ]
Li, Rundong [1 ]
He, Dongjiao [1 ]
Kong, Fanze [1 ]
Cai, Yixi [1 ]
Liu, Zheng [1 ]
Zhou, Shunbo [2 ]
Xue, Kaiwen [2 ]
Zhang, Fu [1 ,3 ]
Affiliations
[1] Univ Hong Kong, Dept Mech Engn, Hong Kong, Peoples R China
[2] Huawei Cloud Comp Technol Co Ltd, Huawei Cloud Comp Tech Innovat Dept, Guian, Peoples R China
[3] Univ Hong Kong, Dept Mechan Engn, Mechatron & Robot Syst Lab, Pokfulam, HW 7-18, Hong Kong, Peoples R China
Source
INTERNATIONAL JOURNAL OF ROBOTICS RESEARCH | 2024, Vol. 43, No. 8
Keywords
Dataset; aerial robots; multi-sensor fusion; LiDAR; camera; Simultaneous Localization and Mapping; Global Navigation Satellite System; Inertial Measurement Unit; URBAN DATASET; ROBUST; LOCALIZATION; BENCHMARK; ODOMETRY; CAMERA; READY;
DOI
10.1177/02783649241227968
CLC number
TP24 [Robotics]
Subject classification code
080202; 1405
Abstract
In recent years, advancements in Light Detection and Ranging (LiDAR) technology have made 3D LiDAR sensors more compact, lightweight, and affordable. This progress has spurred interest in integrating LiDAR with sensors such as Inertial Measurement Units (IMUs) and cameras for Simultaneous Localization and Mapping (SLAM) research. Public datasets covering different scenarios, platforms, and viewpoints are crucial for multi-sensor fusion SLAM studies, yet most focus on handheld or vehicle-mounted devices with front or 360-degree views. Data from aerial vehicles with downward-looking views is scarce, and the existing relevant datasets usually feature low flight altitudes and are mostly limited to small campus environments. To fill this gap, we introduce the Multi-sensor Aerial Robots SLAM dataset (MARS-LVIG dataset), providing unique aerial downward-looking LiDAR-Visual-Inertial-GNSS data with viewpoints from altitudes between 80 m and 130 m. The dataset not only offers new aspects for testing and evaluating existing SLAM algorithms, but also poses new challenges that can facilitate the research and development of more advanced SLAM algorithms. The MARS-LVIG dataset contains 21 sequences, acquired across diversified large-area environments including an aero-model airfield, an island, a rural town, and a valley. Within these sequences, the UAV flies at speeds varying from 3 m/s to 12 m/s, with a scanning area reaching up to 577,000 m² and a maximum path length of 7.148 km in a single flight. The dataset encapsulates data collected by a lightweight, hardware-synchronized sensor package that includes a solid-state 3D LiDAR, a global-shutter RGB camera, IMUs, and a Global Navigation Satellite System (GNSS) receiver providing raw messages. For algorithm evaluation, the dataset releases ground truth for both localization and mapping, acquired from on-board Real-time Kinematic (RTK) positioning and a DJI L1 (post-processed with its companion software, DJI Terra), respectively.
The dataset can be downloaded from: https://mars.hku.hk/dataset.html.
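Since the record highlights RTK localization ground truth for algorithm evaluation, a common workflow is to compute the absolute trajectory error (ATE) of an estimator against it. Below is a minimal, hedged sketch of that evaluation step; the function names, the timestamp-association tolerance, and the assumption that poses arrive as timestamped 3D positions are illustrative and not taken from the dataset's documentation:

```python
import numpy as np

def associate(t_est, t_gt, max_dt=0.02):
    """Match each estimated timestamp to the nearest ground-truth one.

    Returns indices into t_est and t_gt for pairs closer than max_dt seconds.
    Assumes both timestamp arrays are sorted ascending.
    """
    idx = np.searchsorted(t_gt, t_est)
    idx = np.clip(idx, 1, len(t_gt) - 1)
    prev = idx - 1
    # Pick whichever neighbor (left or right) is temporally closer.
    idx = np.where(np.abs(t_gt[prev] - t_est) < np.abs(t_gt[idx] - t_est), prev, idx)
    mask = np.abs(t_gt[idx] - t_est) <= max_dt
    return np.nonzero(mask)[0], idx[mask]

def ate_rmse(p_est, p_gt):
    """Translation ATE RMSE after rigid (Umeyama, rotation + translation) alignment.

    p_est, p_gt: (N, 3) arrays of associated estimated / ground-truth positions.
    """
    mu_e, mu_g = p_est.mean(0), p_gt.mean(0)
    cov = (p_gt - mu_g).T @ (p_est - mu_e) / len(p_est)
    U, _, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:  # guard against reflections
        S[2, 2] = -1.0
    R = U @ S @ Vt
    t = mu_g - R @ mu_e
    err = p_gt - (p_est @ R.T + t)
    return float(np.sqrt((err ** 2).sum(axis=1).mean()))
```

The rigid alignment removes the arbitrary choice of starting frame between an odometry estimate and the RTK (geodetic) frame, so the RMSE reflects drift rather than a constant offset.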
Pages: 1114-1127
Page count: 14
Related papers
34 records in total (items [21]-[30] shown)
  • [21] Multi-Sensor 6-DoF Localization For Aerial Robots In Complex GNSS-Denied Environments
    Paneque, J. L.
    Martinez-de Dios, J. R.
    Ollero, A.
    2019 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2019, : 1978 - 1984
  • [22] A Smart Helmet Framework Based on Visual-Inertial SLAM and Multi-Sensor Fusion to Improve Situational Awareness and Reduce Hazards in Mountaineering
    Tan, Charles Shi
    INTERNATIONAL JOURNAL OF SOFTWARE SCIENCE AND COMPUTATIONAL INTELLIGENCE-IJSSCI, 2023, 15 (01)
  • [23] Industrial Environment Multi-Sensor Dataset for Vehicle Indoor Tracking with Wi-Fi, Inertial and Odometry Data
    Silva, Ivo
    Pendao, Cristiano
    Torres-Sospedra, Joaquin
    Moreira, Adriano
    DATA, 2023, 8 (10)
  • [24] Design of visual inertial state estimator for autonomous systems via multi-sensor fusion approach
    He, Shenghuang
    Li, Yanzhou
    Lu, Yongkang
    Liu, Yishan
    MECHATRONICS, 2023, 95
  • [25] Vehicle Detection and Attribution from a Multi-Sensor Dataset Using a Rule-Based Approach Combined with Data Fusion
    Bowman, Lindsey A.
    Narayanan, Ram M.
    Kane, Timothy J.
    Bradley, Eliza S.
    Baran, Matthew S.
    SENSORS, 2023, 23 (21)
  • [26] IFAL-SLAM: an approach to inertial-centered multi-sensor fusion, factor graph optimization, and adaptive Lagrangian method
    Liu, Jiaming
    Qi, Yongsheng
    Yuan, Guoshuai
    Liu, Liqiang
    Li, Yongting
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (01)
  • [27] Multi-unmanned Aerial Vehicle Pose Estimation Based on Visual-inertial-range Sensor Fusion
    Choi, J.
    Marsim, K. C.
    Jeong, M.
    Ryoo, K.
    Kim, J.
    Myung, H.
    Journal of Institute of Control, Robotics and Systems, 2023, 29 (11) : 859 - 865
  • [28] Towards Smarter Positioning through Analyzing Raw GNSS and Multi-Sensor Data from Android Devices: A Dataset and an Open-Source Application
    Grenier, Antoine
    Lohan, Elena Simona
    Ometov, Aleksandr
    Nurmi, Jari
    ELECTRONICS, 2023, 12 (23)
  • [29] Improved Multi-Sensor Fusion Positioning System Based on GNSS/LiDAR/Vision/IMU With Semi-Tight Coupling and Graph Optimization in GNSS Challenging Environments
    Zhu, Jiaming
    Zhou, Han
    Wang, Ziyi
    Yang, Suli
    IEEE ACCESS, 2023, 11 : 95711 - 95723
  • [30] Underwater multi-sensor fusion localization with visual-inertial-depth using hybrid residuals and efficient loop closing
    Ding, Shuoshuo
    Zhang, Tiedong
    Li, Ye
    Xu, Shuo
    Lei, Ming
    MEASUREMENT, 2024, 238