D2SLAM: Decentralized and Distributed Collaborative Visual-Inertial SLAM System for Aerial Swarm

Cited by: 5
Authors
Xu, Hao [1 ]
Liu, Peize [1 ]
Chen, Xinyi [1 ]
Shen, Shaojie [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
Keywords
Simultaneous localization and mapping; Robots; Location awareness; State estimation; Accuracy; Optimization; Task analysis; Aerial systems: perception and autonomy; multirobot systems; simultaneous localization and mapping (SLAM); swarms; ROBUST; VERSATILE; IMAGE; SYNC
DOI
10.1109/TRO.2024.3422003
CLC classification
TP24 [Robotics]
Subject classification
080202; 1405
Abstract
Collaborative simultaneous localization and mapping (CSLAM) is essential for autonomous aerial swarms, laying the foundation for downstream algorithms, such as planning and control. To address existing CSLAM systems' limitations in relative localization accuracy, crucial for close-range UAV collaboration, this article introduces D2SLAM, a novel decentralized and distributed CSLAM system. D2SLAM innovatively manages near-field estimation for precise relative state estimation in proximity and far-field estimation for consistent global trajectories. Its adaptable front end supports both stereo and omnidirectional cameras, catering to various operational needs and overcoming field-of-view challenges in aerial swarms. Experiments demonstrate D2SLAM's effectiveness in accurate ego-motion estimation, relative localization, and global consistency. Enhanced by distributed optimization algorithms, D2SLAM exhibits remarkable scalability and resilience to network delays, making it well suited for a wide range of real-world aerial swarm applications. We believe the adaptability and proven performance of D2SLAM signify a notable advancement in autonomous aerial swarm technology.
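As a purely illustrative sketch (not code from the paper), the near-field/far-field split described in the abstract can be pictured as partitioning neighbor UAVs by distance, routing close neighbors to precise relative estimation and distant ones to globally consistent trajectory estimation. The threshold value and all names below are assumptions for illustration only:

```python
import math

# Hypothetical threshold (not from the paper): neighbors within this
# radius are handled by precise near-field relative state estimation,
# the rest by far-field estimation for consistent global trajectories.
NEAR_FIELD_RADIUS_M = 5.0

def partition_neighbors(own_pos, neighbor_positions):
    """Split neighbor UAVs into near-field and far-field sets by
    Euclidean distance from this UAV's current position estimate."""
    near, far = [], []
    for uav_id, pos in neighbor_positions.items():
        dist = math.dist(own_pos, pos)
        (near if dist <= NEAR_FIELD_RADIUS_M else far).append(uav_id)
    return near, far

# Example: one close neighbor, one distant neighbor.
near, far = partition_neighbors(
    (0.0, 0.0, 1.0),
    {"uav2": (2.0, 1.0, 1.0), "uav3": (20.0, 5.0, 2.0)},
)
```

The actual system estimates these distances online as part of its state estimation; this sketch only shows the conceptual partition.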
Pages: 3445-3464
Page count: 20
Related papers
50 records in total
  • [31] Xie H.; Chen W.; Fan Y.; Wang J. Visual-inertial SLAM in featureless environments on lunar surface. Hangkong Xuebao/Acta Aeronautica et Astronautica Sinica, 2021, 42(01).
  • [32] Amrani, Ahderraouf; Wang, Hesheng. Accurate Initialization Method for Monocular Visual-Inertial SLAM. 2019 3rd International Symposium on Autonomous Systems (ISAS 2019), 2019: 159-164.
  • [33] Zhao, Lin; Wang, Xiaohan; Zheng, Xiaoze; Jia, Chun. Research on Visual-Inertial SLAM Technology with GNSS Assistance. China Satellite Navigation Conference Proceedings (CSNC 2022), Vol. II, 2022, 909: 425-434.
  • [34] Sartipi, Kourosh; Do, Tien; Ke, Tong; Vuong, Khiem; Roumeliotis, Stergios I. Deep Depth Estimation from Visual-Inertial SLAM. 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2020: 10038-10045.
  • [35] Schwaab, M.; Brohammer, E.; Manoli, Y. SLAM for Direct Optimization based Visual-Inertial Fusion. 2018 DGON Inertial Sensors and Systems (ISS), 2018.
  • [36] Zhang, Heng; Huang, Ran; Yuan, Liang. Robust Indoor Visual-Inertial SLAM with Pedestrian Detection. 2021 IEEE International Conference on Robotics and Biomimetics (IEEE-ROBIO 2021), 2021: 802-807.
  • [37] Chang, Tsang-Kai; Pogue, Alexandra; Mehta, Ankur. BOEM-SLAM: A Block Online EM Algorithm for the Visual-Inertial SLAM Backend. 2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2022: 6420-6427.
  • [38] Patel, Manthan; Karrer, Marco; Baenninger, Philipp; Chli, Margarita. COVINS-G: A Generic Back-end for Collaborative Visual-Inertial SLAM. 2023 IEEE International Conference on Robotics and Automation (ICRA), 2023: 2076-2082.
  • [39] Wang, Fenghua; Zhao, Lengrui; Xu, Zhicheng; Liang, Hong; Zhang, Qian. LDVI-SLAM: a lightweight monocular visual-inertial SLAM system for dynamic environments based on motion constraints. Measurement Science and Technology, 2024, 35(12).
  • [40] Teng, Zhaoyu; Han, Bin; Cao, Jie; Hao, Qun; Tang, Xin; Li, Zhaoyang. PLI-SLAM: A Tightly-Coupled Stereo Visual-Inertial SLAM System with Point and Line Features. Remote Sensing, 2023, 15(19).