D2SLAM: Decentralized and Distributed Collaborative Visual-Inertial SLAM System for Aerial Swarm

Cited by: 5
|
Authors
Xu, Hao [1 ]
Liu, Peize [1 ]
Chen, Xinyi [1 ]
Shen, Shaojie [1 ]
Affiliations
[1] Hong Kong Univ Sci & Technol, Dept Elect & Comp Engn, Hong Kong, Peoples R China
Keywords
Simultaneous localization and mapping; Robots; Location awareness; State estimation; Accuracy; Optimization; Task analysis; Aerial systems: perception and autonomy; multirobot systems; simultaneous localization and mapping (SLAM); swarms; ROBUST; VERSATILE; IMAGE; SYNC;
DOI
10.1109/TRO.2024.3422003
Chinese Library Classification (CLC)
TP24 [Robotics];
Discipline codes
080202 ; 1405 ;
Abstract
Collaborative simultaneous localization and mapping (CSLAM) is essential for autonomous aerial swarms, laying the foundation for downstream algorithms such as planning and control. To address existing CSLAM systems' limitations in relative localization accuracy, which is crucial for close-range UAV collaboration, this article introduces D2SLAM, a novel decentralized and distributed CSLAM system. D2SLAM manages near-field estimation for precise relative state estimation in proximity and far-field estimation for consistent global trajectories. Its adaptable front-end supports both stereo and omnidirectional cameras, catering to various operational needs and overcoming field-of-view challenges in aerial swarms. Experiments demonstrate D2SLAM's effectiveness in accurate ego-motion estimation, relative localization, and global consistency. Enhanced by distributed optimization algorithms, D2SLAM exhibits remarkable scalability and resilience to network delays, making it well suited for a wide range of real-world aerial swarm applications. We believe the adaptability and proven performance of D2SLAM signify a notable advancement in autonomous aerial swarm technology.
Pages: 3445-3464
Page count: 20