CORB2I-SLAM: An Adaptive Collaborative Visual-Inertial SLAM for Multiple Robots

Cited by: 8

Authors:
Saha, Arindam [1 ]
Dhara, Bibhas Chandra [1 ]
Umer, Saiyed [2 ]
AlZubi, Ahmad Ali [3 ]
Alanazi, Jazem Mutared [4 ]
Yurii, Kulakov [5 ]
Affiliations:
[1] Jadavpur Univ, Dept Informat Technol, Kolkata 700098, India
[2] Aliah Univ, Dept Comp Sci & Engn, Kolkata 700156, India
[3] King Saud Univ, Comp Sci Dept, Riyadh 11437, Saudi Arabia
[4] King Saud Univ, Community Coll, Comp Sci Dept, Riyadh 11437, Saudi Arabia
[5] Natl Tech Univ Ukraine, Dept Comp Engn, Igor Sikorsky Kyiv Polytech Inst, UA-03056 Kiev, Ukraine
Keywords:
Visual-Inertial Odometry; visual-inertial SLAM; collaborative SLAM; multi-map SLAM; client-server architecture; heterogeneous camera configuration; ROBUST; LOCALIZATION; EXTRACTION
DOI: 10.3390/electronics11182814
CLC Classification: TP [Automation Technology, Computer Technology]
Subject Classification Code: 0812
Abstract:
The generation of robust global maps of an unknown cluttered environment through a collaborative robotic framework is challenging. We present a collaborative SLAM framework, CORB2I-SLAM, in which each participating robot carries a camera (monocular/stereo/RGB-D) and an inertial sensor to run onboard odometry. A centralized server stores all the maps and executes processor-intensive tasks, e.g., loop closing, map merging, and global optimization. The proposed framework uses well-established Visual-Inertial Odometry (VIO) and can be adapted to use Visual Odometry (VO) when the measurements from the inertial sensors are noisy. The proposed system mitigates common failure modes of odometry-based systems, such as erroneous pose estimation caused by incorrect feature selection and track loss under abrupt camera motion, and provides more accurate results. We perform feasibility tests on real robot autonomy and extensively validate the accuracy of CORB2I-SLAM on benchmark data sequences. We also evaluate its scalability with respect to the number of participating robots and its applicability in terms of network requirements.
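The abstract describes an adaptive mechanism in which a robot falls back from VIO to pure VO when its inertial measurements are too noisy. The paper does not specify the criterion; the sketch below is a minimal illustration of one plausible approach, assuming a variance check over a window of accelerometer magnitudes taken while the robot is roughly stationary. The function name, threshold, and window size are hypothetical, not taken from the paper.

```python
import statistics

# Illustrative (assumed) tuning constants, not from the paper:
NOISE_THRESHOLD = 0.05  # accelerometer variance threshold, (m/s^2)^2
WINDOW = 200            # number of recent IMU samples to inspect

def select_odometry_mode(accel_magnitudes):
    """Return 'VIO' if recent accelerometer noise looks acceptable, else 'VO'.

    accel_magnitudes: scalar accelerometer magnitudes sampled over a window
    in which the robot is assumed to be (roughly) stationary, so variance
    around gravity approximates sensor noise.
    """
    window = accel_magnitudes[-WINDOW:]
    if len(window) < 2:
        return "VO"  # too little inertial data to trust the IMU
    noise = statistics.variance(window)
    return "VIO" if noise < NOISE_THRESHOLD else "VO"
```

A client-side front end could run such a check periodically and switch the odometry mode it reports to the central server; the actual CORB2I-SLAM criterion may differ.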
Pages: 15
Related Papers (50 in total)
  • [41] VI-NeRF-SLAM: a real-time visual-inertial SLAM with NeRF mapping
    Liao, Daoqing
    Ai, Wei
    JOURNAL OF REAL-TIME IMAGE PROCESSING, 2024, 21 (02)
  • [42] LL-VI SLAM: enhanced visual-inertial SLAM for low-light environments
    Ma, Tianbing
    Li, Liang
    Du, Fei
    Shu, Jinxin
    Li, Changpeng
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (01)
  • [43] A Robust Parallel Initialization Method for Monocular Visual-Inertial SLAM
    Zhong, Min
    Yao, Yiqing
    Xu, Xiaosu
    Wei, Hongyu
    SENSORS, 2022, 22 (21)
  • [44] Visual-Inertial SLAM for a Small Helicopter in Large Outdoor Environments
    Achtelik, Markus W.
    Lynen, Simon
    Weiss, Stephan
    Kneip, Laurent
    Chli, Margarita
    Siegwart, Roland
    2012 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2012, : 2651 - +
  • [45] Monocular Visual-Inertial SLAM: Continuous Preintegration and Reliable Initialization
    Liu, Yi
    Chen, Zhong
    Zheng, Wenjuan
    Wang, Hao
    Liu, Jianguo
    SENSORS, 2017, 17 (11)
  • [46] GO-SLAM: GPS-Aided Visual-Inertial SLAM for Adaptive UAV Navigation in Outdoor-Indoor Environments
    Wicaksono, Muhammad
    Shin, Soo Young
    IEEE TRANSACTIONS ON INSTRUMENTATION AND MEASUREMENT, 2025, 74
  • [47] Scale Estimation and Refinement in Monocular Visual-Inertial SLAM System
    Mu, Xufu
    Chen, Jing
    Leng, Zhen
    Lin, Songnan
    Huang, Ningsheng
    IMAGE AND GRAPHICS (ICIG 2017), PT I, 2017, 10666 : 533 - 544
  • [48] Integration of Sonar and Visual-Inertial Systems for SLAM in Underwater Environments
    Zhang, Jiawei
    Han, Fenglei
    Han, Duanfeng
    Yang, Jianfeng
    Zhao, Wangyuan
    Li, Hansheng
    IEEE SENSORS JOURNAL, 2024, 24 (10) : 16792 - 16804
  • [49] A Look at Improving Robustness in Visual-inertial SLAM by Moment Matching
    Solin, Arno
    Li, Rui
    Pilzer, Andrea
    2022 25TH INTERNATIONAL CONFERENCE ON INFORMATION FUSION (FUSION 2022), 2022,
  • [50] Keyframe-Based Visual-Inertial Online SLAM with Relocalization
    Kasyanov, Anton
    Engelmann, Francis
    Stueckler, Joerg
    Leibe, Bastian
    2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2017, : 6662 - 6669