TransFusionOdom: Transformer-Based LiDAR-Inertial Fusion Odometry Estimation

Cited by: 14
Authors
Sun, Leyuan [1 ,2 ]
Ding, Guanqun [3 ]
Qiu, Yue [2 ]
Yoshiyasu, Yusuke [2 ]
Kanehiro, Fumio [1 ,4 ]
Affiliations
[1] Natl Inst Adv Ind Sci & Technol, CNRS AIST Joint Robot Lab JRL, IRL, Tsukuba 3058560, Japan
[2] Natl Inst Adv Ind Sci & Technol, Comp Vis Res Team, Artificial Intelligence Res Ctr AIRC, Tsukuba 3058560, Japan
[3] Natl Inst Adv Ind Sci & Technol, Digital Architecture Res Ctr DigiARC, Tokyo 1350064, Japan
[4] Univ Tsukuba, Grad Sch Sci & Technol, Dept Intelligent & Mech Interact Syst, Tsukuba 3050006, Japan
Funding
Japan Society for the Promotion of Science (JSPS)
Keywords
Attention mechanisms; LiDAR-inertial odometry (LIO); multimodal learning; sensor data fusion; transformer; ROBUST; DEPTH; CNN;
DOI
10.1109/JSEN.2023.3302401
Chinese Library Classification (CLC)
TM [Electrical Engineering]; TN [Electronics and Communication Technology]
Discipline codes
0808; 0809
Abstract
Multimodal sensor fusion is a widely used approach to enhance the performance of odometry estimation, a fundamental module for mobile robots. Learning-based approaches have recently garnered attention in this field due to their robust, non-handcrafted designs. However, how to perform fusion among different modalities in a supervised sensor-fusion odometry estimation task remains a challenging open question. Simple operations, such as elementwise summation and concatenation, are not capable of assigning adaptive attentional weights to incorporate different modalities efficiently, which makes it difficult to achieve competitive odometry results. Meanwhile, the Transformer architecture has shown potential for multimodal fusion tasks, particularly in vision-language domains. In this work, we propose an end-to-end supervised Transformer-based LiDAR-inertial fusion framework (TransFusionOdom) for odometry estimation. The multiattention fusion module demonstrates different fusion approaches for homogeneous and heterogeneous modalities to address the overfitting that can arise from blindly increasing model complexity. Additionally, to interpret the learning process of the Transformer-based multimodal interactions, a general visualization approach is introduced to illustrate the interactions between modalities. Moreover, exhaustive ablation studies evaluate different multimodal fusion strategies to verify the performance of the proposed fusion strategy. A synthetic multimodal dataset is made public to validate the generalization ability of the proposed fusion strategy, which also works for other combinations of modalities. Quantitative and qualitative odometry evaluations on the KITTI dataset verify that the proposed TransFusionOdom achieves superior performance compared with other learning-based related works.
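The abstract contrasts fixed-weight fusion (elementwise summation or concatenation) with attention-based fusion that assigns adaptive weights to each modality. The following is a minimal NumPy sketch of that distinction, not the paper's actual architecture: a single cross-attention head where the LiDAR feature queries both modality features, so each modality receives a learned, input-dependent weight. All names (`W_q`, `W_k`, `W_v`, feature dimensions) are illustrative assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def concat_fusion(lidar_feat, imu_feat):
    # Naive fusion: both modalities contribute with fixed, equal weight,
    # regardless of their reliability for the current input.
    return np.concatenate([lidar_feat, imu_feat], axis=-1)

def attention_fusion(lidar_feat, imu_feat, W_q, W_k, W_v):
    # Single-head cross-attention sketch: the LiDAR feature acts as the
    # query over the stacked modality features, so each modality gets an
    # adaptive attentional weight instead of a fixed one.
    feats = np.stack([lidar_feat, imu_feat], axis=0)   # (2, d)
    q = lidar_feat @ W_q                               # (d,)
    k = feats @ W_k                                    # (2, d)
    v = feats @ W_v                                    # (2, d)
    scores = k @ q / np.sqrt(q.shape[-1])              # (2,) similarity scores
    weights = softmax(scores)                          # adaptive modality weights
    fused = weights @ v                                # (d,) weighted combination
    return fused, weights

# Usage with random features and projections (illustrative only).
rng = np.random.default_rng(0)
d = 8
lidar = rng.normal(size=d)
imu = rng.normal(size=d)
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
fused, weights = attention_fusion(lidar, imu, W_q, W_k, W_v)
```

In a trained model the projections `W_q`, `W_k`, `W_v` are learned end-to-end, so the modality weights adapt per input; concatenation offers no such mechanism, which is the shortcoming the abstract points out.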
Pages: 22064-22079
Page count: 16
Related papers
50 items total (10 shown)
  • [1] UnDeepLIO: Unsupervised Deep Lidar-Inertial Odometry. Tu, Yiming; Xie, Jin. PATTERN RECOGNITION, ACPR 2021, PT II, 2022, 13189: 189-202.
  • [2] A LiDAR-inertial Odometry with Principled Uncertainty Modeling. Jiang, Binqian; Shen, Shaojie. 2022 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), 2022: 13292-13299.
  • [3] FMCW-LIO: A Doppler LiDAR-Inertial Odometry. Zhao, Mingle; Wang, Jiahao; Gao, Tianxiao; Xu, Chengzhong; Kong, Hui. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (06): 5727-5734.
  • [4] LOG-LIO: A LiDAR-Inertial Odometry With Efficient Local Geometric Information Estimation. Huang, Kai; Zhao, Junqiao; Zhu, Zhongyang; Ye, Chen; Feng, Tiantian. IEEE ROBOTICS AND AUTOMATION LETTERS, 2024, 9 (01): 459-466.
  • [5] Factor Graph Accelerator for LiDAR-Inertial Odometry (Invited Paper). Hao, Yuhui; Yu, Bo; Liu, Qiang; Liu, Shaoshan; Zhu, Yuhao. 2022 IEEE/ACM INTERNATIONAL CONFERENCE ON COMPUTER AIDED DESIGN, ICCAD, 2022.
  • [6] Lightweight and Fast Matching Method for LiDAR-Inertial Odometry and Mapping. Li, Chuanjiang; Hu, Ziwei; Zhu, Yanfei; Ji, Xingzhao; Zhang, Chongming; Qi, Ziming. INTERNATIONAL JOURNAL OF ROBOTICS & AUTOMATION, 2024, 39 (05): 338-348.
  • [7] DAMS-LIO: A Degeneration-Aware and Modular Sensor-Fusion LiDAR-inertial Odometry. Hai, Fuzhang; Zheng, Han; Huang, Wenjun; Xiong, Rong; Wang, Yue; Jiao, Yanmei. 2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA, 2023: 2745-2751.
  • [8] Tightly Coupled LiDAR-Inertial Odometry and Mapping for Underground Environments. Chen, Jianhong; Wang, Hongwei; Yang, Shan. SENSORS, 2023, 23 (15).
  • [9] Swarm-LIO: Decentralized Swarm LiDAR-inertial Odometry. Zhu, Fangcheng; Ren, Yunfan; Kong, Fanze; Wu, Huajie; Liang, Siqi; Chen, Nan; Xu, Wei; Zhang, Fu. 2023 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA, 2023: 3254-3260.
  • [10] SW-LIO: A Sliding Window Based Tightly Coupled LiDAR-Inertial Odometry. Wang, Zelin; Liu, Xu; Yang, Limin; Gao, Feng. IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (10): 6675-6682.