TransFusionOdom: Transformer-Based LiDAR-Inertial Fusion Odometry Estimation

Cited by: 14
Authors
Sun, Leyuan [1 ,2 ]
Ding, Guanqun [3 ]
Qiu, Yue [2 ]
Yoshiyasu, Yusuke [2 ]
Kanehiro, Fumio [1 ,4 ]
Affiliations
[1] Natl Inst Adv Ind Sci & Technol, CNRS AIST Joint Robot Lab JRL, IRL, Tsukuba 3058560, Japan
[2] Natl Inst Adv Ind Sci & Technol, Comp Vis Res Team, Artificial Intelligence Res Ctr AIRC, Tsukuba 3058560, Japan
[3] Natl Inst Adv Ind Sci & Technol, Digital Architecture Res Ctr DigiARC, Tokyo 1350064, Japan
[4] Univ Tsukuba, Grad Sch Sci & Technol, Dept Intelligent & Mech Interact Syst, Tsukuba 3050006, Japan
Funding
Japan Society for the Promotion of Science (JSPS);
Keywords
Attention mechanisms; LiDAR-inertial odometry (LIO); multimodal learning; sensor data fusion; transformer; ROBUST; DEPTH; CNN;
DOI
10.1109/JSEN.2023.3302401
CLC classification
TM [Electrical technology]; TN [Electronic and communication technology];
Discipline codes
0808; 0809;
Abstract
Multimodal sensor fusion is a common approach to improving odometry estimation, a fundamental module for mobile robots. Learning-based approaches have recently garnered attention in this field because their designs are robust and avoid hand-crafted features. However, how to perform fusion among different modalities in a supervised sensor fusion odometry estimation task remains a challenging open question. Simple operations, such as elementwise summation and concatenation, cannot assign adaptive attentional weights to incorporate different modalities efficiently, which makes it difficult to achieve competitive odometry results. Meanwhile, the Transformer architecture has shown potential for multimodal fusion tasks, particularly in vision-language domains. In this work, we propose an end-to-end supervised Transformer-based LiDAR-inertial fusion framework, TransFusionOdom, for odometry estimation. The multiattention fusion module applies different fusion approaches to homogeneous and heterogeneous modalities to address the overfitting that can arise from blindly increasing model complexity. Additionally, to interpret the learning process of the Transformer-based multimodal interactions, a general visualization approach is introduced to illustrate the interactions between modalities. Moreover, exhaustive ablation studies evaluate different multimodal fusion strategies to verify the performance of the proposed fusion strategy. A synthetic multimodal dataset is made public to validate the generalization ability of the proposed fusion strategy, which also works for other combinations of modalities. Quantitative and qualitative odometry evaluations on the KITTI dataset verify that the proposed TransFusionOdom achieves superior performance compared with other learning-based related works.
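The abstract's core claim is that concatenation or summation cannot weight modalities adaptively, whereas attention-based fusion can. The following NumPy sketch illustrates that contrast in the simplest possible form; it is not the paper's actual architecture, and all names (`attention_fusion`, `w_score`) and the fixed scoring vector standing in for trained weights are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def concat_fusion(lidar_feat, imu_feat):
    # Baseline: plain concatenation assigns no adaptive importance
    # to either modality and doubles the feature dimension.
    return np.concatenate([lidar_feat, imu_feat])

def attention_fusion(lidar_feat, imu_feat, w_score):
    # Score each modality with a projection (here a fixed vector
    # stands in for learned weights), then blend the features with
    # the resulting softmax attention weights.
    feats = np.stack([lidar_feat, imu_feat])  # (2, d)
    scores = feats @ w_score                  # one scalar per modality
    alpha = softmax(scores)                   # adaptive weights, sum to 1
    return alpha @ feats                      # weighted fusion, shape (d,)

rng = np.random.default_rng(0)
lidar = rng.standard_normal(8)
imu = rng.standard_normal(8)
w = rng.standard_normal(8)

fused = attention_fusion(lidar, imu, w)
print(fused.shape)                       # (8,) -- same size as each input
print(concat_fusion(lidar, imu).shape)   # (16,)
```

Because the weights `alpha` depend on the input features themselves, a more reliable modality can dominate the fused representation on a per-sample basis, which fixed-weight operations like summation cannot do.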
Pages: 22064-22079
Page count: 16
Related papers
50 records in total
  • [21] Tightly-coupled Lidar-inertial Odometry and Mapping in Real Time
    Dai, Wei
    Tian, Bailing
    Chen, Hongming
    PROCEEDINGS OF THE 39TH CHINESE CONTROL CONFERENCE, 2020, : 3258 - 3263
  • [22] RTLIO: Real-Time LiDAR-Inertial Odometry and Mapping for UAVs
    Yang, Jung-Cheng
    Lin, Chun-Jung
    You, Bing-Yuan
    Yan, Yin-Long
    Cheng, Teng-Hu
    SENSORS, 2021, 21 (12)
  • [23] Graph-based LiDAR-Inertial SLAM Enhanced by Loosely-Coupled Visual Odometry
    Hulchuk, Vsevolod
    Bayer, Jan
    Faigl, Jan
    2023 EUROPEAN CONFERENCE ON MOBILE ROBOTS, ECMR, 2023, : 278 - 285
  • [24] Ground-LIO: enhanced LiDAR-inertial odometry for ground robots based on ground optimization
    Zhu, Housheng
    Zou, Chunlong
    Yun, Juntong
    Jiang, Du
    Huang, Li
    Liu, Ying
    Tao, Bo
    Xie, Yuanmin
    MEASUREMENT SCIENCE AND TECHNOLOGY, 2025, 36 (01)
  • [25] Tightly-Coupled LiDAR-inertial Odometry for Wheel-based Skid Steering UGV
    Li, Mengkai
    Wang, Lei
    Ren, Wenhu
    Liu, Qi
    Liu, Chaoyang
    2022 IEEE 17TH CONFERENCE ON INDUSTRIAL ELECTRONICS AND APPLICATIONS (ICIEA), 2022, : 510 - 516
  • [26] Lmapping: tightly-coupled LiDAR-inertial odometry and mapping for degraded environments
    Zou, Jingliang
    Shao, Liang
    Tang, Heshen
    Chen, Huangsong
    Bao, Haoran
    Pan, Xiaoming
    INTELLIGENT SERVICE ROBOTICS, 2023, 16 (05) : 583 - 597
  • [27] R-LIOM: Reflectivity-Aware LiDAR-Inertial Odometry and Mapping
    Dong, Yanchao
    Li, Lingxiao
    Xu, Sixiong
    Li, Wenxuan
    Li, Jinsong
    Zhang, Yahe
    He, Bin
    IEEE ROBOTICS AND AUTOMATION LETTERS, 2023, 8 (11) : 7743 - 7750
  • [28] DY-LIO: Tightly Coupled LiDAR-Inertial Odometry for Dynamic Environments
    Zou, Jingliang
    Chen, Huangsong
    Shao, Liang
    Bao, Haoran
    Tang, Hesheng
    Xiang, Jiawei
    Liu, Jun
    IEEE SENSORS JOURNAL, 2024, 24 (21) : 34756 - 34765
  • [30] VE-LIOM: A Versatile and Efficient LiDAR-Inertial Odometry and Mapping System
    Gao, Yuhang
    Zhao, Long
    REMOTE SENSING, 2024, 16 (15)