Visual-Inertial and Leg Odometry Fusion for Dynamic Locomotion

Cited by: 3
Authors
Dhedin, Victor [1 ]
Li, Haolong [1 ]
Khorshidi, Shahram [2 ]
Mack, Lukas [1 ]
Ravi, Adithya Kumar Chinnakkonda [1 ,2 ]
Meduri, Avadesh [3 ]
Shah, Paarth [4 ]
Grimminger, Felix [5 ]
Righetti, Ludovic [2 ,3 ]
Khadiv, Majid [2 ]
Stueckler, Joerg [1 ]
Affiliations
[1] Max Planck Inst Intelligent Syst, Embodied Vis Grp, Tubingen, Germany
[2] Max Planck Inst Intelligent Syst, Movement Generat & Control Grp, Tubingen, Germany
[3] NYU, Tandon Sch Engn, New York, NY 10003 USA
[4] Univ Oxford, Oxford Robot Inst, Oxford, England
[5] Max Planck Inst Intelligent Syst, Autonomous Mot Dept, Tubingen, Germany
DOI
10.1109/ICRA48891.2023.10160898
CLC Number
TP [Automation Technology, Computer Technology]
Subject Classification Code
0812
Abstract
Implementing dynamic locomotion behaviors on legged robots requires a high-quality state estimation module. Especially when the motion includes flight phases, state-of-the-art approaches fail to produce reliable estimates of the robot posture, in particular the base height. In this paper, we propose a novel approach for combining visual-inertial odometry (VIO) with leg odometry in an extended Kalman filter (EKF)-based state estimator. The VIO module uses a stereo camera and an IMU to yield low-drift 3D position and yaw orientation, and drift-free pitch and roll orientation, of the robot base link in the inertial frame. However, these estimates arrive with considerable latency due to image processing and optimization, and at an update rate too low for low-level control. To reduce the latency, we predict the VIO state estimate forward at the rate of the IMU measurements of the VIO sensor. The EKF module takes the base pose and linear velocity predicted by VIO, fuses them with measurements from a second high-rate IMU and leg odometry, and produces robot state estimates at the high frequency and low latency required for control. We integrate this lightweight estimation framework with a nonlinear model predictive controller and demonstrate a set of agile locomotion behaviors, including trotting and jumping at varying horizontal speeds, on a torque-controlled quadruped robot.
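
The pipeline described in the abstract reduces to two steps that run between consecutive VIO updates: strapdown propagation of the last VIO estimate with high-rate IMU samples to hide the VIO latency, followed by an EKF correction using leg-odometry velocity measurements. Below is a minimal NumPy sketch of those two steps, assuming a bias-free IMU model, a reduced [position, velocity] filter state, and illustrative noise values; it is an illustration of the structure, not the authors' implementation.

# Minimal sketch (assumptions: NumPy, bias-free IMU, illustrative noise values).
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])

def so3_exp(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-9:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def propagate_imu(p, v, R, acc, gyro, dt):
    """Strapdown integration of one IMU sample: predicts the latest VIO
    base pose forward in time to hide the VIO latency."""
    a_world = R @ acc + GRAVITY              # specific force in world frame
    p_next = p + v * dt + 0.5 * a_world * dt ** 2
    v_next = v + a_world * dt
    R_next = R @ so3_exp(gyro * dt)
    return p_next, v_next, R_next

def ekf_update_leg_velocity(x, P, z_vel, R_meas):
    """EKF correction with a leg-odometry base-velocity measurement.
    State x = [position(3), velocity(3)]; H selects the velocity block."""
    H = np.hstack([np.zeros((3, 3)), np.eye(3)])
    S = H @ P @ H.T + R_meas                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x = x + K @ (z_vel - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Usage: one 1 kHz IMU prediction step, then one leg-odometry correction.
p, v, R = np.zeros(3), np.zeros(3), np.eye(3)
p, v, R = propagate_imu(p, v, R,
                        acc=np.array([0.0, 0.0, 9.81]),  # at rest: cancels gravity
                        gyro=np.zeros(3), dt=1e-3)
x, P = np.concatenate([p, v]), 0.01 * np.eye(6)
x, P = ekf_update_leg_velocity(x, P, z_vel=np.zeros(3), R_meas=1e-3 * np.eye(3))

In the paper's full estimator the filter state also carries orientation, and the VIO pose itself enters as a further measurement; the reduced state here only illustrates the predict/correct structure.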
Pages: 9966-9972
Page count: 7
Related Papers
50 records in total
  • [31] Stereo Event-Based Visual-Inertial Odometry
    Wang, Kunfeng
    Zhao, Kaichun
    Lu, Wenshuai
    You, Zheng
    SENSORS, 2025, 25 (03)
  • [32] Using Vanishing Points to Improve Visual-Inertial Odometry
    Camposeco, Federico
    Pollefeys, Marc
    2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA), 2015, : 5219 - 5225
  • [33] Visual-Inertial Odometry of Smartphone under Manhattan World
    Wang, YuAn
    Chen, Liang
    Wei, Peng
    Lu, XiangChen
    REMOTE SENSING, 2020, 12 (22) : 1 - 27
  • [34] Active Heading Planning for Improving Visual-Inertial Odometry
    Lee, Joohyuk
    Lee, Kyuman
    2024 INTERNATIONAL CONFERENCE ON UNMANNED AIRCRAFT SYSTEMS, ICUAS, 2024, : 1085 - 1092
  • [35] Dense Visual-Inertial Odometry for Tracking of Aggressive Motions
    Ling, Yonggen
    Shen, Shaojie
    2015 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND BIOMIMETICS (ROBIO), 2015, : 576 - 583
  • [36] A Stereo-Based Visual-Inertial Odometry for SLAM
    Li, Yong
    Lang, ShiBing
    2019 CHINESE AUTOMATION CONGRESS (CAC2019), 2019, : 594 - 598
  • [37] A Monocular Visual-Inertial Odometry Based on Hybrid Residuals
    Lai, Zhenghong
    Gui, Jianjun
    Xu, Dengke
    Dong, Hongbin
    Deng, Baosong
    2020 IEEE INTERNATIONAL CONFERENCE ON SYSTEMS, MAN, AND CYBERNETICS (SMC), 2020, : 3304 - 3311
  • [38] Renormalization for Initialization of Rolling Shutter Visual-Inertial Odometry
    Micusik, Branislav
    Evangelidis, Georgios
    International Journal of Computer Vision, 2021, 129 : 2011 - 2027
  • [39] Control-enabled Observability in Visual-Inertial Odometry
    Bai, He
    Taylor, Clark N.
    2017 INTERNATIONAL CONFERENCE ON UNMANNED AIRCRAFT SYSTEMS (ICUAS'17), 2017, : 822 - 829
  • [40] Continuous-Time Spline Visual-Inertial Odometry
    Mo, Jiawei
    Sattar, Junaed
    2022 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION, ICRA 2022, 2022, : 9492 - 9498