
Visual-Inertial and Leg Odometry Fusion for Dynamic Locomotion

Implementing dynamic locomotion behaviors on legged robots requires a high-quality state estimation module. Especially when the motion includes flight phases, state-of-the-art approaches fail to produce reliable estimates of the robot posture, in particular the base height. In this paper, we propose a novel approach for combining visual-inertial odometry (VIO) with leg odometry in an extended Kalman filter (EKF) based state estimator. The VIO module uses a stereo camera and an IMU to yield low-drift 3D position and yaw orientation and drift-free pitch and roll orientation of the robot base link in the inertial frame. However, these estimates arrive with considerable latency due to image processing and optimization, and their update rate is too low for low-level control. To reduce the latency, we predict the VIO state estimate at the rate of the IMU measurements of the VIO sensor. The EKF module takes the base pose and linear velocity predicted by VIO, fuses them with a second high-rate IMU and leg odometry measurements, and produces robot state estimates at high frequency and with small latency, suitable for control. We integrate this lightweight estimation framework with a nonlinear model predictive controller and show the successful execution of a set of agile locomotion behaviors, including trotting and jumping at varying horizontal speeds, on a torque-controlled quadruped robot.

Author(s): Dhédin, Victor and Li, Haolong and Khorshidi, Shahram and Mack, Lukas and Ravi, Adithya Kumar Chinnakkonda and Meduri, Avadesh and Shah, Parth and Grimminger, Felix and Righetti, Ludovic and Khadiv, Majid and Stueckler, Joerg
Book Title: Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)
Year: 2023
Bibtex Type: Conference Paper (inproceedings)
DOI: 10.1109/ICRA48891.2023.10160898
State: Published
URL: https://doi.org/10.1109/ICRA48891.2023.10160898

BibTeX

@inproceedings{dhedin2022vioonlegs,
  title = {Visual-Inertial and Leg Odometry Fusion for Dynamic Locomotion},
  booktitle = {Proceedings of the IEEE International Conference on Robotics and Automation (ICRA)},
  abstract = {Implementing dynamic locomotion behaviors on legged robots requires a high-quality state estimation module. Especially when the motion includes flight phases, state-of-the-art approaches fail to produce reliable estimation of the robot posture, in particular base height. In this paper, we propose a novel approach for combining visual-inertial odometry (VIO) with leg odometry in an extended Kalman filter (EKF) based state estimator. The VIO module uses a stereo camera and IMU to yield low-drift 3D position and yaw orientation and drift-free pitch and roll orientation of the robot base link in the inertial frame. However, these values have a considerable amount of latency due to image processing and optimization, while the rate of update is quite low which is not suitable for low-level control. To reduce the latency, we predict the VIO state estimate at the rate of the IMU measurements of the VIO sensor. The EKF module uses the base pose and linear velocity predicted by VIO, fuses them further with a second high-rate IMU and leg odometry measurements, and produces robot state estimates with a high frequency and small latency suitable for control. We integrate this lightweight estimation framework with a nonlinear model predictive controller and show successful implementation of a set of agile locomotion behaviors, including trotting and jumping at varying horizontal speeds, on a torque-controlled quadruped robot.},
  year = {2023},
  slug = {dhedin2022vioonlegs},
  author = {Dhédin, Victor and Li, Haolong and Khorshidi, Shahram and Mack, Lukas and Ravi, Adithya Kumar Chinnakkonda and Meduri, Avadesh and Shah, Parth and Grimminger, Felix and Righetti, Ludovic and Khadiv, Majid and Stueckler, Joerg},
  url = {https://doi.org/10.1109/ICRA48891.2023.10160898}
}