Members

[Member list: Director, Postdoctoral, Affiliated, and Doctoral Researchers of the Autonomous Motion department, with affiliations in Perceiving Systems, Intelligent Control Systems, and Empirical Inference]

Publications

Autonomous Motion Article Real-time Perception meets Reactive Motion Generation Kappler, D., Meier, F., Issac, J., Mainprice, J., Garcia Cifuentes, C., Wüthrich, M., Berenz, V., Schaal, S., Ratliff, N., Bohg, J. IEEE Robotics and Automation Letters, 3(3):1864-1871, July 2018 (Published)
We address the challenging problem of robotic grasping and manipulation in the presence of uncertainty. This uncertainty is due to noisy sensing, inaccurate models and hard-to-predict environment dynamics. Our approach emphasizes the importance of continuous, real-time perception and its tight integration with reactive motion generation methods. We present a fully integrated system in which real-time object and robot tracking, as well as ambient world modeling, provide the necessary input to feedback controllers and continuous motion optimizers. Specifically, these perception modules provide attractive and repulsive potentials, based on which the controllers and motion optimizer compute movement policies online at different time scales. We extensively evaluate the proposed system on a real robotic platform in four scenarios that exhibit either challenging workspace geometry or a dynamic environment. We compare the proposed integrated system with a more traditional sense-plan-act approach that is still widely used. In 333 experiments, we show the robustness and accuracy of the proposed system.
arXiv Video Video DOI URL BibTeX
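The attractive and repulsive potentials mentioned in the abstract can be pictured with a classic potential-field velocity controller. This is a generic sketch, not the paper's controller; `reactive_velocity`, its gains, and the obstacle influence radius are illustrative assumptions:

```python
import numpy as np

def reactive_velocity(x, goal, obstacles, k_att=1.0, k_rep=0.5, influence=1.0):
    """Velocity command from attractive and repulsive potentials.

    Generic potential-field sketch (not the paper's controller): the
    goal attracts the end-effector, nearby obstacles repel it.
    """
    # Attractive term: negative gradient of 0.5 * k_att * ||x - goal||^2
    v = -k_att * (x - goal)
    for obs in obstacles:
        diff = x - obs
        d = np.linalg.norm(diff)
        if 0.0 < d < influence:
            # Repulsive term grows sharply as the obstacle gets closer
            v += k_rep * (1.0 / d - 1.0 / influence) * diff / d**3
    return v
```

Iterating this at the control rate and integrating the returned velocity yields the kind of reactive behavior the abstract describes, with perception continuously refreshing the goal and obstacle positions.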

Autonomous Motion Article Probabilistic Articulated Real-Time Tracking for Robot Manipulation Garcia Cifuentes, C., Issac, J., Wüthrich, M., Schaal, S., Bohg, J. IEEE Robotics and Automation Letters (RA-L), 2(2):577-584, April 2017 (Published)
We propose a probabilistic filtering method which fuses joint measurements with depth images to yield a precise, real-time estimate of the end-effector pose in the camera frame. This avoids the need for frame transformations when using it in combination with visual object tracking methods. Precision is achieved by modeling and correcting biases in the joint measurements as well as inaccuracies in the robot model, such as poor extrinsic camera calibration. We make our method computationally efficient through a principled combination of Kalman filtering of the joint measurements and asynchronous depth-image updates based on the Coordinate Particle Filter. We quantitatively evaluate our approach on a dataset recorded from a real robotic platform, annotated with ground truth from a motion capture system. We show that our approach is robust and accurate even under challenging conditions such as fast motion, significant and long-term occlusions, and time-varying biases. We release the dataset along with open-source code of our approach to allow for quantitative comparison with alternative approaches.
arXiv video code and dataset video PDF DOI BibTeX
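The bias-modeling idea can be illustrated with a minimal 1-DoF Kalman filter over an augmented state [angle, bias], where the encoder measures their sum. This is a simplified sketch, not the published implementation (in the paper, asynchronous depth-image updates are what make the bias observable); `kf_step` and its noise parameters are assumptions:

```python
import numpy as np

def kf_step(mean, cov, z, q_angle=1e-4, q_bias=1e-6, r=1e-3):
    """One Kalman step on an augmented state [joint_angle, bias].

    Simplified 1-DoF illustration: the encoder measures angle + bias,
    and the bias is modeled as a slowly drifting random walk.
    """
    A = np.eye(2)                      # angle and bias both random walks here
    Q = np.diag([q_angle, q_bias])     # bias drifts much more slowly
    H = np.array([[1.0, 1.0]])         # measurement = angle + bias
    # Predict
    mean = A @ mean
    cov = A @ cov @ A.T + Q
    # Update
    S = H @ cov @ H.T + r
    K = cov @ H.T / S
    mean = mean + (K * (z - H @ mean)).ravel()
    cov = (np.eye(2) - K @ H) @ cov
    return mean, cov
```

Observing only the sum cannot separate angle from bias on its own; a second, independent measurement of the angle (the depth images in the paper) is what disambiguates the two states.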

Autonomous Motion Intelligent Control Systems Conference Paper Robust Gaussian Filtering using a Pseudo Measurement Wüthrich, M., Garcia Cifuentes, C., Trimpe, S., Meier, F., Bohg, J., Issac, J., Schaal, S. In Proceedings of the American Control Conference (ACC), Boston, MA, USA, July 2016 (Published)
Most widely-used state estimation algorithms, such as the Extended Kalman Filter and the Unscented Kalman Filter, belong to the family of Gaussian Filters (GF). Unfortunately, GFs fail if the measurement process is modelled by a fat-tailed distribution. This is a severe limitation, because thin-tailed measurement models, such as the analytically-convenient and therefore widely-used Gaussian distribution, are sensitive to outliers. In this paper, we show that mapping the measurements into a specific feature space enables any existing GF algorithm to work with fat-tailed measurement models. We find a feature function which is optimal under certain conditions. Simulation results show that the proposed method allows for robust filtering in both linear and nonlinear systems with measurements contaminated by fat-tailed noise.
Web DOI URL BibTeX
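One way to picture the pseudo-measurement idea: under a two-component inlier/outlier model, a measurement far from the prediction is pulled back toward the prediction before the Gaussian update sees it. The blending rule below is an ad hoc illustration, not the optimal feature function derived in the paper:

```python
import numpy as np

def robust_pseudo_measurement(z, z_pred, r_inlier=0.1, r_outlier=100.0, w_outlier=0.1):
    """Map a raw measurement into a pseudo measurement that a plain
    Gaussian filter can digest.

    Ad hoc illustration: a two-component inlier/outlier mixture decides
    how much to trust z; likely outliers are shrunk toward z_pred.
    """
    def gauss(x, var):
        return np.exp(-0.5 * x**2 / var) / np.sqrt(2 * np.pi * var)
    innov = z - z_pred
    # Responsibility of the narrow inlier component vs. the broad outlier one
    p_in = (1 - w_outlier) * gauss(innov, r_inlier)
    p_out = w_outlier * gauss(innov, r_outlier)
    resp = p_in / (p_in + p_out)
    # Trust z when it looks like an inlier, else fall back to the prediction
    return resp * z + (1 - resp) * z_pred
```

Feeding the pseudo measurement into an unmodified Kalman update gives outlier robustness without any explicit rejection heuristic, which is the spirit of the paper's feature-space construction.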

Autonomous Motion Intelligent Control Systems Conference Paper Depth-based Object Tracking Using a Robust Gaussian Filter Issac, J., Wüthrich, M., Garcia Cifuentes, C., Bohg, J., Trimpe, S., Schaal, S. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA) 2016, IEEE, IEEE International Conference on Robotics and Automation, May 2016 (Published)
We consider the problem of model-based 3D tracking of objects given dense depth images as input. Two difficulties preclude the application of a standard Gaussian filter to this problem. First of all, depth sensors are characterized by fat-tailed measurement noise. To address this issue, we show how a recently published robustification method for Gaussian filters can be applied to the problem at hand. Thereby, we avoid using heuristic outlier detection methods that simply reject measurements if they do not match the model. Secondly, the computational cost of the standard Gaussian filter is prohibitive due to the high-dimensional measurement, i.e. the depth image. To address this problem, we propose an approximation to reduce the computational complexity of the filter. In quantitative experiments on real data we show how our method clearly outperforms the standard Gaussian filter. Furthermore, we compare its performance to a particle-filter-based tracking method, and observe comparable computational efficiency and improved accuracy and smoothness of the estimates.
Video Bayesian Object Tracking Library Bayesian Filtering Framework Object Tracking Dataset DOI URL BibTeX
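The cost problem with high-dimensional measurements can be sketched with a standard trick: if the measurement components (here, depth pixels) are treated as conditionally independent, a sequence of scalar Kalman updates avoids inverting a huge innovation covariance, making the cost linear in the number of pixels. This is a generic approximation, not necessarily the specific one proposed in the paper:

```python
import numpy as np

def sequential_scalar_updates(mean, cov, zs, hs, r=1e-2):
    """Process a high-dimensional measurement as a sequence of scalar
    Kalman updates.

    Generic trick: with conditionally independent scalar components,
    each update needs only a scalar division instead of inverting the
    full innovation covariance of the stacked measurement.
    """
    for z, h in zip(zs, hs):
        h = np.asarray(h, dtype=float)
        s = h @ cov @ h + r            # scalar innovation variance
        k = cov @ h / s                # Kalman gain, shape (state_dim,)
        mean = mean + k * (z - h @ mean)
        cov = cov - np.outer(k, h @ cov)
    return mean, cov
```

For a depth image with tens of thousands of pixels, this replaces one cubic-cost matrix inversion with that many cheap scalar steps.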

Autonomous Motion Conference Paper The Coordinate Particle Filter - A novel Particle Filter for High Dimensional Systems Wüthrich, M., Bohg, J., Kappler, D., Pfreundt, C., Schaal, S. In Proceedings of the IEEE International Conference on Robotics and Automation, May 2015
Parametric filters, such as the Extended Kalman Filter and the Unscented Kalman Filter, typically scale well with the dimensionality of the problem, but they are known to fail if the posterior state distribution cannot be closely approximated by a density of the assumed parametric form. For nonparametric filters, such as the Particle Filter, the converse holds. Such methods are able to approximate any posterior, but the computational requirements scale exponentially with the number of dimensions of the state space. In this paper, we present the Coordinate Particle Filter which alleviates this problem. We propose to compute the particle weights recursively, dimension by dimension. This allows us to explore one dimension at a time, and resample after each dimension if necessary. Experimental results on simulated as well as real data confirm that the proposed method has a substantial performance advantage over the Particle Filter in high-dimensional systems where not all dimensions are highly correlated. We demonstrate the benefits of the proposed method for the problem of multi-object and robotic manipulator tracking.
arXiv Video Bayesian Filtering Framework Bayesian Object Tracking DOI BibTeX
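The dimension-by-dimension weighting can be sketched on a toy problem. The sketch assumes the likelihood factorizes over state dimensions, p(z|x) = prod_d p(z_d|x_d), so each coordinate can be weighted and resampled independently; the paper's recursive-weight construction is more general than this:

```python
import numpy as np

rng = np.random.default_rng(0)

def coordinate_pf_step(particles, z, sigma_proc=0.1, sigma_meas=0.2):
    """One CPF-style update: propagate, weight, and resample one
    state dimension at a time.

    Toy sketch under a factorized-likelihood assumption; resampling
    after each dimension keeps the particle set focused before the
    next dimension is explored.
    """
    n, dim = particles.shape
    for d in range(dim):
        particles = particles.copy()
        # Propagate only coordinate d with process noise
        particles[:, d] += rng.normal(0.0, sigma_proc, size=n)
        # Weight particles by the d-th measurement component only
        w = np.exp(-0.5 * ((z[d] - particles[:, d]) / sigma_meas) ** 2)
        w /= w.sum()
        # Resample before exploring the next dimension
        particles = particles[rng.choice(n, size=n, p=w)]
    return particles
```

Because each resampling step conditions on only one dimension, the particle count needed does not blow up exponentially with the state dimension, which is the advantage the abstract claims for weakly correlated dimensions.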

Autonomous Motion Conference Paper Probabilistic Object Tracking Using a Range Camera Wüthrich, M., Pastor, P., Kalakrishnan, M., Bohg, J., Schaal, S. In IEEE/RSJ International Conference on Intelligent Robots and Systems, 3195-3202, IEEE, November 2013
We address the problem of tracking the 6-DoF pose of an object while it is being manipulated by a human or a robot. We use a dynamic Bayesian network to perform inference and compute a posterior distribution over the current object pose. Depending on whether a robot or a human manipulates the object, we employ a process model with or without knowledge of control inputs. Observations are obtained from a range camera. As opposed to previous object tracking methods, we explicitly model self-occlusions and occlusions from the environment, e.g., the human or robotic hand. This leads to a strongly non-linear observation model and additional dependencies in the Bayesian network. We employ a Rao-Blackwellised particle filter to compute an estimate of the object pose at every time step. In a set of experiments, we demonstrate the ability of our method to accurately and robustly track the object pose in real-time while it is being manipulated by a human or a robot.
arXiv Video Code Video DOI BibTeX
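The explicit occlusion modeling can be illustrated with a per-pixel depth likelihood that mixes a "visible" Gaussian around the model-predicted depth with a uniform "occluded" component over depths in front of the object. The parameters and the uniform occluder model are illustrative, not the paper's exact formulation:

```python
import numpy as np

def depth_pixel_likelihood(z, z_model, sigma=0.02, p_occluded=0.3):
    """Per-pixel depth likelihood with explicit occlusion handling.

    Illustrative mixture: if the pixel is visible, the measurement is
    Gaussian around the depth rendered from the object model; if an
    occluder intervenes, any depth in front of the object is equally
    likely. Measurements behind the object are explained by neither.
    """
    # Visible component: Gaussian around the model-predicted depth
    p_vis = np.exp(-0.5 * ((z - z_model) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))
    # Occluded component: uniform over depths in front of the object
    p_occ = (1.0 / z_model) if z < z_model else 0.0
    return (1 - p_occluded) * p_vis + p_occluded * p_occ
```

Under this mixture, a hand passing in front of the object produces measurements that are still reasonably likely, so the tracker is not dragged off the true pose by the occluder.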