Learning to Grasp from Big Data

Data-driven approaches to grasping address the challenges of grasp synthesis that arise in the real world, such as noisy sensors and incomplete information about the objects and the environment. They focus on finding a representation of the perceptual data that makes it possible to predict whether a given grasp will succeed.
Recent advances in many research areas, e.g. computer vision and speech recognition, that are based on supervised, data-driven methods are largely due to the increase in computational capacity and available data. This work provides the first labeled dataset for grasp prediction, enabling state-of-the-art data-driven learning techniques, such as deep learning and random forests, to be applied to grasp prediction for known and unknown objects.
Ideally, labeled datasets for grasp prediction would be obtained from robot experiments. However, this procedure requires supervision and is bounded by the run-time of the robot experiments, making it infeasible for collecting large datasets. We instead use simulation to generate both feasible and infeasible grasps, relying on physics simulation to narrow the simulation-reality gap. Using crowdsourcing, we empirically verify that the proposed grasp quality metric is consistent with human judgment.
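The sketch below illustrates, schematically, the generate-and-label loop described above; the simulator call and the displacement threshold are hypothetical toy stand-ins, not the actual simulation pipeline or physics-metric, and only the overall control flow is shown.

```python
"""
Schematic sketch of the simulation-based labeling loop described above.
simulate_grasp and DISPLACEMENT_THRESHOLD are hypothetical toy stand-ins,
not the actual physics pipeline or metric used in this work; only the
overall control flow (simulate, measure, threshold, label) is illustrated.
"""
import random

# Assumed threshold on how far the object may move relative to the hand (meters).
DISPLACEMENT_THRESHOLD = 0.02


def simulate_grasp(object_id, grasp_pose):
    """Toy stand-in: a real implementation would run the physics simulation of
    grasping and shaking the object, and return the resulting object displacement."""
    return random.uniform(0.0, 0.1)


def label_grasps(object_id, grasp_poses):
    """Label each candidate grasp as stable or unstable from the simulated displacement."""
    labeled = []
    for pose in grasp_poses:
        displacement = simulate_grasp(object_id, pose)
        labeled.append((pose, displacement < DISPLACEMENT_THRESHOLD))
    return labeled


if __name__ == "__main__":
    candidates = [(0.00, 0.0, 0.10, 0.0, 0.00, 0.0),
                  (0.05, 0.0, 0.10, 0.0, 1.57, 0.0)]  # toy 6-DoF grasp poses
    for pose, stable in label_grasps("toy_mug", candidates):
        print(pose, "stable" if stable else "unstable")
```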
This dataset can be used to bootstrap data-driven techniques so that refinement for real-world grasping requires only a small number of robot experiments. First empirical results demonstrate that learning techniques with low capacity (linear methods) cannot cope with the variety of objects and are therefore unsuitable for general-purpose grasp prediction.
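To make the capacity comparison concrete, here is a minimal sketch, using scikit-learn, of training a linear classifier and a random forest for grasp-success prediction. The features and labels are synthetic placeholders rather than data from the database, and the printed accuracies only illustrate the behavior of low- versus high-capacity models on a nonlinear toy problem, not results from this work.

```python
"""
Hedged sketch: grasp-success classification with a low-capacity linear model
versus a random forest. Features and labels are synthetic placeholders.
"""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder features: e.g. flattened local shape patches around the grasp.
X = rng.normal(size=(5000, 64))
# Placeholder stability labels with a nonlinear dependence on the features.
y = (np.sin(3.0 * X[:, 0]) + X[:, 1] * X[:, 2] > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

linear = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

print("linear accuracy:", linear.score(X_te, y_te))
print("random forest accuracy:", forest.score(X_te, y_te))
```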
In future work we want to analyze and empirically evaluate the data generation process itself, further closing the simulation-reality gap. The database will be extended with more objects and real robot experiments, including additional data cues (e.g. tactile information), so that predictive models for the actual grasp execution can be learned as well.
Database
We have released this large-scale robotic grasping database to the research community. It is freely available at http://grasp-database.dkappler.de.
It provides grasps applied to more than 700 distinct objects from over 80 categories. These grasps are generated in simulation and evaluated with the standard epsilon-metric and a new physics-metric. In crowdsourcing experiments, we have confirmed that the proposed physics-metric is a more consistent predictor of grasp success than the epsilon-metric.
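For reference, the standard epsilon-metric (Ferrari-Canny) is the radius of the largest origin-centered ball contained in the convex hull of the contact wrenches. Below is a minimal sketch of that computation under assumed settings (unit contact forces, a linearized friction cone with mu = 0.5, SciPy's convex hull); it is not the simulation pipeline used to build the database.

```python
"""
Hedged sketch: Ferrari-Canny style epsilon quality for a set of contacts.
Assumes unit contact forces and a linearized friction cone; the contact
points and normals in the example are made-up toy values.
"""
import numpy as np
from scipy.spatial import ConvexHull


def friction_cone_edges(normal, mu=0.5, num_edges=8):
    """Approximate the friction cone at a contact by num_edges unit force directions."""
    normal = normal / np.linalg.norm(normal)
    # Build an arbitrary tangent basis orthogonal to the contact normal.
    tmp = np.array([1.0, 0.0, 0.0]) if abs(normal[0]) < 0.9 else np.array([0.0, 1.0, 0.0])
    t1 = np.cross(normal, tmp)
    t1 /= np.linalg.norm(t1)
    t2 = np.cross(normal, t1)
    angles = np.linspace(0.0, 2.0 * np.pi, num_edges, endpoint=False)
    edges = [normal + mu * (np.cos(a) * t1 + np.sin(a) * t2) for a in angles]
    return [e / np.linalg.norm(e) for e in edges]


def epsilon_quality(points, normals, mu=0.5):
    """Radius of the largest origin-centered ball inside the grasp wrench space hull."""
    wrenches = []
    for p, n in zip(points, normals):
        for f in friction_cone_edges(n, mu):
            wrenches.append(np.hstack([f, np.cross(p, f)]))  # [force, torque]
    hull = ConvexHull(np.array(wrenches))
    # hull.equations rows are [facet normal, offset] with normal . x + offset <= 0 inside.
    offsets = hull.equations[:, -1]
    if np.any(offsets > 0):          # origin lies outside the hull: no force closure
        return 0.0
    return float(np.min(-offsets))   # distance from the origin to the nearest facet


if __name__ == "__main__":
    # Toy example: three contacts on a unit sphere, 120 degrees apart on the equator,
    # with normals pointing toward the sphere center.
    angles = [0.0, 2.0 * np.pi / 3.0, 4.0 * np.pi / 3.0]
    pts = [np.array([np.cos(a), np.sin(a), 0.0]) for a in angles]
    nrm = [-p for p in pts]
    print("epsilon quality:", epsilon_quality(pts, nrm))
```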
In total, the database provides around 500,000 grasps, each annotated with stability labels from these different metrics. Additionally, we simulate noisy and incomplete perception of the objects from different viewpoints using a realistic model of an RGB-D camera, which allows us to link representations of local object shape to each grasp.
The database is therefore well suited for learning how to grasp with techniques that can leverage big data.
Features include:
- Docker Container for easy installation of all dependencies
- Python Interface
- Data Visualization tool
- Efficient HDF5 database format (see the loading sketch after this list)
- Catkin Workspace
- Extensibility (Objects, Hands, Feature Representations)
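A minimal sketch of reading the HDF5 database with h5py is shown below; the file path and the group/dataset names ("grasps", "pose", "stability", "local_shape") are hypothetical placeholders, so consult the released Python interface and documentation for the actual schema.

```python
"""
Hedged sketch: iterating over grasps stored in an HDF5 file with h5py.
All paths and dataset names below are hypothetical placeholders, not the
actual schema of the released database.
"""
import h5py

with h5py.File("grasp_database.h5", "r") as f:       # placeholder file path
    grasps = f["grasps"]                              # hypothetical top-level group
    for name in list(grasps)[:10]:                    # look at the first few entries
        g = grasps[name]
        pose = g["pose"][...]                         # hypothetical 6-DoF grasp pose
        label = g["stability"][()]                    # hypothetical stability label
        shape = g["local_shape"][...]                 # hypothetical local shape patch
        print(name, pose.shape, label, shape.shape)
```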