Perzeptive Systeme PS:License 1.0 2020-08-25

GRAB: A Dataset of Whole-Body Human Grasping of Objects


Training computers to understand, model, and synthesize human grasping requires a rich dataset containing complex 3D object shapes, detailed contact information, hand pose and shape, and 3D body motion over time. While "grasping" is commonly thought of as a single hand stably lifting an object, we capture the motion of the entire body and adopt the generalized notion of "whole-body grasps". To this end, we collect a new dataset, called GRAB (GRasping Actions with Bodies), of whole-body grasps, containing full 3D shape and pose sequences of 10 subjects interacting with 51 everyday objects of varying shape and size. The dataset contains 1,622,459 frames in total. Each frame includes (1) an expressive 3D SMPL-X human mesh (shaped and posed), (2) a 3D rigid object mesh (posed), and (3) contact annotations (wherever applicable).
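The per-frame annotations above can be pictured as a simple data layout. The sketch below is purely illustrative: the class name, field names, and array sizes are assumptions for this example, not the dataset's actual file schema (see the repository for the real loading code).

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class GrabFrame:
    # Hypothetical container mirroring the per-frame annotations described
    # above; field names and dimensions are illustrative only.
    body_pose: np.ndarray         # SMPL-X pose parameters for the whole body
    body_shape: np.ndarray        # SMPL-X shape (beta) parameters
    object_transform: np.ndarray  # 4x4 rigid transform posing the object mesh
    contact: np.ndarray           # per-vertex contact annotations (0 = none)

def frames_in_contact(frames):
    """Return indices of frames where any vertex is annotated as in contact."""
    return [i for i, f in enumerate(frames) if np.any(f.contact > 0)]

# Toy three-frame sequence; only the middle frame has contact.
seq = [
    GrabFrame(np.zeros(165), np.zeros(10), np.eye(4), np.zeros(100)),
    GrabFrame(np.zeros(165), np.zeros(10), np.eye(4), np.ones(100)),
    GrabFrame(np.zeros(165), np.zeros(10), np.eye(4), np.zeros(100)),
]
print(frames_in_contact(seq))  # [1]
```

Filtering on the contact annotations like this is one common use of the dataset, e.g. to isolate the grasp phase of each interaction sequence.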

Release Date: 25 August 2020
License Type: PS:License 1.0
Authors: Omid Taheri, Nima Ghorbani, Michael J. Black, Dimitrios Tzionas
Maintainers: Omid Taheri
Link (URL): https://grab.is.tue.mpg.de
Repository: https://github.com/otaheri/GRAB
Project(s):