
AMASS Dataset


AMASS is a large dataset of human motion - 45 hours and growing. It unifies multiple optical marker-based mocap datasets by fitting the SMPL body model to the mocap markers, making it possible to train deep neural networks to model human motion. The dataset includes SMPL-H body shape and pose parameters as well as DMPL soft-tissue dynamics. If you would like your own mocap sequences included in the dataset, please contact us. The release includes tutorial code for training DNNs with AMASS, and the MoSh++ code is now available. We also release SOMA, our complementary tool for automatic mocap labeling.
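For orientation, below is a minimal sketch of how one might inspect a downloaded AMASS sequence in Python. The file path is a placeholder, and the array key names ('poses', 'betas', 'dmpls', 'trans', 'mocap_framerate') are assumptions based on the released .npz format; see the tutorials in the repository for the authoritative loading code.

import numpy as np

# Hypothetical path to one AMASS sequence downloaded from https://amass.is.tue.mpg.de/
bdata = np.load("path/to/amass_sequence.npz")

print("keys:", list(bdata.keys()))
print("frame rate:", bdata["mocap_framerate"])   # mocap capture rate in Hz
print("poses:", bdata["poses"].shape)            # per-frame SMPL-H pose parameters (axis-angle)
print("betas:", bdata["betas"].shape)            # SMPL-H body shape coefficients
print("dmpls:", bdata["dmpls"].shape)            # DMPL soft-tissue dynamics coefficients
print("trans:", bdata["trans"].shape)            # per-frame global translation of the body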

Release Date: 13 September 2019
License type: PS:License 1.0
Authors: Naureen Mahmood, Nima Ghorbani, Nikolaus F. Troje, Gerard Pons-Moll, Michael J. Black
Maintainers: Nima Ghorbani
Link (URL): https://amass.is.tue.mpg.de/
Repository: https://github.com/nghorbani/amass