
We describe a complete system for animating realistic clothing on synthetic bodies of any shape and pose without manual intervention. The key component of the method is a model of clothing called DRAPE (DRessing Any PErson) that is learned from a physics-based simulation of clothing on bodies of different shapes and poses. The DRAPE model has the desirable property of "factoring" clothing deformations due to body shape from those due to pose variation. This factorization provides an approximation to the physical clothing deformation and greatly simplifies clothing synthesis. Given a parameterized model of the human body with known shape and pose parameters, we describe an algorithm that dresses the body with a garment that is customized to fit and possesses realistic wrinkles. DRAPE can be used to dress static bodies or animated sequences with a learned model of the cloth dynamics. Since the method is fully automated, it is appropriate for dressing large numbers of virtual characters of varying shape. The method is significantly more efficient than physical simulation.
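The "factoring" idea above can be illustrated with a minimal sketch. This is not the authors' implementation — all names, dimensions, and the purely additive combination are illustrative assumptions — but it shows the structural benefit: a garment is synthesized as a template plus a shape-driven term and a pose-driven term, each a linear basis that can be learned and evaluated independently.

```python
import numpy as np

# Hypothetical dimensions: V garment vertices, S shape coefficients, P pose features.
V, S, P = 500, 10, 30

rng = np.random.default_rng(0)
template = rng.normal(size=(V, 3))        # mean garment geometry (stand-in data)
shape_basis = rng.normal(size=(S, V, 3))  # deformations driven by body shape
pose_basis = rng.normal(size=(P, V, 3))   # deformations (wrinkles) driven by pose

def drape_garment(beta, theta):
    """Factored synthesis: shape and pose contributions are computed
    separately and combined additively, so neither term needs to be
    re-learned when the other varies."""
    shape_term = np.tensordot(beta, shape_basis, axes=1)   # (V, 3)
    pose_term = np.tensordot(theta, pose_basis, axes=1)    # (V, 3)
    return template + shape_term + pose_term

beta = rng.normal(size=S)    # body-shape coefficients
theta = rng.normal(size=P)   # pose features
verts = drape_garment(beta, theta)
```

Because the two terms are decoupled, dressing a new body shape in a new pose is a pair of matrix products rather than a physics simulation — the source of the efficiency gain the abstract mentions.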

Author(s): Guan, P. and Reiss, L. and Hirshberg, D. and Weiss, A. and Black, M. J.
Journal: ACM Trans. on Graphics (Proc. SIGGRAPH)
Volume: 31
Number (issue): 4
Pages: 35:1--35:10
Year: 2012
Month: July
Bibtex Type: Article (article)

BibTex

@article{DRAPE2012,
  title = {{DRAPE: DRessing Any PErson}},
  journal = {ACM Trans. on Graphics (Proc. SIGGRAPH)},
  abstract = {We describe a complete system for animating realistic clothing on synthetic bodies of any shape and pose without manual intervention. The key component of the method is a model of clothing called DRAPE (DRessing Any PErson) that is learned from a physics-based simulation of clothing on bodies of different shapes and poses. The DRAPE model has the desirable property of "factoring" clothing deformations due to body shape from those due to pose variation. This factorization provides an approximation to the physical clothing deformation and greatly simplifies clothing synthesis. Given a parameterized model of the human body with known shape and pose parameters, we describe an algorithm that dresses the body with a garment that is customized to fit and possesses realistic wrinkles. DRAPE can be used to dress static bodies or animated sequences with a learned model of the cloth dynamics. Since the method is fully automated, it is appropriate for dressing large numbers of virtual characters of varying shape. The method is significantly more efficient than physical simulation.},
  volume = {31},
  number = {4},
  pages = {35:1--35:10},
  month = jul,
  year = {2012},
  slug = {drape2012},
  author = {Guan, P. and Reiss, L. and Hirshberg, D. and Weiss, A. and Black, M. J.},
  month_numeric = {7}
}