

Members

Perceiving Systems
Emeritus Scientific Member / Acting Director
Perceiving Systems
Guest Scientist
Perceiving Systems
Affiliated Researcher
Perceiving Systems
Affiliated Researcher
Movement Generation and Control
Visiting Researcher

Publications

Perceiving Systems Movement Generation and Control Article Robust Physics-based Motion Retargeting with Realistic Body Shapes Borno, M. A., Righetti, L., Black, M. J., Delp, S. L., Fiume, E., Romero, J. Computer Graphics Forum, 37:6:1-12, July 2018
Motion capture is often retargeted to new, and sometimes drastically different, characters. When the characters take on realistic human shapes, however, we become more sensitive to the motion looking right. This means adapting it to be consistent with the physical constraints imposed by different body shapes. We show how to take realistic 3D human shapes, approximate them using a simplified representation, and animate them so that they move realistically using physically-based retargeting. We develop a novel spacetime optimization approach that learns and robustly adapts physical controllers to new bodies and constraints. The approach automatically adapts the motion of the mocap subject to the body shape of a target subject. This motion respects the physical properties of the new body and every body shape results in a different and appropriate movement. This makes it easy to create a varied set of motions from a single mocap sequence by simply varying the characters. In an interactive environment, successful retargeting requires adapting the motion to unexpected external forces. We achieve robustness to such forces using a novel LQR-tree formulation. We show that the simulated motions look appropriate to each character’s anatomy and their actions are robust to perturbations.
pdf video BibTeX
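The abstract above mentions achieving robustness to external forces via an LQR-tree formulation. As a hedged illustration of the LQR building block only (not the paper's actual controller, bodies, or cost terms), the sketch below stabilizes a discrete-time double integrator: the Riccati fixed-point iteration yields a feedback gain, and the closed loop drives a perturbed state back toward the target. All system matrices and cost weights here are assumptions chosen for illustration.

```python
import numpy as np

# Illustrative LQR on a double integrator (position, velocity); the
# dynamics, costs, and time step are assumptions, not the paper's model.
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])      # state transition with dt = 0.1
B = np.array([[0.0],
              [0.1]])           # control enters through acceleration
Q = np.eye(2)                   # state cost
R = np.array([[0.01]])          # control cost

# Solve the discrete algebraic Riccati equation by fixed-point iteration.
P = Q.copy()
for _ in range(500):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

# Simulate from a perturbed initial state; feedback u = -Kx rejects it.
x = np.array([[1.0], [0.0]])
for _ in range(200):
    u = -K @ x
    x = A @ x + B @ u

residual = float(np.linalg.norm(x))
print(residual)                 # small: the perturbation is rejected
```

An LQR tree, roughly, stitches many such locally valid controllers together so that larger perturbations are funneled back to the nominal motion; the sketch shows only a single local regulator.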

Perceiving Systems Article Data-Driven Physics for Human Soft Tissue Animation Kim, M., Pons-Moll, G., Pujades, S., Bang, S., Kim, J., Black, M. J., Lee, S. ACM Transactions on Graphics, (Proc. SIGGRAPH), 36(4):54:1-54:12, 2017
Data-driven models of human poses and soft-tissue deformations can produce very realistic results, but they only model the visible surface of the human body and cannot create skin deformation due to interactions with the environment. Physical simulations can generalize to external forces, but their parameters are difficult to control. In this paper, we present a layered volumetric human body model learned from data. Our model is composed of a data-driven inner layer and a physics-based external layer. The inner layer is driven with a volumetric statistical body model (VSMPL). The soft-tissue layer consists of a tetrahedral mesh that is driven using the finite element method (FEM). Model parameters, namely the segmentation of the body into layers and the soft-tissue elasticity, are learned directly from 4D registrations of humans exhibiting soft-tissue deformations. The learned two-layer model is a realistic full-body avatar that generalizes to novel motions and external forces. Experiments show that the resulting avatars produce realistic results on held-out sequences and react to external forces. Moreover, the model supports the retargeting of physical properties from one avatar to another when they share the same topology.
video paper URL BibTeX
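The layered idea above, a kinematically driven inner layer with a physically simulated outer layer, can be illustrated in one dimension. This is a conceptual sketch only, not the paper's VSMPL inner layer or FEM tetrahedral mesh: a single soft-tissue point mass is attached by an assumed spring-damper to a scripted inner-layer point, so the outer layer lags the inner motion and would also respond to any external force added to it.

```python
import numpy as np

# Conceptual 1D stand-in for a two-layer body model. The stiffness k,
# damping c, mass m, and time step dt are illustrative assumptions.
k, c, m, dt = 50.0, 4.0, 1.0, 0.01

def inner(t):
    """Scripted inner-layer trajectory (stands in for the skeleton-driven
    statistical body layer)."""
    return np.sin(t)

x, v = 0.0, 0.0          # soft-tissue position and velocity
external_force = 0.0     # hook for environment interaction

positions = []
for step in range(1000):
    t = step * dt
    # Spring-damper coupling to the inner layer plus any external force.
    f = k * (inner(t) - x) - c * v + external_force
    v += dt * f / m      # semi-implicit Euler integration
    x += dt * v
    positions.append(x)

peak = max(positions)
print(peak)              # outer layer roughly tracks the inner sinusoid
```

In the paper the analogous coupling parameters (layer segmentation and tissue elasticity) are not hand-tuned as above but learned from 4D registrations.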