Reconstructing Simulation-Ready Clothing with Photo-Realistic Appearance from Multi-View Video
A natural coupling of 3D meshes and 3D Gaussian splatting allows Gaussian Garments to represent both overall geometry and high-frequency details of human clothing.
The reconstructed garments can then be retargeted to different human models, resized to fit new body shapes, and simulated over bodies performing novel motions.
Our approach also enables the automatic construction of complex multi-layer outfits from a set of separately captured Gaussian garments.
We introduce Gaussian Garments, a novel approach for reconstructing realistic-looking, simulation-ready garment assets from multi-view videos. Our method represents garments with a combination of a 3D mesh and a Gaussian texture that encodes both color and high-frequency surface details. This representation enables accurate registration of garment geometries to multi-view videos and helps disentangle albedo textures from lighting effects. Furthermore, we demonstrate how a pre-trained Graph Neural Network (GNN) can be fine-tuned to replicate the real-world behavior of each garment. The reconstructed Gaussian Garments can be automatically combined into multi-garment outfits and animated with the fine-tuned GNN.
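For intuition, the garment asset described above can be pictured as a base mesh paired with texture maps of 3D Gaussian parameters. The Python sketch below is illustrative only; every field name, shape, and default value is an assumption for exposition, not the paper's actual data layout or code.

```python
from dataclasses import dataclass
import numpy as np


@dataclass
class GaussianGarment:
    """Hypothetical container for a mesh-plus-Gaussian-texture garment asset."""
    # Base mesh: the overall garment geometry (names are illustrative).
    vertices: np.ndarray   # (V, 3) rest-pose vertex positions
    faces: np.ndarray      # (F, 3) triangle indices
    uvs: np.ndarray        # (V, 2) UV coordinates into the Gaussian texture

    # "Gaussian texture": per-texel 3D Gaussian attributes encoding albedo
    # color and high-frequency surface detail on top of the mesh surface.
    offsets: np.ndarray    # (H, W, 3) displacement of each Gaussian from the surface
    rotations: np.ndarray  # (H, W, 4) orientation as unit quaternions
    scales: np.ndarray     # (H, W, 3) anisotropic scale of each Gaussian
    opacities: np.ndarray  # (H, W, 1) per-Gaussian opacity
    albedo: np.ndarray     # (H, W, 3) view-independent color, disentangled from lighting


def make_empty_garment(num_vertices: int, num_faces: int, tex_res: int = 256) -> GaussianGarment:
    """Allocate a placeholder asset; real values would come from multi-view registration."""
    return GaussianGarment(
        vertices=np.zeros((num_vertices, 3), dtype=np.float32),
        faces=np.zeros((num_faces, 3), dtype=np.int64),
        uvs=np.zeros((num_vertices, 2), dtype=np.float32),
        offsets=np.zeros((tex_res, tex_res, 3), dtype=np.float32),
        rotations=np.tile(np.array([1, 0, 0, 0], dtype=np.float32), (tex_res, tex_res, 1)),
        scales=np.full((tex_res, tex_res, 3), 1e-3, dtype=np.float32),
        opacities=np.ones((tex_res, tex_res, 1), dtype=np.float32),
        albedo=np.full((tex_res, tex_res, 3), 0.5, dtype=np.float32),
    )
```

In this reading, the mesh supplies the simulation-ready geometry that a GNN-based simulator can animate, while the Gaussian texture supplies the photo-realistic appearance used for rendering.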