Perceiving Systems Conference Paper 2025

ETCH: Generalizing Body Fitting to Clothed Humans via Equivariant Tightness


Fitting a body to a 3D clothed human point cloud is a common yet challenging task. Traditional optimization-based approaches use multi-stage pipelines that are sensitive to pose initialization, while recent learning-based methods often struggle with generalization across diverse poses and garment types. We propose Equivariant Tightness Fitting for Clothed Humans, or ETCH, a novel pipeline that estimates cloth-to-body surface mapping through locally approximate SE(3) equivariance, encoding tightness as displacement vectors from the cloth surface to the underlying body. Following this mapping, pose-invariant body features regress sparse body markers, simplifying clothed human fitting into an inner-body marker fitting task. Extensive experiments on CAPE and 4D-Dress show that ETCH significantly outperforms state-of-the-art methods -- both tightness-agnostic and tightness-aware -- in body fitting accuracy on loose clothing (16.7% ~ 69.5%) and shape accuracy (average 49.9%). Our equivariant tightness design can even reduce directional errors by 67.2% ~ 89.8% in one-shot (or out-of-distribution) settings (~1% of data). Qualitative results demonstrate strong generalization of ETCH across challenging poses, unseen shapes, loose clothing, and non-rigid dynamics.
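The core idea in the abstract -- displace each cloth point along a predicted tightness vector to reach the underlying body, then regress sparse markers from the displaced points -- can be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the tightness directions, magnitudes, and marker-assignment weights are random stand-ins for quantities the ETCH network would predict.

```python
import numpy as np

# Hypothetical sketch of the cloth-to-body mapping described in the abstract.
# All predicted quantities (directions, magnitudes, assignment logits) are
# random placeholders here; in ETCH they come from a learned network.
rng = np.random.default_rng(0)
num_points, num_markers = 1024, 16

cloth_points = rng.normal(size=(num_points, 3))            # scanned cloth surface
directions = rng.normal(size=(num_points, 3))              # predicted inward directions
directions /= np.linalg.norm(directions, axis=1, keepdims=True)
magnitudes = rng.uniform(0.0, 0.05, size=(num_points, 1))  # predicted cloth-to-body offsets

# Step 1: "undress" the scan -- displace each cloth point onto the
# estimated body surface using its tightness vector.
body_points = cloth_points - magnitudes * directions

# Step 2: aggregate the inner-body points into sparse markers via a
# per-point soft assignment (softmax over marker logits).
logits = rng.normal(size=(num_points, num_markers))
weights = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)
markers = weights.T @ body_points                          # (num_markers, 3)

print(markers.shape)
```

Fitting a parametric body model to these sparse markers is then a much better-conditioned problem than fitting it to the raw clothed scan, which is the simplification the paper exploits.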

Author(s): Li, Boqian and Feng, Haiwen and Cai, Zeyu and Black, Michael J. and Xiu, Yuliang
Book Title: Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)
Year: 2025
Month: October
BibTeX Type: Conference Paper (inproceedings)
State: Published

BibTeX

@inproceedings{li2025etch,
  title = {{ETCH}: Generalizing Body Fitting to Clothed Humans via Equivariant Tightness},
  booktitle = {Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV)},
  abstract = {Fitting a body to a 3D clothed human point cloud is a common yet challenging task. Traditional optimization-based approaches use multi-stage pipelines that are sensitive to pose initialization, while recent learning-based methods often struggle with generalization across diverse poses and garment types. We propose Equivariant Tightness Fitting for Clothed Humans, or ETCH, a novel pipeline that estimates cloth-to-body surface mapping through locally approximate SE(3) equivariance, encoding tightness as displacement vectors from the cloth surface to the underlying body. Following this mapping, pose-invariant body features regress sparse body markers, simplifying clothed human fitting into an inner-body marker fitting task. Extensive experiments on CAPE and 4D-Dress show that ETCH significantly outperforms state-of-the-art methods -- both tightness-agnostic and tightness-aware -- in body fitting accuracy on loose clothing (16.7% ~ 69.5%) and shape accuracy (average 49.9%). Our equivariant tightness design can even reduce directional errors by 67.2% ~ 89.8% in one-shot (or out-of-distribution) settings (~1% of data). Qualitative results demonstrate strong generalization of ETCH across challenging poses, unseen shapes, loose clothing, and non-rigid dynamics.},
  month = oct,
  year = {2025},
  author = {Li, Boqian and Feng, Haiwen and Cai, Zeyu and Black, Michael J. and Xiu, Yuliang}
}