Journal articles

ClothCap: Seamless 4D Clothing Capture and Retargeting

Abstract: Designing and simulating realistic clothing is challenging. Previous methods addressing the capture of clothing from 3D scans have been limited to single garments and simple motions, lack detail, or require specialized texture patterns. Here we address the problem of capturing regular clothing on fully dressed people in motion. People typically wear multiple pieces of clothing at a time. To estimate the shape of such clothing, track it over time, and render it believably, each garment must be segmented from the others and the body. Our ClothCap approach uses a new multi-part 3D model of clothed bodies, automatically segments each piece of clothing, estimates the minimally clothed body shape and pose under the clothing, and tracks the 3D deformations of the clothing over time. We estimate the garments and their motion from 4D scans; that is, high-resolution 3D scans of the subject in motion at 60 fps. ClothCap is able to capture a clothed person in motion, extract their clothing, and retarget the clothing to new body shapes; this provides a step towards virtual try-on.
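For orientation, the sketch below is a toy, hypothetical rendering of the data flow the abstract describes: segment each garment from the body, estimate the underlying body, extract the garments, and retarget them to a new body shape. All function names, the nearest-neighbour labelling, and the centroid-offset retargeting are illustrative assumptions, not the method of the paper, which fits a multi-part clothed-body model to the 4D scans.

"""Toy, hypothetical sketch of a ClothCap-style capture pipeline.

The naive nearest-neighbour segmentation and centroid-offset retargeting
below are placeholders for the paper's model-based approach.
"""
import numpy as np

LABELS = ("skin", "shirt", "pants")  # assumed garment parts


def segment_frame(scan_pts, template_pts, template_labels):
    """Label each scan point with the label of its nearest template point."""
    d = np.linalg.norm(scan_pts[:, None, :] - template_pts[None, :, :], axis=-1)
    return template_labels[np.argmin(d, axis=1)]


def estimate_body(scan_pts, labels):
    """Crude stand-in for minimally clothed body estimation:
    keep the points labelled 'skin' and centre them on their centroid."""
    body = scan_pts[labels == 0]
    return body - body.mean(axis=0)


def extract_garment(scan_pts, labels, part):
    """Pull out the point cloud of one garment for tracking or retargeting."""
    return scan_pts[labels == part]


def retarget(garment_pts, src_body_pts, dst_body_pts):
    """Toy retargeting: translate the garment by the offset between the two
    body centroids (the paper instead transfers clothing deformations on its
    multi-part clothed-body model)."""
    return garment_pts + (dst_body_pts.mean(axis=0) - src_body_pts.mean(axis=0))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = rng.normal(size=(500, 3))
    template_labels = rng.integers(0, len(LABELS), size=500)
    sequence = [rng.normal(size=(2000, 3)) for _ in range(3)]  # fake 4D scan frames

    for frame in sequence:
        labels = segment_frame(frame, template, template_labels)
        body = estimate_body(frame, labels)
        shirt = extract_garment(frame, labels, part=1)
        new_body = body * 1.1  # pretend target body shape
        retargeted = retarget(shirt, body, new_body)
        print(len(retargeted), "shirt points retargeted")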
Document type: Journal articles

Cited literature: 63 references


https://hal.inria.fr/hal-02162166
Contributor: Sergi Pujades
Submitted on: Friday, June 21, 2019 - 3:16:10 PM
Last modification on: Wednesday, January 6, 2021 - 4:10:05 PM

Files

clothcap.pdf
Files produced by the author(s)

Identifiers

HAL Id: hal-02162166
DOI: 10.1145/3072959.3073711

Citation

Gerard Pons-Moll, Sergi Pujades, Sonny Hu, Michael Black. ClothCap: Seamless 4D Clothing Capture and Retargeting. ACM Transactions on Graphics, Association for Computing Machinery, 2017, 36 (4), pp.1-15. ⟨10.1145/3072959.3073711⟩. ⟨hal-02162166⟩
