3D Body Shapes Estimation from Dressed‐Human Silhouettes
Author(s) -
Song Dan,
Tong Ruofeng,
Chang Jian,
Yang Xiaosong,
Tang Min,
Zhang Jian Jun
Publication year - 2016
Publication title -
Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/cgf.13012
Subject(s) - body shape , human body , computer science , computer vision , artificial intelligence , construct (python library) , human body model , parametric statistics , process (computing) , pose , geometric primitive , clothing , motion capture , computer graphics (images) , mathematics , geography , motion (physics) , statistics , archaeology , programming language , operating system
Estimation of 3D body shapes from dressed‐human photos is an important but challenging problem in virtual fitting. We propose a novel automatic framework to efficiently estimate 3D body shapes under clothes. We construct a database of 3D naked and dressed body pairs, from which we learn to automatically predict the 3D positions of body landmarks (which in turn constrain a parametric human body model) from dressed‐human silhouettes. Critical vertices on 3D registered human bodies are selected as landmarks to represent body shapes, avoiding the time‐consuming vertex‐correspondence search otherwise required for parametric body reconstruction. Our method estimates 3D body shapes from dressed‐human silhouettes within 4 seconds, whereas the fastest previously reported method needs 1 minute. In addition, our estimation error is within the size tolerance of the clothing industry. We dress 6042 naked bodies with 3 sets of common clothes using physically based cloth simulation. To the best of our knowledge, we are the first to construct such a database of 3D naked and dressed body pairs, and it may benefit research on human body shape estimation and cloth simulation.
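The abstract describes landmarks that constrain a parametric body model. As a minimal sketch of how such a constraint can be solved, assuming a generic PCA-style linear shape space (not the authors' actual model) with landmarks taken as a fixed subset of mesh vertices, the shape coefficients can be recovered from predicted 3D landmark positions by least squares; all names and shapes below are hypothetical:

```python
import numpy as np

def fit_shape_coeffs(pred_landmarks, mean_body, shape_basis, landmark_idx):
    """Solve for shape coefficients beta so that the linear model's
    landmark vertices best match the predicted 3D landmark positions.

    mean_body:    (V, 3) mean mesh vertices
    shape_basis:  (K, V, 3) PCA shape directions
    landmark_idx: (L,) indices of landmark vertices
    pred_landmarks: (L, 3) predicted 3D landmark positions
    """
    K = shape_basis.shape[0]
    # Restrict model to landmark vertices and flatten to vectors.
    mu = mean_body[landmark_idx].reshape(-1)                      # (3L,)
    B = shape_basis[:, landmark_idx, :].reshape(K, -1).T          # (3L, K)
    target = pred_landmarks.reshape(-1)                           # (3L,)
    # Least-squares solve: B @ beta ≈ target - mu.
    beta, *_ = np.linalg.lstsq(B, target - mu, rcond=None)
    return beta

# Toy example: a random 2-component shape space, 5 landmarks.
rng = np.random.default_rng(0)
mean_body = rng.normal(size=(100, 3))
shape_basis = rng.normal(size=(2, 100, 3))
idx = np.array([3, 17, 42, 58, 90])
true_beta = np.array([0.5, -1.2])
# Synthesize exact landmark positions from known coefficients.
landmarks = mean_body[idx] + np.tensordot(true_beta, shape_basis[:, idx, :], axes=1)
beta = fit_shape_coeffs(landmarks, mean_body, shape_basis, idx)
print(np.round(beta, 3))
```

With 5 landmarks (15 scalar equations) and 2 unknown coefficients, the system is overdetermined but consistent, so the solve recovers the coefficients used to synthesize the landmarks. A real pipeline would instead obtain the landmark positions from a learned silhouette-to-landmark predictor, as the paper proposes.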
