Analysis of co‐articulation regions for performance‐driven facial animation
Author(s) - Fidaleo Douglas, Neumann Ulrich
Publication year - 2004
Publication title - Computer Animation and Virtual Worlds
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.225
H-Index - 49
eISSN - 1546-427X
pISSN - 1546-4261
DOI - 10.1002/cav.4
Subject(s) - computer science , gesture , artificial intelligence , independent component analysis , computer vision , pattern recognition (psychology) , animation , correlation , facial motion capture , speech recognition , facial recognition system , computer graphics (images) , face detection , algorithm
A facial gesture analysis procedure is presented for the control of animated faces. Facial images are partitioned into a set of local, independently actuated regions of appearance change termed co‐articulation regions (CRs). Each CR is parameterized by the activation level of a set of face gestures that affect the region. The activation of a CR is analyzed using independent component analysis (ICA) on a set of training images acquired from an actor. Gesture intensity classification is performed in ICA space by correlation to training samples. Correlation in ICA space proves to be an efficient and stable method for gesture intensity classification with limited training data. A discrete sample‐based synthesis method is also presented. An artist creates an actor‐independent reconstruction sample database that is indexed with CR state information analyzed in real time from video. Copyright © 2004 John Wiley & Sons, Ltd.
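The classification step described above — projecting a co-articulation region's appearance into ICA space and scoring gesture intensity by correlation against labeled training samples — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the ICA unmixing matrix `W`, the training projections, and the intensity labels are assumed to be precomputed, and the correlation measure used here is the standard normalized (Pearson-style) correlation.

```python
import numpy as np

def project_to_ica(x, W):
    # Project a flattened CR image patch into ICA space: s = W @ x.
    # W is an assumed precomputed ICA unmixing matrix (n_components x n_pixels).
    return W @ x

def classify_gesture_intensity(x, W, training_projections, labels):
    """Classify gesture intensity by correlation in ICA space.

    training_projections: array (n_samples, n_components) of ICA codes
        for labeled training frames (hypothetical precomputed data).
    labels: one intensity label per training sample.
    Returns the label of the most correlated training sample.
    """
    s = project_to_ica(x, W)
    s_c = s - s.mean()  # center before correlating
    scores = []
    for t in training_projections:
        t_c = t - t.mean()
        denom = np.linalg.norm(s_c) * np.linalg.norm(t_c)
        scores.append(float(s_c @ t_c) / denom if denom > 0 else 0.0)
    return labels[int(np.argmax(scores))]
```

In a full pipeline, `W` would come from running ICA (e.g. FastICA) over the training images of one co-articulation region, and the classifier above would run per region, per video frame, to drive the sample-based synthesis stage.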