3D face alignment and registration in the presence of facial expression differences
Author(s) -
Pintavirooj Chuchart,
Cohen Fernand S.,
Tosra Prasong
Publication year - 2013
Publication title -
ieej transactions on electrical and electronic engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.254
H-Index - 30
eISSN - 1931-4981
pISSN - 1931-4973
DOI - 10.1002/tee.21871
Subject(s) - face (sociological concept) , facial expression , transformation (genetics) , invariant (physics) , iterative closest point , artificial intelligence , mathematics , gaussian , point set registration , point (geometry) , nonlinear system , matching (statistics) , computer vision , surface (topology) , expression (computer science) , computer science , pattern recognition (psychology) , algorithm , set (abstract data type) , geometry , point cloud , physics , statistics , social science , biochemistry , chemistry , quantum mechanics , sociology , mathematical physics , gene , programming language
This paper deals with the problem of 3D alignment of faces in the presence of some facial expression changes. The data are three-dimensional and obtained using a laser scanner. Our approach is based on the differential geometry of the surface and computes intrinsic local fiducial points on the surface and on curves that reside on the surface. Because these fiducial points are local, they allow partial alignment, where only part of the face is viewed. Moreover, since the fiducial points are relatively invariant under local affine transformations, they allow matching and alignment when facial expression changes affect part of the face. A fast, noniterative alignment procedure is presented that establishes reliable correspondences between fiducial points without any prior knowledge of the overall nonlinear global transformation that takes place after the changes in facial expression. This is achieved through the construction of a set of ordered novel absolute local affine invariants. With enough fiducial points set as correspondents, the overall nonlinear transformation is computed and the faces before and after the transformation are aligned. We also compare the alignment performance of our method with that of the iterative closest point (ICP) method and the coherent point drift (CPD) method, which is based on a Gaussian mixture model, by running the methods on the GavabDB 3D face database and examining the between-to-within variation (signal-to-noise ratio, SNR) of the two classes (match vs. nonmatch), i.e. the average alignment errors for genuine matches and nonmatches normalized by their respective within-class variation. The separation between true matches and nonmatches is best for our zero-torsion method (SNR of 1.2), followed by the ICP method (SNR of 0.37) and the CPD method (SNR of 0.15). It is interesting to observe that, although the alignment error for the true-match cases is lowest for the CPD method, it is also extremely low for the nonmatch cases, which means that the method would force a query onto a wrong face, yielding poor specificity (many false alarms). © 2013 Institute of Electrical Engineers of Japan. Published by John Wiley & Sons, Inc.
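To make the between-to-within ("SNR") criterion used in the comparison concrete, the sketch below shows one plausible way to compute it from two sets of alignment errors. The function name alignment_snr, the synthetic data, and the Fisher-style pooled-standard-deviation normalization are illustrative assumptions; the abstract specifies only that the class separation is normalized by the respective within-class variation, not the exact formula.

```python
# Illustrative sketch (not the paper's code): a between-to-within separation
# score ("SNR") computed from alignment errors of genuine-match and nonmatch
# trials. Assumes a Fisher-style ratio: difference of class means divided by
# the pooled within-class spread. All numbers below are made up for demonstration.
import numpy as np

def alignment_snr(match_errors, nonmatch_errors):
    """Separation between genuine-match and nonmatch alignment errors,
    normalized by their within-class variation (pooled standard deviation)."""
    match_errors = np.asarray(match_errors, dtype=float)
    nonmatch_errors = np.asarray(nonmatch_errors, dtype=float)
    between = abs(nonmatch_errors.mean() - match_errors.mean())
    within = np.sqrt(match_errors.var(ddof=1) + nonmatch_errors.var(ddof=1))
    return between / within

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Hypothetical average alignment errors (arbitrary units) for the two classes.
    genuine = rng.normal(loc=1.0, scale=0.5, size=60)   # query aligned to the same face
    impostor = rng.normal(loc=2.0, scale=0.8, size=60)  # query aligned to a different face
    print(f"SNR = {alignment_snr(genuine, impostor):.2f}")
```

Under this reading, a method with very low alignment error on nonmatch trials (as reported for CPD) shrinks the between-class gap and hence the SNR, even if its genuine-match error is the lowest.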