Learning to disentangle emotion factors for facial expression recognition in the wild
Author(s) -
Zhu Qing,
Gao Lijian,
Song Heping,
Mao Qirong
Publication year - 2021
Publication title -
international journal of intelligent systems
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.291
H-Index - 87
eISSN - 1098-111X
pISSN - 0884-8173
DOI - 10.1002/int.22391
Subject(s) - discriminative model , computer science , benchmark (surveying) , artificial intelligence , pattern recognition , facial expression , emotion recognition , encoder , autoencoder , latent variable , attention mechanism , deep learning
Facial expression recognition (FER) in the wild is very challenging because expressions appear under complex conditions (e.g., large head poses, illumination variation, and occlusions), which degrades FER performance. Accurate FER relies heavily on discovering superior discriminative, emotion-related features. In this paper, we propose an end-to-end module that disentangles latent emotion-discriminative factors from the other factors of variation to obtain salient emotion features for FER. Training proceeds in two stages. First, emotion samples are encoded into a latent representation by a variational auto-encoder trained with a reconstruction penalty. Second, this latent representation is fed into a disentangling layer that learns a set of discriminative emotion factors through an attention mechanism (a Squeeze-and-Excitation block), which encourages the separation of emotion-related factors from nonaffective factors. Experimental results on public benchmark databases (RAF-DB and FER2013) show that our approach outperforms current state-of-the-art methods in complex scenes.
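The disentangling step described above can be sketched in a few lines of numpy. This is a minimal illustration, not the authors' implementation: it applies a Squeeze-and-Excitation-style gate (bottleneck FC, ReLU, FC, sigmoid) to a latent vector, using the sigmoid outputs as per-dimension attention weights that emphasize some latent factors and suppress others. The function name `se_gate`, the latent size, and the random weights are all hypothetical placeholders; in the paper the latent code would come from the trained variational auto-encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

def se_gate(z, W1, W2):
    """SE-style gating over a 1-D latent code z.

    The "squeeze" step is trivial for a vector input; the "excitation"
    step is a bottleneck MLP whose sigmoid output re-weights each
    latent dimension. (Illustrative sketch, not the paper's code.)
    """
    h = np.maximum(0.0, W1 @ z)            # bottleneck FC -> ReLU
    s = 1.0 / (1.0 + np.exp(-(W2 @ h)))    # FC -> sigmoid: weights in (0, 1)
    return s * z, s                        # gated latent code, attention weights

d = 16                                          # latent dimensionality (illustrative)
W1 = rng.standard_normal((d // 4, d)) * 0.1     # reduction ratio 4, as in SE blocks
W2 = rng.standard_normal((d, d // 4)) * 0.1
z = rng.standard_normal(d)                      # stand-in for the VAE's latent code

z_emotion, weights = se_gate(z, W1, W2)         # emphasized emotion-related factors
```

During training, dimensions that receive weights near 1 would act as the emotion-related factors fed to the classifier, while dimensions gated toward 0 correspond to nonaffective variation.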
