Open Access
Structured Learning and Prediction in Facial Emotion Classification and Recognition
Author(s) -
Khalid Ounachad,
Mohamed Oualla,
Abdelghani Souhar,
Abdelalim Sadiq
Publication year - 2020
Publication title -
International Journal of Engineering and Advanced Technology
Language(s) - English
Resource type - Journals
ISSN - 2249-8958
DOI - 10.35940/ijeat.c6421.049420
Subject(s) - facial expression , artificial intelligence , computer science , face (sociological concept) , facial recognition system , set (abstract data type) , emotion recognition , point (geometry) , identification (biology) , pattern recognition (psychology) , three dimensional face recognition , image (mathematics) , emotion classification , speech recognition , face detection , mathematics , social science , botany , geometry , sociology , biology , programming language
Structured prediction methods have become, in recent years, an attractive tool for many machine-learning applications, especially in image processing: customer-satisfaction prediction using facial recognition systems, criminal investigations based on face-sketch recognition, aid to autistic children, and so on. The main objective of this paper is the identification of human emotion from facial expressions, by applying structured learning and perfect face ratios. The basic idea of our approach is to extract the perfect face ratios from a facial emotion image as features; these facial emotion images are labeled with their kind of emotion (the seven emotions defined in the literature). To this end, we first determine sixty-eight landmark points of the face images, then apply a new deep geometric descriptor to calculate sixteen features representing the emotional face. Training and testing are carried out on the Warsaw Set of Emotional Facial Expression Pictures (WSEFEP) dataset. Our proposed approach can also be applied to other facial emotion datasets. The experimental evaluation demonstrates the satisfactory performance of the method: the recognition rate exceeds 97% across all seven emotions studied and exceeds 99.20% for neutral facial images.
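The ratio-feature step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes landmarks have already been extracted in the common 68-point annotation scheme (e.g., by an off-the-shelf face-landmark detector), and the specific landmark pairs and the four ratios below are hypothetical stand-ins for the paper's sixteen "perfect face ratio" features, which are not detailed in the abstract.

```python
import numpy as np

# Hypothetical ratio definitions over 68-point facial landmarks.
# Each entry is (numerator distance, denominator distance), given as
# pairs of landmark indices. These pairs are illustrative assumptions,
# not the sixteen features used in the paper.
RATIO_PAIRS = [
    ((36, 45), (48, 54)),  # outer-eye span / mouth width
    ((27, 33), (33, 8)),   # nose length / nose-to-chin distance
    ((48, 54), (51, 57)),  # mouth width / mouth opening height
    ((17, 26), (36, 45)),  # brow span / outer-eye span
]

def ratio_features(landmarks: np.ndarray) -> np.ndarray:
    """Turn a (68, 2) landmark array into distance-ratio features."""
    def dist(a: int, b: int) -> float:
        return float(np.linalg.norm(landmarks[a] - landmarks[b]))
    return np.array([dist(*num) / dist(*den) for num, den in RATIO_PAIRS])

# Toy usage with synthetic landmarks; a real pipeline would feed the
# detector's output here, then pass the features to a classifier.
rng = np.random.default_rng(0)
fake_landmarks = rng.uniform(0.0, 100.0, size=(68, 2))
feats = ratio_features(fake_landmarks)
print(feats.shape)
```

Because the features are ratios of distances, they are invariant to uniform scaling of the face image, which is one plausible motivation for this style of geometric descriptor.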
