Open Access
Machine Learning Based Statistical Analysis of Emotion Recognition using Facial Expression
Author(s) -
Aqib Ali,
Jamal Abdul Nasir,
Muhammad Munawar Ahmed,
Samreen Naeem,
Sania Anam,
Farrukh Jamal,
Christophe Chesneau,
Muhammad Zubair,
Muhammad Saqib Anees
Publication year - 2020
Publication title - RADS Journal of Biological Research and Applied Science
Language(s) - English
Resource type - Journals
eISSN - 2521-8573
pISSN - 2305-8722
DOI - 10.37962/jbas.v11i1.262
Subject(s) - artificial intelligence, computer science, pattern recognition (psychology), C4.5 algorithm, random forest, decision tree, histogram, feature (linguistics), AdaBoost, histogram of oriented gradients, support vector machine, machine learning, image (mathematics), naive Bayes classifier, linguistics, philosophy
Background: Humans convey many emotions during a conversation, and facial expressions carry information about those emotions. Objectives: This study proposes a Machine Learning (ML) approach based on a statistical analysis of emotion recognition from facial expressions in digital images. Methodology: A dataset of 600 digital images divided into 6 classes (Anger, Happy, Fear, Surprise, Sad, and Normal) was collected from the publicly available Taiwan Facial Expression Image Database. In the first step, all images are converted to gray-level format and 4 Regions of Interest (ROIs) are created on each image, so the image dataset is divided into 2400 (600 x 4) sub-images. In the second step, 3 types of statistical features, namely texture, histogram, and binary features, are extracted from each ROI. The third step is statistical feature optimization using the best-first search algorithm. Lastly, the optimized statistical feature dataset is deployed on various ML classifiers. Results: The analysis was divided into two phases. First, boosting-based ML classifiers (LogitBoost, AdaBoostM1, and Stacking) obtained 94.11%, 92.15%, and 89.21% accuracy, respectively. Second, decision tree algorithms (J48, Random Forest, and Random Committee) obtained 97.05%, 93.14%, and 92.15% accuracy, respectively. Conclusion: The decision tree based J48 classifier gave the highest classification accuracy, at 97.05%.
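
The abstract does not include the authors' code, so the following is only a minimal sketch of the described pipeline (grayscale conversion, 4 quadrant ROIs per image, histogram/texture/binary features, feature selection, tree-based classification). The quadrant-based ROI layout, the specific feature formulas, and the use of scikit-learn's SequentialFeatureSelector as a greedy stand-in for WEKA-style best-first search are all assumptions, not details taken from the paper.

    # Hypothetical sketch of the emotion-recognition pipeline; not the authors' implementation.
    import numpy as np
    from skimage import io, color
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.model_selection import cross_val_score

    def quadrant_rois(gray):
        """Split a grayscale image into 4 equal quadrant ROIs (assumed layout)."""
        h, w = gray.shape
        return [gray[:h // 2, :w // 2], gray[:h // 2, w // 2:],
                gray[h // 2:, :w // 2], gray[h // 2:, w // 2:]]

    def roi_features(roi, bins=16):
        """Simple histogram, first-order texture, and binary features for one ROI."""
        hist, _ = np.histogram(roi, bins=bins, range=(0, 1), density=True)
        texture = [roi.mean(), roi.std(), roi.var()]
        binary = [(roi > roi.mean()).mean()]  # fraction of pixels above the mean after thresholding
        return np.concatenate([hist, texture, binary])

    def image_to_samples(path, label):
        """One labelled feature vector per ROI, i.e. 4 samples per image."""
        gray = color.rgb2gray(io.imread(path))
        return [(roi_features(roi), label) for roi in quadrant_rois(gray)]

    # X, y would be built by calling image_to_samples over the 600 labelled images.
    # clf = RandomForestClassifier(n_estimators=100, random_state=0)
    # selector = SequentialFeatureSelector(clf, direction="forward")  # greedy proxy for best-first search
    # X_sel = selector.fit_transform(X, y)
    # print(cross_val_score(clf, X_sel, y, cv=10).mean())

The study itself reports results from WEKA-style classifiers (J48, Random Committee, LogitBoost, AdaBoostM1, Stacking); Random Forest is used above only because it is the one classifier from that list with a direct scikit-learn counterpart.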
