Open Access
Recognition of Human Inner Emotion Based on Two-Stage FCA-ReliefF Feature Optimization
Author(s) -
Lizheng Pan,
Shunchao Wang,
Zeming Yin,
Aiguo Song
Publication year - 2022
Publication title -
Informacinės Technologijos ir Valdymas (Information Technology and Control)
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.286
H-Index - 19
eISSN - 2335-884X
pISSN - 1392-124X
DOI - 10.5755/j01.itc.51.1.29430
Subject(s) - computer science , pattern recognition (psychology) , artificial intelligence , feature (linguistics) , classifier (uml) , support vector machine , toolbox , curse of dimensionality , correlation , emotion recognition , feature extraction , dimensionality reduction , machine learning , mathematics , philosophy , linguistics , geometry , programming language
Currently, there is growing interest in emotion recognition, and representing emotional states remains a very challenging problem. Considering the computational cost and generalization capability required for practical applications, a set of common time-domain and frequency-domain features is extracted from physiological signals to represent different emotional states. To reduce feature dimensionality and improve emotion recognition accuracy, a two-stage feature optimization method based on feature correlation analysis (FCA) and the ReliefF algorithm is proposed to select the critical features. First, FCA is employed to analyze the redundancy between features; then ReliefF is adopted to analyze the correlation between features and categories, and the optimal feature subset is obtained with the two-stage FCA-ReliefF method. A support vector machine is employed as the classifier to evaluate classification performance. The effectiveness of the proposed method is validated on two publicly available multimodal emotion datasets, the Augsburg Biosignal Toolbox (AuBT) and the Database for Emotion Analysis Using Physiological Signals (DEAP). Compared with recently reported similar studies, the method developed in this research is stable and competitive, with accuracy reaching 98.40% (AuBT) and 92.34% (DEAP).
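
As a rough illustration of the two-stage pipeline described in the abstract, the sketch below combines a correlation-based redundancy filter (stage 1, standing in for FCA), a simplified ReliefF ranking (stage 2), and an SVM evaluation. The correlation threshold, the number of selected features, the simplified ReliefF update, and the synthetic data are illustrative assumptions, not the authors' exact implementation or the AuBT/DEAP feature sets.

```python
# Minimal sketch of a two-stage FCA-ReliefF feature selection pipeline.
# Assumptions: Pearson correlation as the redundancy measure, a simplified
# ReliefF weight update, and synthetic data in place of physiological features.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler

def fca_redundancy_filter(X, threshold=0.9):
    """Stage 1: drop one feature from every pair whose absolute Pearson
    correlation exceeds `threshold` (feature correlation analysis)."""
    corr = np.corrcoef(X, rowvar=False)
    n = corr.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        for j in range(i + 1, n):
            if keep[j] and abs(corr[i, j]) > threshold:
                keep[j] = False  # feature j is redundant with feature i
    return np.where(keep)[0]

def relieff_weights(X, y, n_neighbors=10, n_samples=200, seed=0):
    """Stage 2: simplified ReliefF - reward features that differ from the
    nearest misses and penalise those that differ from the nearest hits."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    span = X.max(axis=0) - X.min(axis=0) + 1e-12
    w = np.zeros(d)
    idx = rng.choice(n, size=min(n_samples, n), replace=False)
    for i in idx:
        diff = np.abs(X - X[i]) / span           # per-feature normalised distance
        dist = diff.sum(axis=1)
        dist[i] = np.inf                         # exclude the instance itself
        hits = np.argsort(np.where(y == y[i], dist, np.inf))[:n_neighbors]
        misses = np.argsort(np.where(y != y[i], dist, np.inf))[:n_neighbors]
        w -= diff[hits].mean(axis=0)
        w += diff[misses].mean(axis=0)
    return w / len(idx)

def fca_relieff_select(X, y, corr_threshold=0.9, n_select=30):
    """Combine both stages and return the indices of the selected features."""
    kept = fca_redundancy_filter(X, corr_threshold)
    w = relieff_weights(X[:, kept], y)
    return kept[np.argsort(w)[::-1][:n_select]]

# Usage on synthetic data standing in for the extracted time/frequency features.
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 60))
y = (X[:, 0] + X[:, 3] > 0).astype(int)
X = StandardScaler().fit_transform(X)
selected = fca_relieff_select(X, y, n_select=10)
scores = cross_val_score(SVC(kernel="rbf"), X[:, selected], y, cv=5)
print("selected features:", selected)
print("mean CV accuracy:", scores.mean())
```

The design mirrors the abstract's ordering: inter-feature redundancy is removed first, so the subsequent relevance ranking and the SVM only see a compact, less correlated feature subset.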
