Pattern Recognition of Self‐Reported Emotional State from Multiple‐Site Facial EMG Activity During Affective Imagery
Author(s) - Fridlund Alan J., Schwartz Gary E., Fowler Stephen C.
Publication year - 1984
Publication title - Psychophysiology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.661
H-Index - 156
eISSN - 1469-8986
pISSN - 0048-5772
DOI - 10.1111/j.1469-8986.1984.tb00249.x
Subject(s) - sadness , psychology , anger , facial electromyography , happiness , affect (linguistics) , emotion classification , facial expression , linear discriminant analysis , multivariate statistics , cognitive psychology , emotion recognition , facial muscles , audiology , communication , artificial intelligence , social psychology , neuroscience , statistics , medicine , mathematics , computer science
A multivariate pattern‐classification system was developed for the study of facial electromyographic (EMG) patterning in 12 female subjects during affect‐laden imagery and for posed facial expressions. A parameter‐extraction procedure identified the dynamic EMG signal properties which afforded the maximal degree of self‐reported emotion discrimination. Discriminant analyses on trialwise EMG vectors allowed assessment of specific EMG‐site conformations typifying rated emotions of happiness, sadness, anger, and fear. The discriminability among emotion‐specific EMG conformations was correlated with subjective ratings of affective‐imagery vividness and duration. Evidence was obtained suggesting that the EMG patterns encoded complex, “blended” reported affective states during the imagery. Classification analyses produced point‐predictions of reported emotional states in 10 of the 12 subjects, and provided the first computer pattern recognition of self‐reported emotion from psychophysiological responses.
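The abstract describes trialwise discriminant classification of multi-site EMG feature vectors against self-reported emotion labels. The sketch below is a minimal illustration of that general approach, not the authors' procedure: the feature summary (one amplitude parameter per site per trial), the EMG site names, the trial counts, and the leave-one-trial-out scheme are all assumptions made for the example.

```python
# Illustrative sketch only: linear discriminant classification of
# simulated multi-site facial EMG features into four reported emotions.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import LeaveOneOut, cross_val_predict

rng = np.random.default_rng(0)

# Hypothetical data: 40 imagery trials x 4 EMG sites (e.g., corrugator,
# zygomatic, masseter, lateral frontalis), each reduced to a single
# mean-amplitude parameter per trial.
X = rng.normal(size=(40, 4))
# Self-reported emotion per trial coded 0-3: happiness, sadness, anger, fear.
y = rng.integers(0, 4, size=40)

# Leave-one-trial-out classification yields a point-prediction of the
# reported emotion for each held-out trial.
pred = cross_val_predict(LinearDiscriminantAnalysis(), X, y, cv=LeaveOneOut())
print(f"Leave-one-trial-out accuracy: {np.mean(pred == y):.2f}")
```

With real data, per-subject accuracy well above the 25% chance level would correspond to the kind of successful point-prediction reported for 10 of the 12 subjects.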
