
Using genetic algorithms to uncover individual differences in how humans represent facial emotion
Author(s) -
Christina Carlisi,
Kyle Reed,
Fleur G. L. Helmink,
Robert F. Lachlan,
Darren Cosker,
Essi Viding,
Isabelle Mareschal
Publication year - 2021
Publication title -
Royal Society Open Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.84
H-Index - 51
ISSN - 2054-5703
DOI - 10.1098/rsos.202251
Subject(s) - psychology , facial expression , cognitive psychology , categorical variable , emotion perception , cognition , population , stimulus (psychology) , emotion classification , computer science , communication , machine learning , demography , neuroscience , sociology
Emotional facial expressions critically impact social interactions and cognition. However, emotion research to date has generally relied on the assumption that people represent categorical emotions in the same way, using standardized stimulus sets and overlooking important individual differences. To address this, we developed and tested a task using genetic algorithms to derive assumption-free, participant-generated emotional expressions. One hundred and five participants generated a subjective representation of happy, angry, fearful and sad faces. Population-level consistency was observed for happy faces, but fearful and sad faces showed a high degree of variability. High test–retest reliability was observed across all emotions. A separate group of 108 individuals accurately identified the happy and angry faces generated in the first study, while fearful and sad faces were commonly misidentified. These findings are an important first step towards understanding individual differences in emotion representation, with the potential to reconceptualize the way we study atypical emotion processing in future research.
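The abstract does not specify the face model, selection procedure, or algorithm settings the authors used, so the following is only a minimal illustrative sketch of the general genetic-algorithm idea: candidate face-parameter vectors are iteratively selected, recombined, and mutated, with participant similarity judgements acting as the fitness signal. Here a simulated rating function stands in for a real participant, and all names, dimensions, and parameter values (N_PARAMS, POP_SIZE, MUTATION_SD, simulated_rating, evolve) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

N_PARAMS = 10        # hypothetical dimensionality of the face-parameter space
POP_SIZE = 20        # candidate faces shown per generation
N_GENERATIONS = 15
MUTATION_SD = 0.05

def simulated_rating(face, target):
    """Stand-in for a participant's similarity rating (higher = closer match)."""
    return -np.linalg.norm(face - target)

def evolve(target):
    # Start from random face-parameter vectors in [0, 1].
    population = rng.random((POP_SIZE, N_PARAMS))
    for _ in range(N_GENERATIONS):
        fitness = np.array([simulated_rating(f, target) for f in population])
        # Keep the top-rated half as parents (rank-based selection).
        parents = population[np.argsort(fitness)][-POP_SIZE // 2:]
        children = []
        while len(children) < POP_SIZE:
            a, b = parents[rng.integers(len(parents), size=2)]
            mask = rng.random(N_PARAMS) < 0.5               # uniform crossover
            child = np.where(mask, a, b)
            child += rng.normal(0, MUTATION_SD, N_PARAMS)   # Gaussian mutation
            children.append(np.clip(child, 0.0, 1.0))
        population = np.array(children)
    # Return the face the (simulated) participant rated highest at the end.
    fitness = np.array([simulated_rating(f, target) for f in population])
    return population[np.argmax(fitness)]

if __name__ == "__main__":
    hidden_representation = rng.random(N_PARAMS)  # participant's internal template
    best = evolve(hidden_representation)
    print("recovered parameters:", np.round(best, 2))
```

In the actual task, the fitness step would be driven by participants choosing which rendered faces best match their internal representation of an emotion, rather than by a distance to a known target as in this toy simulation.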