Open Access
Creation and validation of the Picture-Set of Young Children’s Affective Facial Expressions (PSYCAFE)
Author(s) -
Matthias Franz,
Tobias Müller,
Sina Hahn,
Daniel Lundqvist,
Dirk Rampoldt,
Jan-Frederik Westermann,
Marc A. Nordmann,
Ralf B. Schäfer
Publication year - 2021
Publication title -
PLOS ONE
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.99
H-Index - 332
ISSN - 1932-6203
DOI - 10.1371/journal.pone.0260871
Subject(s) - facial expression , disgust , sadness , affect (linguistics) , psychology , happiness , surprise , anger , cognitive psychology , set (abstract data type) , developmental psychology , social psychology , communication , computer science , programming language
The immediate detection and correct processing of affective facial expressions are among the most important competences in social interaction and are thus a central subject of emotion and affect research. Studies in these research domains generally use pictures of adults displaying affective facial expressions as experimental stimuli. For studies of developmental psychology and attachment behaviour, however, age-matched stimuli are needed, i.e., pictures in which children display the affective expressions. PSYCAFE is a newly developed picture set of children’s faces. It includes reference portraits of girls and boys aged 4 to 6 years that were digitally averaged from different individual pictures, which had been assigned by cluster analysis to six basic affects (fear, disgust, happiness, sadness, anger and surprise) plus a neutral facial expression. This procedure yielded deindividualized, affect-prototypical portraits. Individual affect-expressive portraits of adults from an already validated picture set (KDEF) were processed in the same way to create affect-prototypical images of adults as well. The stimulus set was validated by human observers; the validation data comprise emotion recognition accuracy rates as well as ratings of the intensity, authenticity and likeability of the specific affect displayed. Moreover, the stimuli were also characterized with the iMotions Facial Expression Analysis Module, providing additional probability values representing the likelihood that a stimulus depicts the intended affect. Finally, the validation data from human observers and iMotions are compared with data on the facial mimicry of healthy adults in response to these portraits, measured by facial EMG (m. zygomaticus major and m. corrugator supercilii).
