Open Access
The contribution of different cues of facial movement to the emotional facial expression adaptation aftereffect
Author(s) - S. de la Rosa, Martin A. Giese, H. H. Bülthoff, C. Curio
Publication year - 2013
Publication title - Journal of Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.126
H-Index - 113
ISSN - 1534-7362
DOI - 10.1167/13.1.23
Subject(s) - facial expression , psychology , movement (music) , cognitive psychology , emotional expression , adaptation (eye) , communication , face (sociological concept) , expression (computer science) , facial muscles , computer science , neuroscience , philosophy , programming language , aesthetics , social science , sociology
Probing emotional facial expression recognition with the adaptation paradigm is one way to investigate the processes underlying emotional face recognition. Previous research suggests that these processes are tuned to dynamic facial information (facial movement). Here we examined how the processes involved in the recognition of emotional facial expressions are tuned to different sources of facial movement information. Specifically, we investigated the effect of the availability of rigid head movement and intrinsic facial movements (e.g., movement of facial features) on the size of the emotional facial expression adaptation effect. Using a three-dimensional (3D) morphable model that allowed the availability of each of the two factors (intrinsic facial movement, head movement) to be manipulated individually, we examined emotional facial expression adaptation with happy and disgusted faces. Our results show that intrinsic facial movement is necessary for the emergence of an emotional facial expression adaptation effect with dynamic adaptors. The presence of rigid head motion modulates the emotional facial expression adaptation effect only in the presence of intrinsic facial motion. In a second experiment, we show that these adaptation effects are difficult to explain merely by the perceived intensity and clarity (uniqueness) of the adaptor expressions. Together, these results suggest that processes encoding facial expressions are differently tuned to different sources of facial movement.
