Artificial Faces Predict Gaze Allocation in Complex Dynamic Scenes
Author(s) -
Lara Rösler,
Marius Rubo,
Matthias Gamer
Publication year - 2019
Publication title -
Frontiers in Psychology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.947
H-Index - 110
ISSN - 1664-1078
DOI - 10.3389/fpsyg.2019.02877
Subject(s) - gaze, psychology, schematic, predictive power, cognitive psychology, artificial intelligence, face (sociological concept), computer science, social science, philosophy, epistemology, electronic engineering, sociology, engineering
Both low-level physical saliency and social information, as conveyed by human heads or bodies, are known to drive gaze behavior in free-viewing tasks. Researchers have previously used a wide variety of face stimuli, ranging from photographs of real humans to schematic faces, frequently without systematically differentiating between the two. In the current study, we used a Generalized Linear Mixed Model (GLMM) approach to investigate to what extent schematic artificial faces predict gaze when presented alone or in competition with real human faces. While the GLMMs suggest substantial effects of both real and artificial faces in all conditions, relative differences in predictive power became apparent: artificial faces were less predictive than real human faces but still contributed significantly to gaze allocation. These results further our understanding of how social information guides gaze in complex naturalistic scenes.
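The kind of analysis described in the abstract can be sketched in simplified form. The snippet below is a hypothetical stand-in, not the authors' actual model: it fits a plain logistic regression (omitting the random effects a full GLMM would include, e.g. per participant and per scene) to simulated data, predicting whether a scene region is fixated from its saliency and from real- and artificial-face indicators. All variable names, effect sizes, and data are illustrative assumptions.

```python
import numpy as np

# Simulate scene regions with a saliency value and binary face indicators.
# Effect sizes are hypothetical: both face types attract gaze, real faces
# more strongly, mirroring the qualitative pattern reported in the abstract.
rng = np.random.default_rng(0)
n = 5000
saliency = rng.random(n)                      # low-level physical saliency, 0..1
real_face = (rng.random(n) < 0.1).astype(float)        # region shows a real face
artificial_face = (rng.random(n) < 0.1).astype(float)  # region shows a schematic face

logits = -2.0 + 1.0 * saliency + 2.0 * real_face + 1.0 * artificial_face
fixated = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit the logistic model by iteratively reweighted least squares (Newton's method).
X = np.column_stack([np.ones(n), saliency, real_face, artificial_face])
beta = np.zeros(4)
for _ in range(25):
    mu = 1 / (1 + np.exp(-X @ beta))          # predicted fixation probability
    grad = X.T @ (fixated - mu)               # score (gradient of log-likelihood)
    hess = X.T @ (X * (mu * (1 - mu))[:, None])
    beta += np.linalg.solve(hess, grad)

coef = dict(zip(["intercept", "saliency", "real_face", "artificial_face"], beta))
print(coef)
```

With this setup, the recovered coefficients show the expected ordering: the real-face effect exceeds the artificial-face effect, and both are positive. In practice, mixed-model software (e.g. lme4 in R) would be used to add the random-effect structure.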