Intermodal Perception of Happy and Angry Expressive Behaviors by Seven‐Month‐Old Infants
Author(s) - Soken, Nelson H.; Pick, Anne D.
Publication year - 1992
Publication title - Child Development
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.103
H-Index - 257
eISSN - 1467-8624
pISSN - 0009-3920
DOI - 10.1111/j.1467-8624.1992.tb01661.x
Subject(s) - psychology , facial expression , perception , preference , affect (linguistics) , biological motion , expression (computer science) , developmental psychology , motion (physics) , time perception , cognitive psychology , communication , neuroscience , artificial intelligence , computer science , programming language , economics , microeconomics
Two studies were conducted to examine the roles of facial motion and temporal correspondences in the intermodal perception of happy and angry expressive events. 7‐month‐old infants saw 2 video facial expressions and heard a single vocal expression characteristic of one of them. Infants saw either a normally lighted face (fully illuminated condition) or a moving dot display of a face (point‐light condition). In Study 1, one woman expressed the affects vocally, another woman expressed the affects facially, and the content of their speech also differed. Infants in the point‐light condition showed a reliable preference for the affectively concordant display, whereas infants in the fully illuminated condition showed no such preference. In Study 2, the visual and vocal displays were produced by a single individual on one occasion and were presented to infants 5 sec out of synchrony. Infants in both conditions looked longer at the affectively concordant displays. Together, the 2 studies indicate that infants can discriminate happy and angry affective expressions on the basis of motion information, and that the temporal correspondences unifying these affective events may be affect‐specific rhythms.