The intermodal representation of speech in newborns
Author(s) -
Aldridge Michelle A.,
Braga Erika S.,
Walton Gail E.,
Bower T. G. R.
Publication year - 1999
Publication title -
Developmental Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.801
H-Index - 127
eISSN - 1467-7687
pISSN - 1363-755X
DOI - 10.1111/1467-7687.00052
Subject(s) - psychology, speech perception, cognitive psychology, developmental psychology, audiology, communication, perception, neuroscience
It has been proposed that speech is specified by the eye, the ear, and even the skin. Kuhl and Meltzoff (1984) showed that 4‐month‐olds could lip‐read to an extent. Given the age of those infants, it was not clear whether this was a learned skill or a by‐product of the primary auditory process. This paper presents evidence that neonates (less than 33 hours old) show patterns of intermodal interaction virtually identical to those of 4‐month‐olds. Because the infants are neonates, it is unlikely that learning was involved. The results indicate that human speech is specified by both eye and ear at an age when built‐in structural sensitivities provide the most plausible explanation.