The Role of Auditory and Visual Speech in Word Learning at 18 Months and in Adulthood
Author(s) - Havy Mélanie, Foroud Afra, Fais Laurel, Werker Janet F.
Publication year - 2017
Publication title - Child Development
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 3.103
H-Index - 257
eISSN - 1467-8624
pISSN - 0009-3920
DOI - 10.1111/cdev.12715
Subject(s) - psychology , cognitive psychology , speech perception , auditory perception , visual perception , stimulus modality , sensory system , word learning , communication , linguistics , neuroscience
Visual information influences speech perception in both infants and adults. It is still unknown whether lexical representations are multisensory. To address this question, we exposed 18‐month‐old infants (n = 32) and adults (n = 32) to new word–object pairings: participants either heard the acoustic form of the words or saw the talking face in silence. They were then tested on recognition in the same or the other modality. Both 18‐month‐old infants and adults learned the lexical mappings when the words were presented auditorily and recognized the mapping at test when the word was presented in either modality, but only adults learned new words in a visual‐only presentation. These results suggest developmental changes in the sensory format of lexical representations.