Interaction of auditory and visual information in speech perception
Author(s) - Dodd, Barbara
Publication year - 1980
Publication title - British Journal of Psychology
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.536
H-Index - 92
eISSN - 2044-8295
pISSN - 0007-1269
DOI - 10.1111/j.2044-8295.1980.tb01765.x
Subject(s) - psychology , perception , audiology , modality (human–computer interaction) , speech perception , auditory perception , reading (process) , visual perception , task (project management) , cognitive psychology , information processing , communication , computer science , neuroscience , linguistics , human–computer interaction , medicine , philosophy , management , economics
Two experiments investigated the role of stored auditory and visual information in unimodal speech perception tasks. The first experiment showed that hearing subjects performed better than deaf subjects on a lip‐reading task, possibly because they could supplement the lip‐read stimuli with stored information derived from the auditory modality. The second experiment showed that sighted subjects did not use stored visual information to supplement an auditory input when detecting mispronunciations, since their performance did not differ from that of congenitally blind subjects. The processing of visual (lip‐read) information in speech perception is discussed.
