Open Access
An ALE meta‐analysis on the audiovisual integration of speech signals
Author(s) - Erickson Laura C., Heeg Elizabeth, Rauschecker Josef P., Turkeltaub Peter E.
Publication year - 2014
Publication title - Human Brain Mapping
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.005
H-Index - 191
eISSN - 1097-0193
pISSN - 1065-9471
DOI - 10.1002/hbm.22572
Subject(s) - neurocomputational speech processing , psychology , percept , multisensory integration , speech perception , contrast (vision) , sensory system , auditory cortex , neuroscience , perception , computer science , artificial intelligence
The brain improves speech processing through the integration of audiovisual (AV) signals. Situations involving AV speech integration may be crudely dichotomized into those where auditory and visual inputs contain (1) equivalent, complementary signals (validating AV speech) or (2) inconsistent, different signals (conflicting AV speech). This simple framework may allow the systematic examination of broad commonalities and differences between AV neural processes engaged by various experimental paradigms frequently used to study AV speech integration. We conducted an activation likelihood estimation meta-analysis of 22 functional imaging studies comprising 33 experiments, 311 subjects, and 347 foci examining "conflicting" versus "validating" AV speech. Experimental paradigms included content congruency, timing synchrony, and perceptual measures, such as the McGurk effect or synchrony judgments, across AV speech stimulus types (sublexical to sentence). Colocalization of conflicting AV speech experiments revealed consistency across at least two contrast types (e.g., synchrony and congruency) in a network of dorsal stream regions in the frontal, parietal, and temporal lobes. There was consistency across all contrast types (synchrony, congruency, and percept) in the bilateral posterior superior/middle temporal cortex. Although fewer studies were available, validating AV speech experiments were localized to other regions, such as ventral stream visual areas in the occipital and inferior temporal cortex. These results suggest that while equivalent, complementary AV speech signals may evoke activity in regions related to the corroboration of sensory input, conflicting AV speech signals recruit widespread dorsal stream areas likely involved in the resolution of conflicting sensory signals. Hum Brain Mapp 35:5587–5605, 2014. © 2014 Wiley Periodicals, Inc.
