Open Access
Co‐speech gestures influence neural activity in brain regions associated with processing semantic information
Author(s) -
Dick Anthony Steven,
Goldin‐Meadow Susan,
Hasson Uri,
Skipper Jeremy I.,
Small Steven L.
Publication year - 2009
Publication title -
Human Brain Mapping
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.005
H-Index - 191
eISSN - 1097-0193
pISSN - 1065-9471
DOI - 10.1002/hbm.20774
Subject(s) - gesture , psychology , inferior frontal gyrus , neurocomputational speech processing , superior temporal sulcus , comprehension , perception , functional magnetic resonance imaging , semantic memory , cognitive psychology , speech perception , superior temporal gyrus , brain activity and meditation , middle temporal gyrus , communication , neuroscience , computer science , cognition , electroencephalography , artificial intelligence , programming language
Everyday communication is accompanied by visual information from several sources, including co‐speech gestures, which provide semantic information that listeners use to help disambiguate the speaker's message. Using fMRI, we examined how gestures influence neural activity in brain regions associated with processing semantic information. The BOLD response was recorded while participants listened to stories under three audiovisual conditions and one auditory‐only (speech alone) condition. In the first audiovisual condition, the storyteller produced gestures that naturally accompany speech. In the second, the storyteller made semantically unrelated hand movements. In the third, the storyteller kept her hands still. In addition to inferior parietal and posterior superior and middle temporal regions, bilateral posterior superior temporal sulcus and left anterior inferior frontal gyrus responded more strongly to speech when it was accompanied by hand movements, regardless of their semantic relation to the speech. However, the right inferior frontal gyrus was sensitive to the semantic import of the hand movements, demonstrating more activity when hand movements were semantically unrelated to the accompanying speech. These findings show that perceiving hand movements during speech modulates the distributed pattern of neural activation involved in both biological motion perception and discourse comprehension, suggesting that listeners attempt to find meaning, not only in the words speakers produce, but also in the hand movements that accompany speech. Hum Brain Mapp, 2009. © 2009 Wiley‐Liss, Inc.
