Open Access
Cross-modal integration and plastic changes revealed by lip movement, random-dot motion and sign languages in the hearing and deaf
Author(s) -
Norihiro Sadato,
Tomohisa Okada,
Manabu Honda,
Ken-Ichi Matsuki,
Masaki Yoshida,
Kenichi Kashikura,
Wataru Takei,
Tetsuhiro Sato,
Takanori Kochiyama,
Yoshiharu Yonekura
Publication year - 2004
Publication title - Cerebral Cortex
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.694
H-Index - 250
eISSN - 1460-2199
pISSN - 1047-3211
DOI - 10.1093/cercor/bhh210
Subject(s) - sign language , psychology , audiology , phonetics , planum temporale , american sign language , neuroscience , linguistics , medicine , philosophy
Sign language activates the auditory cortex of deaf subjects, which is evidence of cross-modal plasticity. Lip-reading (visual phonetics), which involves audio-visual integration, also activates the auditory cortex of hearing subjects. To test whether audio-visual cross-modal plasticity occurs within areas involved in cross-modal integration, we used functional MRI to study seven prelingual deaf signers, ten hearing non-signers and nine hearing signers. The visually presented tasks included mouth-movement matching, random-dot motion matching and sign-related motion matching. The mouth-movement tasks included conditions with or without visual phonetics, and the difference between these was used to measure the lip-reading effects. During the mouth-movement matching tasks, the deaf subjects showed more prominent activation of the left planum temporale (PT) than the hearing subjects. During dot-motion matching, the deaf subjects showed greater activation in the right PT. Sign-related motion, with or without a lexical component, activated the left PT more in the deaf signers than in the hearing signers. These areas showed lip-reading effects in hearing subjects. These findings suggest that cross-modal plasticity is induced by auditory deprivation independent of lexical processes or visual phonetics, and that this plasticity is mediated in part by the neural substrates of audio-visual cross-modal integration.
