Open Access
How the Blind “See” Braille and the Deaf “Hear” Sign: Lessons from fMRI on the Cross-Modal Plasticity, Integration, and Learning
Author(s) - Norihiro Sadato
Publication year - 2011
Publication title - i-Perception
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.64
H-Index - 26
ISSN - 2041-6695
DOI - 10.1068/ic827
Subject(s) - braille, auditory cortex, psychology, visual cortex, sensory substitution, multisensory integration, sensory system, crossmodal, reading (process), perception, neuroplasticity, somatosensory system, cognitive psychology, visual perception, neuroscience
What does the visual cortex of the blind do during Braille reading? Braille reading requires converting simple tactile information into meaningful patterns with lexical and semantic properties. In sighted people, the perceptual processing of Braille would be mediated by the somatosensory system, whereas visual letter identity is accomplished within the visual system. Recent advances in functional neuroimaging techniques have enabled exploration of the neural substrates of Braille reading (Sadato et al. 1996, 1998, 2002; Cohen et al. 1997, 1999). The primary visual cortex of early-onset blind subjects is functionally relevant to Braille reading, suggesting that the brain shows remarkable plasticity that potentially permits additional processing of tactile information in visual cortical areas. Similar cross-modal plasticity is observed after auditory deprivation: sign language activates the auditory cortex of deaf subjects (Neville et al. 1999; Nishimura et al. 1999; Sadato et al. 2004). Cross-modal activation can also be seen in sighted and hearing subjects. For example, tactile discrimination of two-dimensional (2D) shapes (mah-jong tiles) activated the visual cortex in expert players (Saito et al. 2006), and lip-reading (visual phonetics) (Sadato et al. 2004) and key-touch reading by pianists (Hasegawa et al. 2004) activate the auditory cortex of hearing subjects. Thus cross-modal plasticity induced by sensory deprivation and cross-modal integration acquired through learning may share neural substrates. To clarify the distribution of these neural substrates and their dynamics during cross-modal association learning over several hours, we conducted audio-visual paired-association learning with delayed-matching-to-sample tasks (Tanabe et al. 2005). Each trial consisted of the successive presentation of a pair of stimuli. Subjects had to find pre-defined audio-visual or visuo-visual pairs by trial and error, with feedback on each trial.
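The trial structure of the paired-association task can be sketched in code. The following is a minimal simulation, not the actual experimental protocol: the stimulus names, number of pairs, trial count, and the all-or-none learning rule are illustrative assumptions. It shows the logic of the paradigm — an auditory sample, a delay, a visual choice, and trial-by-trial feedback that drives trial-and-error learning of the pre-defined pairs.

```python
import random

# Hypothetical stimulus set: pre-defined audio-visual pairs,
# unknown to the simulated "subject" at the start of the session.
CORRECT_PAIRS = {"tone_A": "shape_1", "tone_B": "shape_2", "tone_C": "shape_3"}
VISUAL_CHOICES = list(CORRECT_PAIRS.values())


def run_session(n_trials=60, seed=0):
    """Simulate trial-and-error paired-association learning with feedback.

    Returns a list of booleans, one per trial (True = correct choice).
    """
    rng = random.Random(seed)
    learned = {}      # sample -> choice mappings confirmed by feedback
    outcomes = []
    for _ in range(n_trials):
        sample = rng.choice(list(CORRECT_PAIRS))   # auditory sample stimulus
        # ... delay period (where the fMRI signal of interest was measured) ...
        if sample in learned:
            choice = learned[sample]               # recall the learned pair
        else:
            choice = rng.choice(VISUAL_CHOICES)    # guess among visual stimuli
        correct = (choice == CORRECT_PAIRS[sample])
        if correct:                                # feedback after each trial
            learned[sample] = choice               # keep a confirmed mapping
        outcomes.append(correct)
    return outcomes


acc = run_session()
print(f"early accuracy: {sum(acc[:20]) / 20:.2f}, "
      f"late accuracy: {sum(acc[-20:]) / 20:.2f}")
```

Under this simple all-or-none rule, accuracy rises within a session as each pair is confirmed by feedback, loosely mirroring the behavioral learning curve over which the delay-period signal changes were tracked.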
During the delay period, the MRI signal in unimodal and polymodal areas increased as cross-modal association learning proceeded, suggesting that cross-modal associations are formed by binding unimodal sensory areas via polymodal regions. Together, these studies show that sensory deprivation and both long- and short-term learning dynamically modify the brain organization underlying multisensory integration.
