Crossmodal interactions in perception and learning
Author(s) - Shams, Ladan
Publication year - 2009
Publication title - The FASEB Journal
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.709
H-Index - 277
eISSN - 1530-6860
pISSN - 0892-6638
DOI - 10.1096/fasebj.23.1_supplement.185.3
Subject(s) - crossmodal , perception , psychology , cognitive psychology , perceptual learning , stimulus modality , visual perception , multisensory integration , associative learning , sensory system , visual cortex , neuroimaging , modalities , neuroscience , sensory processing
Humans are generally considered visual animals. Visual perception, however, can be strongly altered by other modalities. We have found that sound can radically change visual perception, and our neuroimaging studies show that this alteration can occur at short latencies and as early as primary visual cortex. These findings, together with a wealth of other recent findings, establish that crossmodal interactions are ubiquitous in human perception and can occur at various levels of processing, including very early stages of sensory processing. We therefore asked whether crossmodal interactions play a role in perceptual learning. Comparing visual-only training with auditory-visual training on a low-level visual task, we discovered that auditory-visual training accelerates and enhances performance on the visual task even in the absence of sound. Thus, crossmodal interactions appear to play an important role in perceptual learning; but how are the crossmodal associations themselves acquired? The brain areas mediating multisensory learning, the characteristics of crossmodal associative learning, and its relationship with unisensory associative learning will be discussed.