Open Access
Audiotactile multisensory interactions in human information processing
Author(s) - Norimichi Kitagawa, Charles Spence
Publication year - 2006
Publication title - Japanese Psychological Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.392
H-Index - 30
eISSN - 1468-5884
pISSN - 0021-5368
DOI - 10.1111/j.1468-5884.2006.00317.x
Subject(s) - perception , multisensory integration , stimulus modality , crossmodal , psychology , sensory system , cognitive psychology , modalities , communication , affect (linguistics) , spatial analysis , neuroscience , cognitive science , computer science , visual perception , geography , social science , remote sensing , sociology
The last few years have seen rapid growth of interest in how signals from different sensory modalities are integrated in the brain to form the unified percepts that fill our daily lives. Research on multisensory interactions between vision, touch, and proprioception has revealed the existence of multisensory spatial representations that code the location of external events relative to our own bodies. In this review, we highlight recent converging evidence from human and animal studies showing that spatially modulated multisensory interactions also occur between hearing and touch, especially in the space immediately surrounding the head. These spatial audiotactile interactions for stimuli presented close to the head can affect not only the spatial aspects of perception but also various other non-spatial aspects of audiotactile information processing. Finally, we outline some of the most important questions for future research in this area.
