
Multisensory integration of vision and touch in nonspatial feature discrimination tasks
Author(s) -
Wada, Yuichi
Publication year - 2010
Publication title -
Japanese Psychological Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.392
H-Index - 30
eISSN - 1468-5884
pISSN - 0021-5368
DOI - 10.1111/j.1468-5884.2009.00418.x
Subject(s) - coactivation , stimulus (psychology) , psychology , multisensory integration , tactile stimuli , visual perception , cognitive psychology , communication , perception , neuroscience , sensory system , electromyography
Multisensory integration of nonspatial features between vision and touch was investigated by examining the effects of redundant visual and tactile signals. In the present experiments, participants were presented with visual and/or tactile letter stimuli and asked to identify them as quickly as possible. The results of Experiment 1 demonstrated faster reaction times for bimodal stimuli than for unimodal stimuli (the redundant signals effect, RSE). The RSE was attributable to coactivation of figural representations from the visual and tactile modalities. This coactivation did not occur for a simple stimulus-detection task (Experiment 2) or for bimodal stimuli carrying the same semantic information but differing in physical stimulus features (Experiment 3). The findings suggest that the integration process might occur at a relatively early stage of object identification, prior to the decision level.
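
For context, coactivation accounts of the RSE are conventionally contrasted with independent race models, which can be rejected when bimodal response times violate Miller's race model inequality. The formulation below is the standard inequality from the broader redundant-signals literature, given here as a sketch of how coactivation is typically diagnosed rather than as the specific analysis reported in this paper:

\[
P\!\left(RT_{VT} \le t\right) \;\le\; P\!\left(RT_{V} \le t\right) + P\!\left(RT_{T} \le t\right) \quad \text{for all } t,
\]

where \(RT_{VT}\), \(RT_{V}\), and \(RT_{T}\) denote response times in the bimodal (visual-tactile), visual-only, and tactile-only conditions, respectively. Systematic violations of this bound at fast response times are taken as evidence that the two modalities jointly activate a common response process rather than racing independently.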