Open Access
Dynamics of Multi-Sensory Tracking
Author(s) -
Johahn Leung,
Vincent Wei,
Simon Carlile
Publication year - 2011
Publication title -
i-perception
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.64
H-Index - 26
ISSN - 2041-6695
DOI - 10.1068/ic905
Subject(s) - stimulus (psychology), stimulus modality, sensory system, computer science, auditory stimuli, psychology, audiology, computer vision, perception, cognitive psychology, neuroscience, medicine
These experiments examined the ability to track a moving target with the head under various stimulus conditions and modalities. While previous studies [1,2] have concentrated on eye tracking within the frontal region, we extended the modes of tracking to include auditory and auditory-visual stimuli over a significantly wider region of space. Using a newly developed system that combines high-fidelity virtual auditory space with a high-speed LED strip, we examined head-tracking behaviour in a 100° arc around the subject at velocities from ±20°/s to ±110°/s. This allowed us to derive behavioural norms for head tracking, compare differences in tracking ability across modalities, and determine whether cross-modal facilitation occurs. Preliminary results show that subjects tracked visual and bimodal targets better than auditory ones, as evidenced by smaller average RMS errors (visual = 4°, bimodal = 4.6°, auditory = 7°) and shorter lags (visual = 5.5°, bimodal = 5.9°, auditory = 8.9°). Furthermore, tracking ability was influenced by stimulus speed, particularly in the unimodal auditory condition, where a significant increase in both RMS error and lag was observed for speeds >80°/s.