Open Access
Cross Modal Object-Based Attentional Guidance
Author(s) -
E. Bilger,
Sarah Shomstein
Publication year - 2011
Publication title -
Journal of Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.126
H-Index - 113
ISSN - 1534-7362
DOI - 10.1167/11.11.135
Subject(s) - modality (human–computer interaction) , perception , psychology , cognitive psychology , computer vision , artificial intelligence , neuroscience
Conclusions
• Object-based attention is utilized cross-modally.
• Auditory cues elicit object-based effects for visual targets; visual cues elicit object-based effects for auditory targets.
• Attentional selection combines information from the auditory and visual modalities to create a complete, multisensory representation of the external world.
• Object-based attention, even cross-modally, acts as a default setting that can be overridden when an alternate, more effective strategy is available.

Methods: adapted a standard object-based attention paradigm (Egly, Driver, & Rafal, 1994).
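The Egly, Driver, & Rafal (1994) paradigm infers object-based attention from reaction times (RTs) at three target locations following a cue at one end of one of two rectangles: the cued location (valid), the uncued end of the cued rectangle (invalid, same object), and an equidistant location on the other rectangle (invalid, different object). A minimal sketch of that RT comparison, with entirely hypothetical RT values and function names (not the authors' code or data):

```python
from statistics import mean

def cueing_effects(rts_by_condition):
    """Compute space- and object-based cueing effects from RTs (ms) per condition."""
    valid = mean(rts_by_condition["valid"])
    same = mean(rts_by_condition["invalid_same_object"])
    diff = mean(rts_by_condition["invalid_different_object"])
    return {
        # Cost of shifting attention away from the cued location within one object.
        "space_based_effect": same - valid,
        # Extra cost of shifting between objects, with cue-target distance held
        # constant; a positive value is the signature of object-based selection.
        "object_based_effect": diff - same,
    }

# Hypothetical example RTs (ms) for illustration only:
rts = {
    "valid": [310, 305, 320],
    "invalid_same_object": [340, 345, 335],
    "invalid_different_object": [365, 360, 370],
}
effects = cueing_effects(rts)
print(effects)
```

In the cross-modal version summarized above, the cue and target would come from different modalities (e.g., an auditory cue and a visual target), with the same same-object versus different-object RT contrast.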
