Open Access
Virtual-Audio Aided Visual Search on a Desktop Display
Author(s) -
Clayton D. Rothwell,
Griffin D. Romigh,
Brian D. Simpson
Publication year - 2016
Language(s) - English
Resource type - Conference proceedings
DOI - 10.21785/icad2016.034
Subject(s) - computer science , visual search , set (abstract data type) , sensory cue , display size , virtual reality , audio visual , computer vision , artificial intelligence , salient , speech recognition , display device , multimedia , programming language , operating system
As visual display complexity grows, visual cues and alerts may become less salient and therefore less effective. Although the auditory system’s spatial resolution is coarse relative to the visual system’s, there is evidence that virtual spatialized audio can benefit visual search even within a small frontal region, such as a desktop monitor. Two experiments examined whether search times could be reduced, relative to visual-only search, by spatial auditory cues rendered with one of two methods: individualized or generic head-related transfer functions. Results showed that cue type interacted with display complexity, with larger reductions relative to visual-only search as set size increased. For larger set sizes, individualized cues were significantly better than generic cues overall. Across all set sizes, individualized cues outperformed generic cues when cueing eccentric elevations (>±8°). Where performance must be maximized, designers should use individualized virtual audio whenever possible, even within a small frontal region of the field of view.
