Open Access
The Use of Non-Speech Sounds in Non-Visual Interfaces to the MS-Windows GUI for Blind Computer Users
Author(s) - Helen Petrie, Sarah Morley
Publication year - 1998
Publication title - Electronic Workshops in Computing
Language(s) - English
Resource type - Conference proceedings
ISSN - 1477-9358
DOI - 10.14236/ewic/ad1998.22
Subject(s) - computer science, human–computer interaction, perception, screen reader, user interface, graphical user interface, speech recognition, visually impaired, psychology
Two studies investigated the use of non-speech sounds (auditory icons and earcons) in non-visual interfaces to MS-Windows for blind computer users. In the first study, the sounds were presented in isolation, and blind and sighted participants rated them for recognisability and for the appropriateness of the mapping between each sound and the interface object or event it represented. The sounds were revised in light of these ratings and incorporated into the interfaces. The second study investigated the effects of the sounds on user performance and perceptions: ten blind participants evaluated the interfaces, and task completion time was significantly shorter when the sounds were included, although interesting effects on user perceptions were also found.
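The abstract distinguishes auditory icons (everyday sounds mapped to interface events) from earcons (abstract, structured tone motifs). As a rough illustration of the latter, and not the stimuli actually used in the studies, the following sketch synthesises a hypothetical three-note rising earcon as a WAV file using only the Python standard library; the frequencies, durations, and the "window opened" mapping are illustrative assumptions.

```python
import math
import struct
import wave

SAMPLE_RATE = 22050  # samples per second for the synthesised audio

def tone(freq_hz, dur_s, amp=0.4):
    """Generate one sine-wave note as a list of float samples in [-1, 1]."""
    n = int(SAMPLE_RATE * dur_s)
    return [amp * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE)
            for t in range(n)]

def earcon(freqs, note_dur=0.15, gap_dur=0.05):
    """Concatenate short notes separated by silence into one abstract motif."""
    samples = []
    for f in freqs:
        samples += tone(f, note_dur)
        samples += [0.0] * int(SAMPLE_RATE * gap_dur)  # silent gap between notes
    return samples

def write_wav(path, samples):
    """Write the samples as a mono 16-bit PCM WAV file."""
    with wave.open(path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(SAMPLE_RATE)
        w.writeframes(b"".join(struct.pack("<h", int(s * 32767))
                               for s in samples))

# A rising three-note motif, hypothetically signalling "window opened"
open_earcon = earcon([440.0, 554.0, 659.0])
write_wav("open_window_earcon.wav", open_earcon)
```

A falling variant of the same motif could signal the complementary "window closed" event; systematically varying pitch, rhythm, and timbre in this way is what gives earcons their family structure, in contrast to the one-off real-world recordings used for auditory icons.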
