Open Access
An Investigation Into Customisable Automatically Generated Auditory Route Overviews for Pre-navigation
Author(s) - Nida Aziz, Tony Stockman, Rebecca Stewart
Publication year - 2019
Language(s) - English
Resource type - Conference proceedings
DOI - 10.21785/icad2019.029
Subject(s) - usability, computer science, human–computer interaction, participatory design, cognition, auditory display, auditory feedback, multimedia, psychology
When travelling to new places, maps are often used to determine the specifics of the route we want to follow. This helps us prepare for the journey by forming a cognitive model of the route in our minds. However, this process is predominantly visual and is therefore inaccessible to people who are blind or visually impaired (BVI), or whose eyes are otherwise engaged in another activity. This work explores effective methods of generating route overviews using audio, which can create a similar cognitive model of a route. These overviews can help users plan their journey according to their preferences and prepare for it in advance.
