Enabling the effective application of spatial auditory displays in modern flight decks
Author(s) - John Towers
Publication year - 2016
Publication title - Queensland's Institutional Digital Repository (The University of Queensland)
Language(s) - English
Resource type - Dissertations/theses
DOI - 10.14264/uql.2016.171
Subject(s) - cockpit, computer science, auditory display, human–computer interaction, workload, flight simulator, head-up display, crew, simulation, engineering, computer vision, aeronautics, operating system
Modern aircraft are fitted with sophisticated technologies that support or fully automate tasks once performed solely by the pilot. As a result, pilots now spend much of their time monitoring instruments and managing the automation rather than manually manipulating the flight controls. While modern flight decks are extremely safe, pilots do occasionally experience high visual workload conditions that may degrade their ability to effectively monitor flight instruments. This thesis describes the design and evaluation of spatial auditory displays intended to improve a pilot’s ability to monitor flight deck instruments under conditions of high visual workload. The aims broadly focus on developing design features that enable a pilot to perform head-up monitoring of an aircraft’s navigation readouts while concurrently attending to verbal dialogue. Four studies were undertaken to develop and evaluate an auditory display comprising spatially positioned sonifications encoded with information from multiple interrelated aircraft navigation displays. The auditory display also supported the spatial positioning of concurrent verbal communications that delivered navigation instructions. The studies were designed with four broad aims: (1) determine sound-localising performance for listeners using SLAB3D and its non-individual HRTF, compared with other free-field listening studies; (2) understand how supplementary auditory cues might improve localising accuracy and mitigate front–back hemisphere localising confusions; (3) develop an aircraft flight navigation auditory display that supplements existing visual readouts in order to facilitate increased head-up time and improved navigation accuracy; and (4) determine the most accommodating spatial position for verbal navigation instructions that compete with concurrent sonifications for right cerebral hemisphere processing resources.
The results support the use of concurrent spatial sonifications to convey interrelated aircraft navigation information that is normally attended to through visual displays. Building on established design guidelines, the experiments provide additional knowledge about techniques that enhance localising performance, such as the use of supplementary sound-localising cues. The auditory navigation display enabled participants to fly the aircraft more accurately and to devote more head-up time to an out-of-flight-deck visual search task. Verbal navigation instructions were found to be most effectively delivered to the left ear or along the midsagittal plane, rather than from the forward-left, forward-right, or right positions. These findings demonstrate a significant left-ear advantage in the processing of verbal navigation instructions under conditions of competing attention with sonified spatial navigation data.