Open Access
An Immersive Virtual Environment for Congruent Audio-Visual Spatialized Data Sonifications
Author(s) - Samuel Chabot, Jonas Braasch
Publication year - 2017
Language(s) - English
Resource type - Conference proceedings
DOI - 10.21785/icad2017.072
Subject(s) - spatialization, sonification, computer science, loudspeaker, virtual reality, human–computer interaction, microphone, computer graphics (images), ambisonics, visualization, process (computing), multimedia, artificial intelligence, acoustics, physics, sociology, anthropology, operating system
a parameter-mapping technique. Parameter mapping is a popular framework for data sonification because it exploits sound's multidimensionality to convey changes and trends in data. Qualities of the produced sound, such as pitch and timbre, rhythm and tempo, and loudness, among others, are correlated with characteristics of the data [3]. Another function of a sonification system is the ability to spatialize the produced sound. By doing so, the system alters the perceived locations of auditory streams and thereby draws on the human ear's capacity to attend to multiple audio cues simultaneously. Strategic positioning can strongly influence how well information is conveyed and how immersed the listener is in the data. A number of techniques exist for creating spatialized sound, each with its own benefits and limitations.
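
As a rough illustration of parameter mapping (not the authors' implementation), the Python sketch below maps a one-dimensional data series to the pitch and loudness of short sine tones and uses constant-power stereo amplitude panning as a minimal stand-in for the loudspeaker- or ambisonics-based spatialization discussed above. All function names, mapping ranges, and the synthetic azimuth values are assumptions chosen for the example.

# Minimal parameter-mapping sonification sketch (illustrative only).
# Each data value sets the pitch and loudness of a short sine tone;
# a per-point azimuth in [-1, +1] drives constant-power stereo panning.
import numpy as np
import wave

SAMPLE_RATE = 44100
TONE_DUR = 0.25  # seconds of audio per data point

def map_value(value, lo, hi, out_lo, out_hi):
    # Linearly map a value from [lo, hi] to [out_lo, out_hi].
    return out_lo + (value - lo) / (hi - lo) * (out_hi - out_lo)

def sonify(data, azimuths):
    # Return a stereo float array built from one tone per data point.
    t = np.linspace(0.0, TONE_DUR, int(SAMPLE_RATE * TONE_DUR), endpoint=False)
    lo, hi = float(np.min(data)), float(np.max(data))
    chunks = []
    for value, az in zip(data, azimuths):
        freq = map_value(value, lo, hi, 220.0, 880.0)  # pitch: A3..A5 (assumed range)
        gain = map_value(value, lo, hi, 0.2, 0.9)      # loudness (assumed range)
        tone = gain * np.sin(2.0 * np.pi * freq * t)
        # Constant-power panning: az = -1 is fully left, +1 is fully right.
        theta = (az + 1.0) * np.pi / 4.0
        left, right = np.cos(theta) * tone, np.sin(theta) * tone
        chunks.append(np.stack([left, right], axis=1))
    return np.concatenate(chunks, axis=0)

def write_wav(path, stereo):
    # Write the stereo signal as an interleaved 16-bit WAV file.
    pcm = (np.clip(stereo, -1.0, 1.0) * 32767).astype(np.int16)
    with wave.open(path, "wb") as wf:
        wf.setnchannels(2)
        wf.setsampwidth(2)
        wf.setframerate(SAMPLE_RATE)
        wf.writeframes(pcm.tobytes())

if __name__ == "__main__":
    series = np.array([3.0, 5.0, 4.0, 8.0, 6.0, 9.0, 2.0])  # example data
    pans = np.linspace(-1.0, 1.0, len(series))              # sweep left to right
    write_wav("sonification.wav", sonify(series, pans))

Stereo panning is used here only because it needs no special hardware; a system like the one described in the paper would instead render each stream to a loudspeaker array or an ambisonic decoder to place it at an arbitrary position around the listener.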
