Visualising Sound: Localisation, Feature Analysis and Visualisation
Author(s) -
Jack Armitage,
Kyle Molleson,
Michael Battcock,
Chris Earnshaw,
D. Moore,
Kia Ng
Publication year - 2012
Publication title -
Electronic Workshops in Computing
Language(s) - English
Resource type - Conference proceedings
ISSN - 1477-9358
DOI - 10.14236/ewic/eva2012.22
Subject(s) - computer science , human–computer interaction , visualization , sound design , multimedia , artificial intelligence , acoustics , physics
Sound is an integral medium of communication in society. Despite its influence on everyday interaction, the fundamental features of sound and their impact are not widely understood and are often overlooked. Advanced dissection and analysis of sound is commonly used to help technology interpret its environment (e.g. in robotics and telecommunications). The aim of this project is to use sound information technology to enhance our understanding of sound and of how we process it. Visualisation can provide a more accessible representation of complex sound analysis, so it is investigated here in the context of musical performance and interactive installation. This cross-modal experience draws on the phenomenon of synaesthesia, using creative mapping to express its subjectivity through a user-unique experience. In addition to results and user evaluations, the paper concludes with plans for future development, focusing on the impact of the interactive installation on the user and on ways in which the technology developed can be used for creative interaction with sound.
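To make the kind of sound-to-visual mapping described above concrete, the following is a minimal illustrative sketch (not the authors' implementation): it extracts one common audio feature, the spectral centroid (the amplitude-weighted mean frequency, often perceived as "brightness"), and maps it to a colour. The naive DFT, the 4 kHz mapping ceiling, and the blue-to-red scale are all assumptions chosen for illustration.

```python
# Illustrative sketch of sound feature analysis -> visualisation.
# NOT the paper's method: the DFT, feature, and colour mapping are
# assumptions chosen to demonstrate the general idea.
import math

def spectral_centroid(signal, sample_rate):
    """Amplitude-weighted mean frequency (Hz) of `signal`, via a naive DFT."""
    n = len(signal)
    mags = []
    for k in range(n // 2):  # magnitudes of the positive-frequency bins
        re = sum(signal[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(signal[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    total = sum(mags)
    if total == 0.0:
        return 0.0
    # Bin k corresponds to frequency k * sample_rate / n.
    return sum(k * m for k, m in enumerate(mags)) * sample_rate / n / total

def centroid_to_rgb(centroid_hz, max_hz=4000.0):
    """Map a centroid in [0, max_hz] to a blue (low) -> red (high) colour."""
    x = max(0.0, min(1.0, centroid_hz / max_hz))
    return (int(255 * x), 0, int(255 * (1 - x)))

# Example: a pure 440 Hz tone (440 falls exactly on DFT bin 11 here,
# so there is no spectral leakage and the centroid sits at 440 Hz).
sr, n = 8000, 200
tone = [math.sin(2 * math.pi * 440 * t / sr) for t in range(n)]
c = spectral_centroid(tone, sr)
print(round(c), centroid_to_rgb(c))
```

In a real-time installation the same idea would be applied per analysis frame, with the resulting colour driving the display; a library FFT would replace the naive DFT for speed.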
Accelerating Research
John Eccles House, Robert Robinson Avenue,
Oxford Science Park, Oxford
OX4 4GP, United Kingdom