An integrated approach to motion and sound
Author(s) - James K. Hahn, Joe Geigel, Jong Won Lee, Larry Gritz, Tapio Takala, Suneil Mishra
Publication year - 1995
Publication title - The Journal of Visualization and Computer Animation
Language(s) - English
Resource type - Journals
eISSN - 1099-1778
pISSN - 1049-8907
DOI - 10.1002/vis.4340060205
Subject(s) - computer science , timbre , animation , computer animation , motion (physics) , sound (geography) , computer graphics , human–computer interaction , computer graphics (images) , speech recognition , artificial intelligence , acoustics , musical , art , physics , visual arts
Until recently, sound has received little attention in computer graphics and the related domains of computer animation and virtual environments, even though sounds properly synchronized to motion provide a great deal of information about events in the environment. Sounds often fall out of synchronization because the sounds and the phenomena that cause them are not considered in an integrated way. In this paper, we present an integrated approach to motion and sound as it applies to computer animation and virtual environments. The key to this approach is synchronization by mapping motion parameters to sound parameters, so that the sound changes as a result of changes in the motion. This is done by representing sounds with a functional-composition technique analogous to ‘shade trees’, which we call timbre trees. These timbre trees form part of a sound description language analogous to scene description languages such as RenderMan. Using this methodology, we have produced convincing sound effects for a wide variety of animated scenes, including the automatic generation of background music.
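The abstract's core idea — a tree of composable functions whose leaves can read motion parameters, so the rendered sound tracks the animation — can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the paper's actual system: the node classes, the `velocity` parameter, and the `render` helper are all assumptions chosen for illustration.

```python
import math

# Hypothetical sketch of a "timbre tree": sound is built by functionally
# composing nodes (like shade trees), and motion parameters are supplied
# at evaluation time so the sound changes with the motion.

class Const:
    """Leaf holding a fixed value."""
    def __init__(self, value):
        self.value = value
    def eval(self, t, motion):
        return self.value

class MotionParam:
    """Leaf that reads a named motion parameter (e.g. 'velocity')."""
    def __init__(self, name):
        self.name = name
    def eval(self, t, motion):
        return motion[self.name]

class Add:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def eval(self, t, motion):
        return self.a.eval(t, motion) + self.b.eval(t, motion)

class Mul:
    def __init__(self, a, b):
        self.a, self.b = a, b
    def eval(self, t, motion):
        return self.a.eval(t, motion) * self.b.eval(t, motion)

class Sine:
    """Oscillator whose frequency is itself a subtree."""
    def __init__(self, freq):
        self.freq = freq
    def eval(self, t, motion):
        return math.sin(2 * math.pi * self.freq.eval(t, motion) * t)

# Example tree: pitch rises with velocity (220 Hz base + 40 Hz per unit
# of velocity), and amplitude also scales with velocity.
tree = Mul(MotionParam("velocity"),
           Sine(Add(Const(220.0), Mul(Const(40.0), MotionParam("velocity")))))

def render(tree, motion_curve, sample_rate=8000, duration=0.01):
    """Sample the tree over time, feeding in motion state at each instant."""
    samples = []
    for i in range(int(sample_rate * duration)):
        t = i / sample_rate
        samples.append(tree.eval(t, {"velocity": motion_curve(t)}))
    return samples

# An accelerating object: velocity grows linearly over the clip.
samples = render(tree, motion_curve=lambda t: 1.0 + 50.0 * t)
```

Because the motion parameter is a leaf of the tree rather than a post-hoc envelope, any change to the animation automatically propagates into the sound, which is the synchronization property the abstract describes.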