If motion sounds: Movement sonification based on inertial sensor data
Author(s) -
Heike Brock,
Gerd Schmitz,
Jan Baumann,
Alfred O. Effenberg
Publication year - 2012
Publication title -
Procedia Engineering
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.32
H-Index - 74
ISSN - 1877-7058
DOI - 10.1016/j.proeng.2012.04.095
Subject(s) - sonification , movement (music) , motion (physics) , inertial measurement unit , computer science , motion sensors , inertial frame of reference , accelerometer , motion capture , acoustics , artificial intelligence , human–computer interaction , physics , quantum mechanics , operating system
In recent years, movement sonification has proven to be an effective support for motor perception and motor control, able to display physical motion in a rich and direct way. But how should movement sonification be configured to support motor learning? The appropriate selection of movement parameters and their transformation into characteristic motion features is essential for an auditory display to become effective. In this paper, we introduce a real-time sonification framework for all common MIDI environments based on acceleration and orientation data from inertial sensors. Fundamental processing steps for transforming motion information into meaningful sound are discussed. The proposed framework of inertial motion capture, kinematic parameter selection and kinematic-acoustic mapping provides a basis for mobile real-time movement sonification, a prospectively powerful training tool for rehabilitation and sports that offers a broad variety of application possibilities.
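The pipeline the abstract describes (inertial motion capture, kinematic parameter selection, kinematic-acoustic mapping into a MIDI environment) can be illustrated with a minimal sketch. The function names, the choice of acceleration magnitude as the kinematic parameter, and the linear mapping onto a MIDI note range are illustrative assumptions, not the mapping used in the paper:

```python
import math

def accel_magnitude(ax, ay, az):
    """Kinematic parameter selection: Euclidean norm of a
    3-axis accelerometer sample (hypothetical choice of feature)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def map_to_midi_pitch(value, in_min, in_max, note_min=48, note_max=84):
    """Kinematic-acoustic mapping: clamp the parameter to its expected
    range and map it linearly onto a MIDI note number (C3..C6 here)."""
    value = max(in_min, min(in_max, value))
    span = (value - in_min) / (in_max - in_min)
    return round(note_min + span * (note_max - note_min))

# Example: one inertial sample (in g) mapped to a MIDI pitch.
pitch = map_to_midi_pitch(accel_magnitude(1.2, 0.3, 0.8), in_min=0.0, in_max=4.0)
```

In a real-time setting, the resulting note numbers would be streamed to a MIDI synthesizer at the sensor's sampling rate; the expected parameter range (`in_min`/`in_max`) would be calibrated per movement task.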