Physicalizing Time Through Orientational Metaphors for Generating Rhythmic Gestures
Author(s) - Greg Corness, Kristin Carlson
Publication year - 2018
Publication title - Electronic Workshops in Computing
Language(s) - English
Resource type - Conference proceedings
ISSN - 1477-9358
DOI - 10.14236/ewic/eva2018.54
Subject(s) - gesture, rhythm, computer science, premise, performing arts, musical, human-computer interaction, multimedia, artificial intelligence, visual arts, aesthetics, linguistics, art, philosophy
Possibilities for cross-disciplinary interactive performance continue to grow as new tools are developed and adapted. Yet the qualitative aspects of cross-disciplinary interaction have not advanced at the same rate. We suggest that new models for understanding gesture in different media will support the development of nuanced interaction for interactive performance. We have explored this premise by considering models for generating musical rhythmic gestures that enable implicit interaction between the gestures of a dancer and the generated music. We create and implement a model for generating dynamic rhythmic gestures that flow in, around, or out of goal points. Goal points can be layered and quantized to a meter, providing the rhythmic structure expected in music, while the figurations enable the generated rhythms to flow with the performer, responding to the more qualitative aspects of the performance.
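One way to picture the quantization step the abstract describes is snapping freely timed goal points onto a metric grid. The sketch below is a minimal illustration, not the authors' implementation; the function name, parameters, and tempo values are all assumptions.

```python
# Hypothetical sketch of quantizing "goal points" (onset times in seconds)
# to the nearest metric subdivision, so generated rhythms gain the
# structure of a musical meter. All names/parameters are illustrative.

def quantize_goal_points(goal_times, subdivisions=4, tempo_bpm=120):
    """Snap each goal-point time to the nearest grid step.

    subdivisions: grid steps per beat (4 = sixteenth notes in 4/4)
    tempo_bpm:    beats per minute
    """
    beat_len = 60.0 / tempo_bpm        # seconds per beat
    grid = beat_len / subdivisions     # seconds per grid step
    return [round(t / grid) * grid for t in goal_times]

print(quantize_goal_points([0.1, 0.49, 1.02]))  # -> [0.125, 0.5, 1.0]
```

Layering several such quantized streams, each with its own subdivision, would give the layered goal points the abstract mentions, while the unquantized deviations could be retained to let the rhythm "flow" around the grid.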
Address - John Eccles House, Robert Robinson Avenue, Oxford Science Park, Oxford, OX4 4GP, United Kingdom