Merging symbolic, physical and virtual spaces: Augmented reality for Iannis Xenakis’ Evryali for piano solo
Abstract
This paper presents interactive systems for the visualization and optimization of extreme score-based piano performance. The systems are founded on an ecological theory of embodied interaction with complex piano notation, termed embodied navigation. This theory has materialized in GesTCom, a modular, sensor-based environment for the analysis, processing and real-time control of notation through multimodal recordings. The motion-capture modeling is based on Gesture Follower, a one-shot-learning Hidden Markov Model developed at IRCAM. At a later stage, mixed-reality applications have been developed on the basis of existing visualization methodologies for motion capture, seeking to create a virtual concert environment.
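To make the one-shot-learning idea concrete: the core of a Gesture-Follower-style system is a left-to-right Hidden Markov Model whose states are the frames of a single recorded template gesture, decoded online with the forward algorithm to estimate where in the template a live performance currently is. The sketch below is not the IRCAM implementation; it is a minimal illustration of that principle, with all function names, parameters (`sigma`, transition probabilities) and the 1-D sensor signal being illustrative assumptions.

```python
import math

def follow(template, live, sigma=0.1, p_stay=0.3, p_next=0.6, p_skip=0.1):
    """Minimal one-shot gesture follower (illustrative sketch).

    One HMM state per template frame; left-to-right transitions
    (stay / advance one / skip one); Gaussian observation likelihood
    around each template frame. Returns, for each live frame, the
    most likely time index in the template (forward-algorithm decode).
    """
    n = len(template)
    alpha = [1.0] + [0.0] * (n - 1)  # start at the beginning of the gesture
    positions = []
    for obs in live:
        # propagate belief through left-to-right transitions
        new = [0.0] * n
        for i in range(n):
            if alpha[i] == 0.0:
                continue
            new[i] += alpha[i] * p_stay
            if i + 1 < n:
                new[i + 1] += alpha[i] * p_next
            if i + 2 < n:
                new[i + 2] += alpha[i] * p_skip
        # weight by Gaussian likelihood of the observation under each state
        b = [math.exp(-((obs - t) ** 2) / (2 * sigma ** 2)) for t in template]
        alpha = [a * bi for a, bi in zip(new, b)]
        s = sum(alpha) or 1.0
        alpha = [a / s for a in alpha]  # normalize to avoid underflow
        positions.append(max(range(n), key=lambda i: alpha[i]))
    return positions
```

Feeding the template back in as the live signal should yield a monotonically advancing position estimate from the first to the last frame, which is the behavior a score-following application relies on.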