Rendering embodied experience into multimodal data: concepts, tools and applications for Xenakis' piano performance
Conference paper, 2022


Abstract

The core of the workshop will be the presentation of GesTCom (an acronym for Gesture Cutting Through Textual Complexity), a system developed since 2014 in collaboration with IRCAM (Sound Music Movement Interaction team). GesTCom is a sensor-based environment for visualizing, analyzing and following the pianist's gestures in relation to the notation. It comprises four modules, implemented as Max/MSP patches featuring the MuBu toolbox and connected to INScore scripts: a) a module for the synchronized recording of multimodal performance data; b) a module for the reproduction and analysis of the data; c) a module for processing the notation on the basis of the data; d) a module for real-time gestural interaction with the notation.
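The link between modules (a) and (b) — recording streams from several modalities against one clock so they can later be replayed in order — can be illustrated with a toy sketch. This is plain Python for illustration only, not the actual Max/MSP/MuBu implementation; all class and method names here are hypothetical:

```python
from bisect import insort
from dataclasses import dataclass, field
from typing import Any

@dataclass(order=True)
class Event:
    """One timestamped observation from a single modality."""
    t: float                              # seconds on the shared clock
    modality: str = field(compare=False)  # e.g. "motion", "audio", "score"
    payload: Any = field(compare=False)

class MultimodalRecorder:
    """Toy recorder: keeps events from all streams sorted on one timeline."""
    def __init__(self) -> None:
        self.events: list[Event] = []

    def record(self, t: float, modality: str, payload: Any) -> None:
        # insort keeps the list ordered even when streams arrive out of step
        insort(self.events, Event(t, modality, payload))

    def replay(self) -> list[Event]:
        # A playback/analysis module can walk this list in temporal order
        return list(self.events)

rec = MultimodalRecorder()
rec.record(0.50, "audio", {"rms": 0.2})
rec.record(0.25, "motion", {"x": 1.0})   # arrives late, still sorts first
rec.record(0.75, "score", {"bar": 3})
timeline = [e.modality for e in rec.replay()]
```

The design point is only that all modalities share a single timestamp axis, which is what makes synchronized reproduction and cross-modal analysis possible downstream.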
Main file: 2022_Xenakis22_workshop.pdf (1.5 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04420668, version 1 (29-01-2024)

Licence

Attribution - NonCommercial - NoDerivatives (CC BY-NC-ND)

Identifiers

  • HAL Id: hal-04420668, version 1

Cite

Pavlos Antoniadis, Jean-François Jego, Aurélien Duval, Frédéric Bevilacqua, Stella Paschalidou. Rendering embodied experience into multimodal data: concepts, tools and applications for Xenakis' piano performance. Xenakis 22: Centenary International Symposium, May 2022, Athens and Nafplio, Greece. ⟨hal-04420668⟩