EFFICAC(e)
This project aims to develop tools that explore the relations between computation, time and interaction in computer-aided composition. It builds on OpenMusic and other technologies developed at IRCAM and at CNMAT.
We take computer-aided composition out of its traditional "offline" paradigm and integrate compositional processes into structured interactions with their external context. These interactions can take place during execution or performance, or at the early compositional stages (in the processes that lead to the creation of musical material). We focus in particular on a number of specific directions, such as:
- Reactive processes for computer-aided composition (see the sketch following this list)
- Control and management of time structures in computation and music
- Interactive control, visualization and execution of sound synthesis and spatialization processes
- Rhythm and symbolic time structures
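As a concrete (and deliberately simplified) illustration of the first direction, the sketch below models a reactive program as a small data-flow graph: when an input value changes, every box that depends on it is re-evaluated and the change propagates downstream. This is a toy Python model written for this description, not OpenMusic's actual reactive engine; the Box and Input names are our own.

```python
# Toy model of reactive evaluation in a data-flow graph (illustrative only,
# not OpenMusic's implementation): changing an input re-evaluates the boxes
# that depend on it and pushes the new values downstream.

class Input:
    """A source box whose value can be set from outside (e.g. a controller)."""
    def __init__(self, value):
        self.value = value
        self.listeners = []          # boxes that depend on this one

    def set(self, value):
        self.value = value
        for box in self.listeners:   # push the change downstream
            box.update()

class Box:
    """A node computing a function of its input boxes."""
    def __init__(self, fn, *inputs):
        self.fn = fn
        self.inputs = inputs
        self.listeners = []
        for box in inputs:
            box.listeners.append(self)
        self.update()

    def update(self):
        self.value = self.fn(*(box.value for box in self.inputs))
        for box in self.listeners:
            box.update()

# Example: a transposition process reacting to incoming pitch material.
pitches = Input([60, 64, 67])                            # MIDI note numbers
transposed = Box(lambda ps: [p + 7 for p in ps], pitches)
print(transposed.value)                                  # [67, 71, 74]
pitches.set([62, 65, 69])                                # simulate an external update
print(transposed.value)                                  # [69, 72, 76]
```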
The project addresses several long-standing tensions in computer music (signal versus symbolic approaches, offline versus real-time…). By bridging high-level computer-aided composition systems with other disciplines and frameworks such as sound processing, spatialization and gestural interaction, it brings control and interaction into abstract and expressive compositional models.
Software
- CACtus / oM#: A new-generation visual programming and computer-aided music composition framework.
- RQ: Supervised algorithms and interfaces for rhythm processing and quantification (Florent Jacquemard, Adrien Ycart, Jean Bresson); a simple quantification sketch follows this list
- Trajectoires: A mobile application for the control of sound spatialization (Jérémie Garcia, Xavier Favory, Jean Bresson); an OSC control sketch also follows this list
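As a rough illustration of the quantification problem RQ addresses, the sketch below snaps approximate onset times (expressed in beats) to the beat subdivision that minimizes the total deviation. This naive grid-snapping baseline, written in Python for this description, is far simpler than RQ's actual supervised algorithms and interfaces.

```python
# Naive rhythm quantification sketch (illustration only, not the RQ algorithm):
# snap onset times, expressed in beats, to the subdivision grid that best fits.

def quantize(onsets, subdivisions=(1, 2, 3, 4, 6, 8)):
    """Return (quantized onsets, chosen subdivision) minimizing total deviation."""
    best = None
    for div in subdivisions:
        snapped = [round(t * div) / div for t in onsets]
        error = sum(abs(s - t) for s, t in zip(snapped, onsets))
        if best is None or error < best[1]:
            best = (snapped, error, div)
    snapped, _, div = best
    return snapped, div

onsets = [0.02, 0.48, 0.77, 1.01, 1.52]      # slightly imprecise performed onsets
print(quantize(onsets))                       # ([0.0, 0.5, 0.75, 1.0, 1.5], 4)
```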
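In the same spirit, the sketch below shows how a drawn spatialization trajectory could be streamed to a rendering engine as OSC messages, which is the kind of control Trajectoires provides from a mobile interface. The host, port and OSC address used here are hypothetical placeholders that depend on the receiving application; the example relies on the python-osc package.

```python
# Hypothetical OSC streaming of a spatialization trajectory (illustration only):
# sample a circular path and send timed position messages to a renderer.
import math
import time
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 3000)          # placeholder host and port

for i in range(64):
    angle = 2 * math.pi * i / 64
    x, y, z = math.cos(angle), math.sin(angle), 0.0
    client.send_message("/source/1/xyz", [x, y, z])  # placeholder OSC address
    time.sleep(0.05)                                 # ~3 s for one full circle
```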
Musical research and applications
- Linking pen gestures to compositional processes (Jérémie Garcia, Philippe Leroux)
- Choreography and Composition of Internal Time (John MacCallum, Teoma Naccarato)
- Visualization, control and processing of sounds as 3D models (Savannah Agger, Jean Bresson, Thibaut Carpentier)
- Interactive live analysis and visualization (Jean Bresson, Moreno Andreatta)
- Computer-Aided Composition of Musical Processes (Dimitri Bouche, Alex Chechile, Jérôme Nika, Jean Bresson)
Project info
Team
Jean Bresson (principal investigator)
Jérémie Garcia (Post-doc)
Dimitri Bouche (PhD UPMC)
Thibaut Carpentier (IRCAM-CNRS, Acoustic and Cognitive Spaces)
Florent Jacquemard (IRCAM-INRIA, Musical Representations / MuTant)
Diemo Schwarz (IRCAM, Sound Music Movement Interaction)
John MacCallum (CNMAT – UC Berkeley Department of Music)
Associate composers at CNMAT / UC Berkeley: Rama Gottfried, Matt Schumaker
Associate researcher at CIRMMT / McGill University: Marlon Schumacher
Artists / IRCAM residencies: John MacCallum, Teoma Naccarato, Geof Holbrook, Savannah Agger
Development and Linux support: Anders Vinjar
Funding: ANR (Young Researchers Programme / Exploratory and Emerging Research) [*]
Project duration: Oct. 2013 - Mar. 2017