The RepMus team's research on compositional control of sound spatialization was initiated in 2001. It is today part of a more general project on the control of sound synthesis in computer-aided composition. Several complementary directions are open, listed in the following sections, most of them integrated in the OpenMusic computer-aided composition environment. This project is currently carried out as a collaboration between the Music Representations team (J. Bresson) and CIRMMT (Marlon Schumacher / IDMIL, McGill University).
New objects in OpenMusic are used to represent 2D or 3D positions and trajectories (as well as more advanced spatial parameters such as orientation or directivity). These objects (BPC, 3DC, or 3D-trajectory…) can be created and manipulated using graphical editors, but also with geometric operators and functions, providing potentially unlimited complexity in spatial control and spatial sound scene definition.
More importantly, they can be connected to, and determined by, compositional processes developed in the computer-aided composition environment.
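The idea of treating a trajectory as data that geometric operators can transform can be illustrated with a minimal sketch, written in Python rather than OpenMusic's Lisp, with hypothetical names (`Point3D`, `rotate_z`) that do not correspond to actual OpenMusic functions:

```python
import math
from dataclasses import dataclass

@dataclass
class Point3D:
    """A single breakpoint of a 3D trajectory (hypothetical stand-in for a 3DC point)."""
    x: float
    y: float
    z: float

def rotate_z(points, angle_deg):
    """Geometric operator: return a copy of the trajectory rotated around the z axis."""
    a = math.radians(angle_deg)
    cos_a, sin_a = math.cos(a), math.sin(a)
    return [Point3D(p.x * cos_a - p.y * sin_a,
                    p.x * sin_a + p.y * cos_a,
                    p.z)
            for p in points]

# A simple trajectory: a quarter arc in front of the listener.
trajectory = [Point3D(math.cos(t * math.pi / 2), math.sin(t * math.pi / 2), 0.0)
              for t in (0.0, 0.5, 1.0)]
rotated = rotate_z(trajectory, 90.0)
```

Because the trajectory is ordinary data, such operators can be chained and driven by other compositional parameters, which is the point of exposing them in a programming environment rather than only in a graphical editor.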
The representation of spatial sound scenes gathers the description of sound sources along with their trajectories and other spatial properties (orientation, directivity…), as well as spatial characteristics of the environment (reverberation and other "room parameters").
The OM-Spat library makes it possible to represent such sound scenes as matrix structures, in which a set of spatial sources can be described and structured together.
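As a rough illustration of this matrix-like organization (a hypothetical sketch, not the actual OM-Spat data model; all field names are invented), a scene can be thought of as one row per source, each row holding the source's sound, trajectory, and spatial attributes, plus global room parameters:

```python
# Hypothetical spatial sound scene: one entry per source, each with a
# time-stamped trajectory and spatial attributes, plus room parameters.
scene = {
    "sources": [
        {"id": 1, "sound": "voice.aiff",
         "trajectory": [(0.0, (1.0, 0.0, 0.0)), (5.0, (0.0, 1.0, 0.0))],
         "orientation": 0.0, "aperture": 90.0},
        {"id": 2, "sound": "drone.aiff",
         "trajectory": [(0.0, (-1.0, 0.0, 0.0))],
         "orientation": 180.0, "aperture": 60.0},
    ],
    "room": {"reverb_time": 1.8, "size": "medium"},
}
```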
Spatial scenes are interpreted (including sampling of the trajectories) and exported as sequences of data frames in SDIF files. The SDIF-formatted data can then be read and rendered by different compatible tools:
- The Ircam Spat renderer (a standalone executable tool distributed in the Ircam Forum Research software suite), by Thibaut Carpentier (Ircam/CNRS, Espaces Acoustiques et Cognitifs).
- Spat-SDIF-Player: reading and real-time OSC/SpatDIF streaming of the spatialization data.
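The trajectory-sampling step mentioned above (turning breakpoint trajectories into sequences of time-stamped frames) can be sketched as follows. This is a simplified Python analogue, not OM-Spat's actual implementation, and it only handles position; it assumes at least two `(time, (x, y, z))` breakpoints and linear interpolation:

```python
def sample_trajectory(breakpoints, rate):
    """Linearly interpolate (time, (x, y, z)) breakpoints at a fixed
    frame rate, yielding one time-stamped frame per sample."""
    t0, _ = breakpoints[0]
    t_end, _ = breakpoints[-1]
    frames, t, i = [], t0, 0
    while t <= t_end:
        # Advance to the breakpoint segment containing time t.
        while i < len(breakpoints) - 2 and breakpoints[i + 1][0] <= t:
            i += 1
        (ta, pa), (tb, pb) = breakpoints[i], breakpoints[i + 1]
        u = 0.0 if tb == ta else (t - ta) / (tb - ta)
        frames.append((t, tuple(a + u * (b - a) for a, b in zip(pa, pb))))
        t += 1.0 / rate
    return frames

# Move from (0,0,0) to (1,0,0) over one second, sampled at 2 frames/s.
frames = sample_trajectory([(0.0, (0.0, 0.0, 0.0)),
                            (1.0, (1.0, 0.0, 0.0))], 2.0)
```

Each resulting `(time, position)` pair corresponds conceptually to one data frame in the exported SDIF stream.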
Initially dedicated to the control of sound synthesis, the OMChroma framework has been extended to support sound spatialization instruments as synthesis object classes. Spatialization parameters can therefore be set by processes developed in OpenMusic, and related to other sound synthesis parameters and to the symbolic data and processes of a compositional framework. The OMPrisma library (by Marlon Schumacher) provides a number of predefined spatialization classes (multichannel, VBAP, Ambisonics, etc.) as well as a set of functions for the generation and processing of spatial trajectories and parameters.
An original system in OMChroma (the "class merger", embedded in the chroma-prisma function) makes it possible to freely combine OMChroma sound synthesis classes (instruments) with the different spatial rendering classes provided in the OMPrisma library. The underlying algorithms (Csound instrument code) are parsed and merged in order to dynamically define new hybrid classes. Sounds can therefore be synthesized with complex spatial morphologies, precisely controlled independently for each of their inner components (spatial sound synthesis).
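The class-merging idea can be conveyed with a toy analogue. The sketch below is Python, not the actual Csound-code parsing performed by chroma-prisma; all function names are invented. It shows the principle: composing a synthesis "class" with a spatial rendering "class" yields a hybrid in which every synthesized component carries its own spatial position:

```python
import math

def sine_component(freq, amp, dur, sr=8000):
    """Toy synthesis 'class': one sinusoidal component as a sample list."""
    return [amp * math.sin(2 * math.pi * freq * n / sr)
            for n in range(int(dur * sr))]

def pan_stereo(mono, azimuth_deg):
    """Toy spatial rendering 'class': equal-power stereo panning."""
    a = math.radians((azimuth_deg + 90.0) / 2.0)  # map -90..90 deg to 0..90
    left, right = math.cos(a), math.sin(a)
    return [(s * left, s * right) for s in mono]

def merge(synth, spat):
    """Toy 'class merger': compose a synthesis and a spatial renderer
    into a hybrid that spatializes each component independently."""
    def hybrid(freq, amp, dur, azimuth):
        return spat(synth(freq, amp, dur), azimuth)
    return hybrid

spatial_sine = merge(sine_component, pan_stereo)
# Each component of an additive spectrum gets its own position:
layers = [spatial_sine(f, 0.2, 0.01, az)
          for f, az in [(440.0, -45.0), (660.0, 0.0), (880.0, 45.0)]]
```

In OMChroma/OMPrisma the merging happens at the level of Csound instrument code rather than function composition, but the result is analogous: a new hybrid class whose spatial parameters are controllable per component.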
A standalone multitrack player (MultiPlayer) has also been developed; it reads and decodes spatialized sound files in various formats, according to variable output/speaker setups.
[See Marlon Schumacher's OMPrisma pages for more information about OMPrisma and the MultiPlayer, including pictures and sound examples.]
-  G. Nouno, C. Agon (2002) Contrôle de la spatialisation comme paramètre musical, Actes des Journées d'Informatique Musicale, Marseille, France.
-  G. Nouno (2008) Some considerations on Brian Ferneyhough’s musical language through his use of CAC Part II – Spatialization as a musical parameter, in Bresson et al., The OM Composer’s Book. 2, Editions Delatour – IRCAM.
-  J. Bresson, C. Agon, M. Schumacher (2010) Représentation des données de contrôle pour la spatialisation dans OpenMusic. Actes des Journées d'Informatique Musicale, Rennes, France.
-  J. Bresson, M. Schumacher (2011) Representation and Interchange of Sound Spatialization Data for Compositional Applications. Proc. International Computer Music Conference, Huddersfield, UK.
-  M. Schumacher, J. Bresson (2010) Compositional Control of Periphonic Sound Spatialization. 2nd International Symposium on Ambisonics and Spherical Acoustics, Paris, France.
-  M. Schumacher, J. Bresson (2010) Spatial Sound Synthesis in Computer-Aided Composition. Organised Sound, 15(3). [revised - incl. errata]
-  J. Bresson (2012) Spatial Structures Programming for Music. Spatial Computing Workshop - SCW'12 (co-located w. the 11th Int. Conf. on Autonomous Agents and MultiAgent Systems - AAMAS'2012), Valencia, Spain.