Control of spatialization and spatial sound synthesis

RepMus team research on compositional control of sound spatialization was initiated in 2001 [1][2] and is today related to computer-aided composition research on sound synthesis. Several complementary directions are being pursued, most of them integrated in the OpenMusic environment. This work was partly carried out as a collaboration between the Music Representations team (Jean Bresson) and CIRMMT (Marlon Schumacher / IDMIL, McGill University).

Generation and representation of spatial parameters

The 3DC editor in OpenMusic

New objects were developed in OpenMusic to represent 2D or 3D positions and trajectories (as well as more advanced spatial parameters such as orientation or directivity). These objects (BPC, 3DC, 3D-trajectory…) can be created and manipulated through graphical editors, but also with geometric operators and functions providing potentially unlimited complexity in spatial control and spatial sound scene definition. More importantly, they can be connected to the compositional processes developed in the computer-aided composition environment.
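
The idea of building trajectories algorithmically and transforming them with geometric operators can be sketched in a few lines of Python (the actual OpenMusic objects are Common Lisp classes; Point3D and rotate_x here are illustrative stand-ins):

  import math
  from dataclasses import dataclass

  @dataclass
  class Point3D:
      x: float
      y: float
      z: float

  def rotate_x(points, angle_deg):
      """Rotate a list of 3D points around the x axis (one typical geometric operator)."""
      a = math.radians(angle_deg)
      return [Point3D(p.x,
                      p.y * math.cos(a) - p.z * math.sin(a),
                      p.y * math.sin(a) + p.z * math.cos(a))
              for p in points]

  # A circular trajectory built algorithmically rather than drawn by hand,
  # then tilted out of the horizontal plane.
  circle = [Point3D(math.cos(2 * math.pi * i / 16), math.sin(2 * math.pi * i / 16), 0.0)
            for i in range(16)]
  tilted = rotate_x(circle, 45.0)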

Spatial scene representation, storage and transfer

OM-Spat: representation / storage / rendering of spatial sound scenes in OpenMusic.

The representation of a spatial sound scene gathers the description of the sound sources along with their trajectories and other spatial properties (orientation, directivity…), as well as the spatial characteristics of the environment (reverberation and other "room parameters").

The OM-Spat library represents such sound scenes as matrix structures in which a set of spatial sources can be described and structured together [3][4].
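
A minimal sketch of such a matrix-like scene, where each column describes one source and each row one spatial parameter (field names and values here are illustrative, not OM-Spat's own):

  # Two sources described column-wise across parameter rows.
  scene = {
      "onset":       [0.0, 2.5],                       # start time of each source (s)
      "duration":    [10.0, 7.5],                      # duration of each source (s)
      "trajectory":  [[(0, 0, 0), (1, 0, 1)],          # sampled (x, y, z) breakpoints
                      [(0, 1, 0), (0, 2, 1)]],
      "orientation": [0.0, 90.0],                      # yaw of each source (degrees)
  }
  room = {"reverberance": 1.8}                         # global environment parameters
  num_sources = len(scene["onset"])                    # here: 2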

Spatial scenes are interpreted (including sampling of the trajectories; see the sketch after the list below) and exported as sequences of data frames in SDIF files. The SDIF-formatted data can then be read and rendered by different compatible tools:

  • Ircam Spat renderer – a standalone executable tool distributed along with the library package.
  • Spat-SDIF-Player – a standalone Max application for reading the spatialization data and streaming it as OSC/SpatDIF.
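
The trajectory-sampling step mentioned above can be sketched in plain Python, producing generic time-stamped frames rather than actual SDIF structures (the SDIF frame and matrix types are handled by the library):

  def sample_trajectory(points, total_dur, rate=10.0):
      """Resample a breakpoint trajectory into time-stamped frames at a fixed
      rate (Hz), with linear interpolation between breakpoints."""
      frames = []
      for i in range(int(total_dur * rate) + 1):
          t = i / rate
          pos = t / total_dur * (len(points) - 1)       # fractional breakpoint index
          j = min(int(pos), len(points) - 2)
          f = pos - j
          (x0, y0, z0), (x1, y1, z1) = points[j], points[j + 1]
          frames.append({"time": t, "position": (x0 + f * (x1 - x0),
                                                 y0 + f * (y1 - y0),
                                                 z0 + f * (z1 - z0))})
      return frames

  frames = sample_trajectory([(0, 0, 0), (1, 0, 0), (1, 1, 0)], total_dur=10.0)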

Spat-SDIF-Player

Spat-SDIF-Player bridges offline compositional tools (where the advanced/formalized musical processes are developed) and real-time environments (where sound rendering occurs during concerts and performances).

It is based on the MuBu multi-track container and provides simple playback and cueing functions (play / pause / loop / stop, speed control…).

The messages comply with the SpatDIF specification. They can be received in Max, for instance, using the udpreceive object set to the corresponding incoming port.
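
Outside Max, the streaming side can be imitated in a few lines; here a hypothetical sender using the python-osc package (the address scheme follows SpatDIF's OSC-style conventions; port and values are arbitrary examples):

  from time import sleep
  from pythonosc.udp_client import SimpleUDPClient

  client = SimpleUDPClient("127.0.0.1", 3333)            # example output port

  # Stream a short position sequence for source 1 at 10 frames per second.
  positions = [(0.0, 1.0, 0.0), (0.5, 0.87, 0.0), (0.87, 0.5, 0.0), (1.0, 0.0, 0.0)]
  for x, y, z in positions:
      client.send_message("/spatdif/source/1/position", [x, y, z])
      sleep(0.1)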

Note: in recent versions of Spat 4, the spatdif-to-spat object converts incoming SpatDIF messages into Spat-formatted messages for spat.oper, spat.viewer or spat.spat.

Spat-SDIF-Player: reading / visualization / streaming of SDIF-formatted spatialization data


SpatDIF-Viewer

SpatDIF-Viewer is a rendering tool that displays trajectories from incoming OSC/SpatDIF messages.

If its port is set to the Spat-SDIF-Player output port, and if the initialization data have been correctly sent (or re-sent), it makes it possible to view and navigate the scene in a 3D viewer.
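
The input stage of such a viewer can be sketched with python-osc (address pattern and port are assumptions; the real application also interprets the initialization messages):

  from pythonosc.dispatcher import Dispatcher
  from pythonosc.osc_server import BlockingOSCUDPServer

  trajectories = {}   # OSC address -> list of received (x, y, z) points

  def on_position(address, *args):
      """Accumulate position messages such as /spatdif/source/1/position."""
      trajectories.setdefault(address, []).append(tuple(args))

  dispatcher = Dispatcher()
  dispatcher.map("/spatdif/source/*/position", on_position)

  # Bind to the Spat-SDIF-Player output port (3333 here is an example value).
  server = BlockingOSCUDPServer(("127.0.0.1", 3333), dispatcher)
  server.serve_forever()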

Notes:

  • Only one application can listen on a given incoming OSC port at a time. SpatDIF-Viewer therefore cannot receive the same incoming messages simultaneously with the Max patch above.
  • SpatDIF-Viewer only displays position data and ignores all other SpatDIF message streams (except the initialization messages).

[download SpatDIF-Viewer]

OMPrisma: Integration of spatialization in the OMChroma framework

Spatial sound synthesis with OMChroma/OMPrisma.

Initially dedicated to the control of sound synthesis, the OMChroma framework has been extended to support sound spatialization instruments as synthesis object classes. Spatialization parameters can therefore be set by processes developed in OpenMusic and related both to other sound synthesis parameters and to the symbolic data and processes of a compositional framework. The OMPrisma library (by Marlon Schumacher) provides a number of predefined spatialization classes (multichannel, VBAP, ambisonics, etc.) as well as a set of functions for the generation and processing of spatial trajectories and parameters [5][6].

An original system in OMChroma (the "class merger", embedded in the chroma-prisma function) makes it possible to freely combine OMChroma sound synthesis classes (instruments) with the different spatial rendering classes provided in the OMPrisma library. The underlying algorithms (Csound instrument code) are parsed and merged in order to dynamically define new hybrid classes. Sounds can therefore be synthesized with complex spatial morphologies, precisely controlled independently for each of their inner components (spatial sound synthesis) – see [6].
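
The principle can be caricatured in a few lines of Python (a naive string splice, assuming both code fragments agree on the audio variable name; the actual merger parses the Csound code properly):

  def merge_instruments(synth_body, spat_body):
      """Naive sketch of the class-merging idea: splice a synthesis instrument
      body in front of a spatializer body, so that the synthesis output signal
      (here the Csound variable 'asig') feeds the spatializer input."""
      return "\n".join(["instr 1",
                        synth_body.strip(),   # e.g. a sine oscillator
                        spat_body.strip(),    # e.g. a 4-speaker VBAP panner
                        "outq a1, a2, a3, a4",
                        "endin"])

  hybrid = merge_instruments("asig oscili p4, p5, 1",
                             "a1, a2, a3, a4 vbap4 asig, p6")
  print(hybrid)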

A standalone multitrack player (MultiPlayer) has also been developed; it reads and decodes spatialized sound files in various formats and for variable output / speaker set-ups.
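
For instance, decoding first-order ambisonic (B-format) signals to an arbitrary speaker layout can be done with a basic projection decode, shown here as a sketch (a generic textbook technique, not necessarily MultiPlayer's own algorithm):

  import math

  def decode_bformat(w, x, y, z, speakers):
      """One output sample per speaker: a gain-weighted sum of the B-format
      signals, computed from each speaker's (azimuth, elevation) in degrees."""
      outs = []
      for az_deg, el_deg in speakers:
          az, el = math.radians(az_deg), math.radians(el_deg)
          outs.append(math.sqrt(0.5) * w                 # conventional W weighting
                      + math.cos(az) * math.cos(el) * x
                      + math.sin(az) * math.cos(el) * y
                      + math.sin(el) * z)
      return outs

  quad = [(45, 0), (135, 0), (-135, 0), (-45, 0)]        # example quad layout
  print(decode_bformat(0.7, 0.5, 0.5, 0.0, quad))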

References / Publications

 

