PEPS I3A: Machine Learning and Computer-Aided Composition


Projet Exploratoire Premier Soutien (PEPS) 2018: Artificial Intelligence and Machine Learning
Processus d'Apprentissage en Composition assistée par Ordinateur (Learning Processes in Computer-Aided Composition)

As soon as the first computers appeared, contemporary music creation began to exploit the newly available possibilities for computation and representation in order to expand the range of compositional and sonic options, enriching both the expressive power of musicians and the musical experience of listeners. The beginnings of what would later be called computer music were inspired by artificial intelligence: the very idea of programming machines capable of composing, rivalling the creativity of their creators. The same idea reappears in a number of recent projects, both ambitious and high-profile. However, the notion of learning has rarely been exploited by composers as an aid to creation. In the field of computer-assisted composition, wariness of a certain dispossession of the creative act has instead led researchers and composers to turn to constructivist approaches and other areas of computer science, such as end-user programming (i.e., giving the end-users of a system the ability to program the system themselves) and visual programming languages.

The aim of this exploratory project is to study potential applications of machine learning techniques in computer-assisted music composition. In contrast to the more widespread approach of designing more or less self-contained creative systems, we are interested here in the potential contribution of AI and machine learning as assistants to composition (or to music analysis): in tasks such as the classification and processing of "musical gestures" (temporal descriptors, melodies, graphic inputs), in aiding decisions or the exploration of solution spaces generated by search algorithms, or in generating musical structures and parameters from sample databases. The envisaged applications relate to a range of stages and activities in a compositional workflow (rhythmic analysis and quantification, composition by recomposition/concatenation of motifs, etc.), for which machine learning can offer new ways of controlling, generating and understanding musical structures.
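As a concrete, if simplified, illustration of the first of these tasks, the sketch below classifies a gesture represented by a vector of descriptors using a k-nearest-neighbours rule. It is written in plain Common Lisp, the host language of the OpenMusic environment used in this project; all function names and the toy data are hypothetical and belong to none of the tools cited on this page.

  ;; Sketch: k-nearest-neighbour classification of gestures described by
  ;; feature vectors (plain Common Lisp; hypothetical names and data).

  (defun distance (a b)
    "Euclidean distance between two equal-length descriptor lists."
    (sqrt (reduce #'+ (mapcar (lambda (x y) (expt (- x y) 2)) a b))))

  (defun majority (labels)
    "Most frequent element of LABELS."
    (let ((best nil) (best-count 0))
      (dolist (l (remove-duplicates labels) best)
        (let ((c (count l labels)))
          (when (> c best-count) (setf best l best-count c))))))

  (defun knn-classify (sample training k)
    "Label SAMPLE by majority vote among its K nearest neighbours.
  TRAINING is a list of (descriptors . label) pairs."
    (let ((sorted (sort (copy-list training) #'<
                        :key (lambda (ex) (distance sample (car ex))))))
      (majority (mapcar #'cdr (subseq sorted 0 (min k (length sorted)))))))

  ;; Hypothetical training set: 2-D descriptors (e.g. mean loudness and
  ;; spectral centroid) labelled with a gesture class.
  (defparameter *gestures*
    '(((0.2 0.1) . hit) ((0.25 0.15) . hit)
      ((0.8 0.9) . sweep) ((0.75 0.85) . sweep)))

  ;; (knn-classify '(0.22 0.12) *gestures* 3) => HIT

In a compositional setting, the descriptor vectors could come from audio analysis or from sampled curves of a graphic input, and the predicted label could drive the choice of a processing or generation strategy.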

STMS Laboratory: IRCAM/CNRS/Sorbonne Université (Paris, France)
Coordinator: Jean Bresson
Participants: Diemo Schwarz (Sound-Music-Movement Interaction team), Nicolas Obin (Sound Analysis and Synthesis team), Jérôme Nika (Music Representations team), Paul Best (Master's internship, RepMus / ISMM teams), Alireza Farhang (IRCAM musical research residency), Anders Vinjar (composer, Oslo), Marlon Schumacher (Institut für Musikwissenschaft und Musikinformatik, Hochschule für Musik Karlsruhe).

Project report (2018): paco_rapport_2018.pdf

Links and events

• Demo SMC 2019 – Sound and Music Computing conference
May 29-31, 2019, Malaga, Spain.
A. Vinjar, J. Bresson: OM-AI: A Toolkit to Support AI-Based Computer-Assisted Composition Workflows in OpenMusic

• CBMI 2018 – International Conference on Content-Based Multimedia Indexing
Sept. 4-6, 2018, La Rochelle, France.
P. Best, J. Bresson, D. Schwarz: Musical Gesture Recognition Using Machine Learning and Audio Descriptors.

• Workshop @SMC'18: Music Composition and Creative Interaction with Machine Learning
15th Sound and Music Computing conference, July 4-7, 2018, Limassol, Cyprus.

• MUME 2018 – 6th International Workshop on Musical Metacreation
International Conference on Computational Creativity – ICCC'18, June 25-26, 2018, Salamanca, Spain.
J. Bresson, P. Best, D. Schwarz, A. Farhang: From Motion to Musical Gesture: Experiments with Machine Learning in Computer-Aided Composition.

• Traces de l'expressivité : partition de flux de données gestuelles pour les œuvres interdisciplinaires (Traces of expressivity: scoring streams of gesture data for interdisciplinary works)
Alireza Farhang: IRCAM musical research residency

• Applications of Machine Learning in Computer-Aided Composition
Paul Best: Master's internship – supervision Jean Bresson and Diemo Schwarz
With support from IRCAM "Unités Projet Innovation" (UPI) program.

• Guided improvisation and computer-aided composition
Victoire Siguret (ENS Lyon): Undergraduate internship (L3) – supervision Jean Bresson and Jérôme Nika

Software

• OM-XMM: connection between the OM computer-aided composition environment and the XMM library for motion learning and recognition.
See on GitHub

• OM-DYCI2: connection between the OM computer-aided composition environment and the DYCI2 library for guided improvisation.
See on GitHub

• OM-AI: musical applications of machine learning algorithms in Common Lisp/OM (see the sketch after this list).
See on GitHub
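To give a similarly concrete (and similarly simplified) idea of the class of algorithms packaged in OM-AI, here is a small k-means clustering routine in plain Common Lisp, of the sort that could group melodic segments or gesture samples by their descriptor vectors. It is an illustrative sketch under the assumption of list-based feature vectors; it does not reproduce the actual OM-AI API.

  ;; Sketch: k-means clustering of descriptor vectors (plain Common Lisp;
  ;; illustrative only, not the OM-AI API).

  (defun sq-dist (a b)
    "Squared Euclidean distance between two equal-length lists."
    (reduce #'+ (mapcar (lambda (x y) (expt (- x y) 2)) a b)))

  (defun centroid (points)
    "Component-wise mean of a non-empty list of equal-length vectors."
    (let ((n (length points)))
      (apply #'mapcar (lambda (&rest xs) (/ (reduce #'+ xs) n)) points)))

  (defun nearest (point centroids)
    "Index of the centroid closest to POINT."
    (let ((best 0) (best-d (sq-dist point (first centroids))))
      (loop for c in (rest centroids) for i from 1
            for d = (sq-dist point c)
            when (< d best-d) do (setf best i best-d d))
      best))

  (defun assign (points centroids k)
    "Group POINTS by their nearest centroid; returns a vector of K lists."
    (let ((groups (make-array k :initial-element nil)))
      (dolist (p points groups)
        (push p (aref groups (nearest p centroids))))))

  (defun k-means (points k &optional (iterations 10))
    "Cluster POINTS into K groups, seeding centroids with the first K points."
    (let ((centroids (subseq points 0 k)))
      (loop repeat iterations
            for groups = (assign points centroids k)
            do (setf centroids
                     (loop for i below k
                           collect (if (aref groups i)
                                       (centroid (aref groups i))
                                       (nth i centroids)))))
      (coerce (assign points centroids k) 'list)))

For example, (k-means '((0 0) (0 1) (10 10) (10 11)) 2) separates the two low vectors from the two high ones. Seeding with the first K points keeps the sketch short; practical implementations typically use random or k-means++ initialization.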

 

