Differences

This page shows the differences between the selected revision and the current version of the page.
paco:home-en [2018/07/10 14:10]
Jean Bresson [Links and events]
paco:home-en [2022/06/17 09:15] (current version)
Jean Bresson [PEPS I3A: Machine Learning and Computer-Aided Composition]
// Processus d'Apprentissage en Composition assistée par Ordinateur //
  
Contemporary music creation has long utilized computation and digital representations to expand the range of compositional and sonic possibilities, enriching both the power of expression for musicians and the musical experience of listeners. The beginnings of what would later be called "computer music" were inspired by artificial intelligence: the very idea of programming machines capable of composing, challenging the creativity of their creators. Similar ideas reappear in recent ambitious and high-profile projects around music and AI. However, the notion of "machine learning" has rarely been exploited by composers as an aid to creation. In the field of computer-assisted composition, mistrust or fear of a certain dispossession of the creative act has instead led researchers and composers to turn to constructivist approaches and other aspects of information technology, such as end-user programming (i.e. giving the end user of a system the ability to program the system themselves) and visual programming languages.
  
The aim of this exploratory project is to study the potential applications of machine learning techniques in computer-assisted music composition. In contrast to the more widespread approach of designing more or less self-contained creative systems, we are interested here in the potential contribution of AI and machine learning as an assistance to composition (or music analysis): accomplishing tasks such as the classification and processing of "musical gestures" (temporal descriptors, melodies, graphic inputs), aiding decisions or the exploration of solution spaces produced by operational search algorithms, or generating musical structures and parameters from sample databases. The applications envisaged relate to a range of stages and activities in a compositional workflow: rhythmic analysis and quantification, composition by recomposition/concatenation of motifs, etc., for which machine learning will enable new avenues for controlling, generating and understanding musical structures.
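As a purely illustrative sketch (not one of the project's tools; the data, labels and functions below are invented for the example), classifying a "musical gesture" such as a melodic contour can be as simple as a 1-nearest-neighbour rule over interval profiles:

```python
import math

# Toy training set: melodies as MIDI pitch sequences, with invented labels.
TRAINING = [
    ([60, 62, 64, 65, 67], "ascending"),
    ([67, 65, 64, 62, 60], "descending"),
    ([60, 64, 60, 64, 60], "oscillating"),
]

def contour(melody):
    """Describe a melody by its successive pitch intervals."""
    return [b - a for a, b in zip(melody, melody[1:])]

def distance(c1, c2):
    """Euclidean distance between two interval profiles."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(c1, c2)))

def classify(melody):
    """1-nearest-neighbour classification over the training contours."""
    c = contour(melody)
    _, label = min((distance(c, contour(m)), lab) for m, lab in TRAINING)
    return label

print(classify([55, 57, 59, 60, 62]))  # prints "ascending"
```

Real gesture data (temporal descriptors, graphic inputs) would of course call for richer features and models; the point is only the general pattern of labeled examples driving a classification used inside a compositional process.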
  
  
**Participants:**
[[http://imtr.ircam.fr/imtr/Diemo_Schwarz|Diemo Schwarz]] (Sound-Music-Movement Interaction team), 
[[http://recherche.ircam.fr/anasyn/obin/|Nicolas Obin]] (Sound Analysis and Synthesis team), 
[[http://repmus.ircam.fr/nika|Jérôme Nika]] (Music Representations team),
Paul Best (Master's internship, RepMus / ISMM teams), 
[[http://www.alirezafarhang.com/|Alireza Farhang]] (IRCAM musical research residency), 
[[https://avinjar.no/|Anders Vinjar]] (composer, Oslo), 
[[http://www.music.mcgill.ca/marlonschumacher/|Marlon Schumacher]] (Institut für Musikwissenschaft und Musikinformatik, Hochschule für Musik Karlsruhe).
  
{{:paco:paco_rapport_2018.pdf|}}
  
===== Links and events =====

**• Demo SMC 2019 -- [[http://smc2019.uma.es/|Sound and Music Computing conference]]**\\
May 29-31 2019, Malaga, Spain.\\
A. Vinjar, J. Bresson: //[[https://hal.archives-ouvertes.fr/hal-02126847|OM-AI: A Toolkit to Support AI-Based Computer-Assisted Composition Workflows in OpenMusic]]//

**• CBMI 2018 -- [[http://cbmi2018.univ-lr.fr/|International Conference on Content-Based Multimedia Indexing]]**\\
Sept. 4-6, 2018, La Rochelle, France.\\
P. Best, J. Bresson, D. Schwarz: //[[https://hal.archives-ouvertes.fr/hal-01839050|Musical Gesture Recognition Using Machine Learning and Audio Descriptors]]//.

**• Workshop @SMC'18: [[.:workshop-smc|Music Composition and Creative Interaction with Machine Learning]]**\\
[[http://smc2018.cut.ac.cy/|15th Sound and Music Computing conference]], July 4-7 2018, Limassol, Cyprus.
  
**• MUME 2018 -- [[http://musicalmetacreation.org/workshops/mume-2018/|6th International Workshop on Musical Metacreation]]**\\
|From Motion to Musical Gesture: Experiments with Machine Learning in Computer-Aided Composition.]]//
  
**• Traces de l'expressivité: high-level score of data streams for interdisciplinary works**\\
Alireza Farhang: [[https://www.alirezafarhang.com/post/traces-of-expressivity-data-stream|IRCAM musical research residency]]
  
 **• Applications of Machine Learning in Computer-Aided Composition**\\ **• Applications of Machine Learning in Computer-Aided Composition**\\
Paul Best: Master's internship – supervision Jean Bresson and Diemo Schwarz\\
 With support from IRCAM "Unités Projet Innovation" (UPI) program. With support from IRCAM "Unités Projet Innovation" (UPI) program.
  
**• Guided improvisation and computer-aided composition**\\
Victoire Siguret (ENS Lyon): Undergraduate internship (L3) – supervision Jean Bresson and Jérôme Nika\\

===== Software =====

**• OM-XMM:** connection between the [[https://openmusic-project.github.io/|OM]] computer-aided composition environment and the [[http://ircam-rnd.github.io/xmm/|XMM]] library for motion learning and recognition.\\
 => [[https://github.com/openmusic-project/om-xmm|See on GitHub]]  => [[https://github.com/openmusic-project/om-xmm|See on GitHub]] 
  
**• OM-DYCI2:** connection between the [[https://openmusic-project.github.io/|OM]] computer-aided composition environment and the [[https://github.com/DYCI2|DYCI2]] library for guided improvisation.\\
=> [[https://github.com/DYCI2/om-dyci2|See on GitHub]]

**• OM-AI:** Musical applications of ML algorithms in Common Lisp/OM.\\
=> [[https://github.com/openmusic-project/omai|See on GitHub]]
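For illustration only, here is a minimal Python analogue of the kind of clustering utility such toolkits can expose for grouping musical material (OM-AI itself is written in Common Lisp; the data and function below are hypothetical, not its actual API):

```python
import random

def kmeans(points, k, iterations=20, seed=0):
    """Plain k-means over 2-D points; returns the final centroids."""
    rng = random.Random(seed)
    centroids = rng.sample(points, k)
    for _ in range(iterations):
        # Assignment step: group each point with its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda c: (p[0] - centroids[c][0]) ** 2
                                + (p[1] - centroids[c][1]) ** 2)
            clusters[j].append(p)
        # Update step: move each centroid to the mean of its cluster.
        for j, cl in enumerate(clusters):
            if cl:
                centroids[j] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids

# Hypothetical features: (mean MIDI pitch, duration in seconds) per fragment.
fragments = [(60, 1.0), (61, 1.1), (62, 0.9), (80, 0.2), (81, 0.3), (79, 0.25)]
print(sorted(kmeans(fragments, 2)))  # two centroids, one per group
```

In a compositional workflow, the resulting clusters could label motif families to be recombined or concatenated, in the spirit of the applications sketched above.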
  
  
 


paco/home-en.txt · Last modified: 2022/06/17 09:15 by Jean Bresson