Applications of Machine Learning in Computer-Aided Composition

Internship proposal — 2018

IRCAM Science and Technology of Music and Sound laboratory (STMS)

Supervisors:
- Jean Bresson (Music Representations team)
- Diemo Schwarz (Sound Music Movement Interaction / ISMM team)

We offer a Master's (or equivalent) student internship exploring the relations between computer-aided music composition and machine learning techniques.

While machine learning (ML) applications in music technology are increasingly developed in the fields of music information retrieval [1], automatic generation/improvisation systems [2][3], operational research in sound databases [4], and real-time gesture following and interaction [5], ML has so far seen little concrete application in computer-aided composition systems.

Computer-aided composition environments such as OpenMusic [6] allow composers to generate musical structures using varied programming and computation techniques and formalisms [7]. OpenMusic is a visual domain-specific language (DSL) dedicated to music, developed at IRCAM and used by a large community of composers [8].

We propose to study and implement compositional scenarios in which computer-aided composition systems are combined with supervised learning tools and technologies. Using hybrid techniques involving, for instance, hidden Markov models and dynamic time warping (DTW), supervised ML algorithms can analyse data flows and/or time series in order to classify them according to varied criteria, even with small training databases [9]. These techniques will be used to explore, process and, hopefully, better understand the concept of musical "gesture" (temporal descriptors, melodies, graphical input, etc.) in the compositional domain [10].
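To make this kind of classification concrete, here is a minimal sketch of DTW-based nearest-template classification of time series, written in plain Python. It is purely illustrative: the function names and toy "gesture" data are our own assumptions and are not taken from any of the tools cited in this proposal.

    def dtw_distance(a, b):
        """Dynamic time warping (DTW) distance between two 1-D sequences.
        DTW finds an optimal monotonic alignment between sequences of
        possibly different lengths and sums local distances along it."""
        n, m = len(a), len(b)
        INF = float("inf")
        # cost[i][j] = minimal cumulative cost of aligning a[:i] with b[:j]
        cost = [[INF] * (m + 1) for _ in range(n + 1)]
        cost[0][0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                d = abs(a[i - 1] - b[j - 1])              # local distance
                cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                     cost[i][j - 1],      # deletion
                                     cost[i - 1][j - 1])  # match
        return cost[n][m]

    def classify(query, templates):
        """Return the label of the template closest to the query under DTW."""
        return min(templates, key=lambda t: dtw_distance(query, t[1]))[0]

    # Toy example: two labelled "gesture" templates and a distorted query.
    templates = [("rising",  [0.0, 0.2, 0.5, 0.8, 1.0]),
                 ("falling", [1.0, 0.7, 0.4, 0.2, 0.0])]
    query = [0.1, 0.1, 0.4, 0.9]       # a time-stretched rising contour
    print(classify(query, templates))  # -> "rising"

In the internship itself such algorithms would not be reimplemented from scratch, but taken from the existing real-time ML technologies cited below [11].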

Technical aspects of the work will essentially consist of implementing bindings between existing technologies for real-time ML [11] and computer-aided composition [12]. The work will also be associated with, and applied to, an ongoing compositional project in the context of IRCAM's musical research residencies.

Duration: 4 to 5 months.

Address: IRCAM — 1, place Igor Stravinsky, 75004 Paris

Contact/Application: Email Jean Bresson, stating your motivation, skills and background (+ CV).

Candidates should currently be enrolled in a Master's (or equivalent) degree programme in computer science or another discipline related to music technology.



References

  • [1] Illescas, P. R., Rizo, D., Iñesta, J. M.: Learning to Analyse Tonal Music. Proceedings of the International Workshop on Machine Learning and Music, Helsinki, 2008.
  • [2] Briot, J.-P., Hadjeres, G., Pachet, F.: Deep Learning Techniques for Music Generation. Computational Synthesis and Creative Systems series, Springer, 2018.
  • [3] Assayag, G., Dubnov, S., Delerue, O.: Guessing the Composer’s Mind: Applying Universal Prediction to Musical Style. Proceedings of the International Computer Music Conference, Beijing, 1999.
  • [4] Esling, P., Agon, C.: Time-series Data Mining. ACM Computing Surveys, 45 (1), 2012.
  • [5] Françoise, J., Schnell, N., Bevilacqua, F.: A Multimodal Probabilistic Model for Gesture-based Control of Sound Synthesis. ACM MultiMedia, Barcelona, 2013.
  • [6] Bresson, J., Agon, C., Assayag, G.: OpenMusic. Visual Programming Environment for Music Composition, Analysis and Research. ACM MultiMedia (OpenSource Software Competition), Scottsdale, USA, 2011.
  • [7] Assayag, G.: Computer Assisted Composition Today. 1st Symposium on Music and Computers: Applications on Contemporary Music Creation, Esthetic and Technical aspects. Corfu, Greece, 1998.
  • [8] Bresson, J., Agon, C., Assayag, G. (Eds.) The OM Composer's Book (3 volumes). Editions Delatour / Ircam Centre Pompidou, 2006, 2008, 2016.
  • [9] Bevilacqua, F., Schnell, N., Rasamimanana, N., Zamborlin, B., Guédy, F.: Online Gesture Analysis and Control of Audio Processing, in Musical Robots and Interactive Multimodal Systems: Springer Tracts in Advanced Robotics Vol 74, 2011.
  • [10] Schumacher, M., Wanderley, M.: Integrating Gesture Data in Computer-aided Composition: A Framework for Representation, Processing and Mapping. Journal of New Music Research, 46(1), 2017.
  • [11] Schnell, N., Schwarz, D., Larralde, J., Borghesi, R.: PiPo, a Plugin Interface for Afferent Data Stream Processing Operators. Proceedings of the International Society for Music Information Retrieval Conference (ISMIR), Suzhou, 2017.
    — See also: https://github.com/Ircam-RnD/pipo-sdk
  • [12] Bresson, J., Bouche, D., Carpentier, T., Schwarz, D., Garcia, J.: Next-generation Computer-aided Composition Environment: A New Implementation of OpenMusic. International Computer Music Conference, Shanghai, 2017.
    — See also: https://openmusic-project.github.io/
 

