Concurrent Processes in Computer Music

Friday, December 5, 2008

Ircam, Salle I. Stravinsky
1, place I. Stravinsky 75004 Paris
Free admission, subject to seating availability.

Presentations will be given in English.

Program and abstracts (PDF)

Program

  • 14h30 - 15h00 Camilo Rueda - Temporal aspects of a Chu space semantics of CCP
  • 15h00 - 15h30 Frank Valencia - Specification and Verification in NTCC
  • 15h30 - 16h00 Mauricio Toro - NTCCRT: A Concurrent Constraint Framework for Signal Processing Languages
  • 16h15 - 16h45 Carlos Olarte - Universal Timed CCP: Applications to Musical Improvisation
  • 16h45 - 17h15 Arshia Cont - Collaborative and Competitive Musical Agents via Interactive Learning
  • Discussion (with the audience and with researchers of the AVISPA research group in Colombia)

Abstracts

Camilo Rueda (Pontificia Universidad Javeriana, Colombia)

Temporal aspects of a Chu space semantics of CCP

Concurrent Constraint Process Calculi (CCP) are formal languages for modeling concurrent systems and verifying their properties. If they are to be used to model musical systems, they will most likely have to deal with temporal issues. Time can be taken into account either explicitly, by means of temporal constructs in the calculus, or implicitly, as an emergent intrinsic property of any computation in a calculus model. In this talk we explore the second option via the Chu space semantics of the "eventual tell" and "atomic tell" variants of CCP.

Frank Valencia (LIX, Laboratoire d’Informatique de l’Ecole Polytechnique)

Specification and Verification in NTCC

NTCC can be used to describe temporal systems in various areas such as security, biology, and, in particular, computer music. In this talk I will describe a few results involving the specification and verification of temporal properties of NTCC. I will also outline future directions on this subject, involving the development of software tools for the automatic verification and simulation of systems specified in NTCC.

Carlos Olarte (LIX, Laboratoire d’Informatique de l’Ecole Polytechnique)

Universal Timed CCP: Applications to Musical Improvisation

Universal timed concurrent constraint programming (utcc) is an extension of temporal CCP (tcc) aimed at modeling mobile reactive systems. The language comes with several reasoning techniques, such as a symbolic semantics and a compositional semantics based on closure operators. Additionally, utcc processes can be regarded as formulae in first-order linear temporal logic (FLTL). In this talk we show how the abstraction operator "(ABS x;c)P" introduced in utcc allows us to neatly model an efficient graph structure for learning strings, called the factor oracle (FO). The graph is implicitly constructed via constraints from a rather simple constraint system. We then model a music improvisation system composed of interactive agents that incrementally extend the graph.
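
For reference, the sketch below shows the standard incremental construction of the factor oracle (after Allauzen, Crochemore and Raffinot), which the utcc model described above encodes declaratively through constraints rather than imperatively. It is a minimal Python illustration under that assumption; the names FactorOracle and add_symbol are ours and are not part of utcc or of the talk.

  class FactorOracle:
      def __init__(self):
          self.transitions = [{}]   # transitions[i][symbol] -> target state
          self.suffix_link = [-1]   # suffix link of the initial state is undefined

      def add_symbol(self, symbol):
          """Extend the oracle with one symbol, creating exactly one new state."""
          i = len(self.transitions) - 1        # last state of the current oracle
          self.transitions.append({})
          self.suffix_link.append(None)
          self.transitions[i][symbol] = i + 1  # forward transition to the new state

          # Walk the suffix-link chain, adding transitions to the new state
          # until we reach a state that already reads `symbol` (or fall off the chain).
          k = self.suffix_link[i]
          while k > -1 and symbol not in self.transitions[k]:
              self.transitions[k][symbol] = i + 1
              k = self.suffix_link[k]

          # Suffix link of the new state: the initial state if we fell off the chain,
          # otherwise the target of the existing transition on `symbol`.
          self.suffix_link[i + 1] = 0 if k == -1 else self.transitions[k][symbol]

  # An improvising agent could extend the oracle note by note:
  oracle = FactorOracle()
  for note in "abbbaab":
      oracle.add_symbol(note)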

Mauricio Toro (Pontificia Universidad Javeriana, Colombia / Ircam)

NTCCRT: A Concurrent Constraint Framework for Signal Processing Languages

We describe the design and implementation of NTCCRT, a new interpreter for the NTCC formalism capable of real-time interaction, and present some applications in the multimedia interaction domain. An important feature of NTCCRT is the generation of externals (i.e., binary plugins) for Max/MSP and Pure Data. Using binary plugins generated by NTCCRT, we can control concurrency in those programming languages. The novelty of this approach is that synchronization is written at the user level, as opposed to the Max/MSP and Pure Data threading APIs, where C++ programming is required to write binary plugins for synchronization.

Arshia Cont (Ircam)

Collaborative and Competitive Musical Agents via Interactive Learning

In this talk, we introduce an adaptive and interactive learning system within an environment that comprises collaborative and competitive agents, and that has no internal, a priori, or formal knowledge about the structure of its environment. The automatic learning paradigm is a derivative of reinforcement learning techniques, based on active learning and resampling methods. We showcase the algorithm within the paradigm of automatic improvisation and style imitation of a piece of music on symbolic data, as well as an additional architecture for real-time detection and synchronization of audio to a symbolic score. Each system is accompanied by concrete musical examples and applications. By the end of this talk, we hope to persuade the listener that concurrent architectures reduce representational complexity, reduce learning complexity, increase the speed of convergence, and allow complex problems in computer music to be addressed with relatively simple designs.

AVISPA Research Group

 

