====== Workshops and Seminars ======

=== Spat Workshop @UC Berkeley ===

{{:efficace:events:spat-1.png?200 |}}

An introduction to the Ircam Spat library, a software suite dedicated to real-time sound spatialization and artificial reverberation.
For composers, computer musicians, sound engineers, sound designers, video artists and scientists with an interest in spatial sound.

**CNMAT main room**\\
**1750 Arch Street -- Berkeley, CA**\\
May 23-24-25, 2016 -- 2pm-6pm\\
\\
=> **[[:bresson:cnmat:spat-workshop-2016|More info]]**
  
=== HCI Meeting Series @UC Berkeley ===

{{:efficace:bid-cnmat.png?200 |}}
A series of meetings and activities around human-computer interaction in music,
at the **[[http://cnmat.berkeley.edu|Center for New Music and Audio Technologies (CNMAT)]]** and the **[[http://bid.berkeley.edu/|Berkeley Institute of Design (BiD)]]**,
University of California, Berkeley, May-June 2016.

=> **[[.:events:workshop-berkeley|Event page]]**

{{:efficace:ircam-logo1.png?nolink&60|}}
{{:efficace:bid-logo.gif?70|}}
{{:efficace:cnmat_logo.gif?nolink&65|}}

\\
------
\\
=== Première journée de rencontres MusICAL ===

**Interaction Calcul Algorithmique Langages appliqués à la Musique**

**Amiens, December 14, 2015**

=> [[http://repmus.ircam.fr/musical/|MusICAL (Réseau National des Systèmes Complexes)]]\\

  * Presentation: **Interactions et modèles d'analyse en CAO: Supervision d'un modèle formel pour la manipulation des structures rythmiques**\\ Jean Bresson, Florent Jacquemard, Adrien Ycart.

\\
------
\\
=== Interactivity in Music Composition and Performance ===

{{:efficace:events:a832-view.jpg?180 |}}

**<color darkred>EFFICACe international workshop</color>**

May 5th, 2015.\\
CIRMMT (Centre for Interdisciplinary Research in Music Media and Technology)\\
Schulich School of Music -- McGill University, Montréal (CA)

=> **[[.:events:workshop-cirmmt|Workshop page]]**

\\
------
\\
=== Séminaire "Nouveaux espaces de la notation musicale" ===

{{ :efficace:omnotation.png?nolink&120|}}

Working group of AFIM (Association Française d'Informatique Musicale)

**INA-GRM -- Friday 6/02/2015**

With Karim Barkati (Weezic), Dominique Fober (Grame), Robert Piéchaud, Filipe Lopes and Pierre Couprie (IReMus).

=> **Représentations interactives de données musicales dans OpenMusic** (J. Bresson).

//We will present a system introducing the notion of reactivity into OpenMusic visual programs. With this system, external events, or local edits and actions by the user, can trigger chains of reactions in the programs, leading to the update of the data structures and editors they contain. OpenMusic can thereby become an interactive environment for processing and visualizing musical data coming from real-time inputs.//

;;#
=> [[http://notation.afim-asso.org/doku.php/evenements/2015-02-06-etude-notation3|Seminar page]]\\
;;#

\\
------
\\
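The reactive evaluation model described above can be illustrated with a minimal sketch (plain Python, not OpenMusic code; all names here are hypothetical): boxes are evaluated on demand and cache their results, while boxes marked "reactive" push invalidations downstream and re-evaluate when an external event arrives.

```python
# Minimal sketch of a hybrid demand-driven / event-driven evaluation model.
# Not OpenMusic code: class and function names are illustrative assumptions.

class Box:
    def __init__(self, fn, inputs=(), reactive=False):
        self.fn = fn                  # function computed by the box
        self.inputs = list(inputs)    # upstream boxes
        self.reactive = reactive      # does this box propagate changes?
        self.listeners = []           # downstream boxes
        self.cache = None
        self.valid = False
        for src in self.inputs:
            src.listeners.append(self)

    def value(self):
        """Demand-driven (pull) evaluation with caching."""
        if not self.valid:
            self.cache = self.fn(*(src.value() for src in self.inputs))
            self.valid = True
        return self.cache

    def notify(self):
        """Event-driven (push) propagation: invalidate downstream boxes."""
        self.valid = False
        if self.reactive:
            for dst in self.listeners:
                dst.notify()
            if not self.listeners:
                # a terminal reactive box re-evaluates itself immediately
                self.value()

# A tiny patch: source -> transpose
source = Box(lambda: [60, 64, 67], reactive=True)              # a chord, as MIDI pitches
transpose = Box(lambda ch: [p + 2 for p in ch], [source], reactive=True)

print(transpose.value())   # demand-driven: [62, 66, 69]

# Simulate an external event changing the source (e.g. incoming real-time data)
source.fn = lambda: [61, 65, 68]
source.notify()            # event-driven: downstream boxes update
print(transpose.value())
```

The point of the hybrid model is visible in the last two calls: the first `value()` is an ordinary on-demand computation, while `notify()` makes the change flow through the patch without the user re-triggering the evaluation.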
=== CCRMA Guest Colloquium: Scheduling & Time Structures in Computer-Assisted Composition ===

{{ :efficace:karma_logo.png?300|}}

**Dimitri Bouche**\\

**CCRMA (Center for Computer Research in Music and Acoustics)\\
Department of Music, Stanford University.**

**Wednesday, February 4, 2015**

Music composers' work with computers is generally divided into two distinct stages: composition and performance. Each stage requires specific methodology and software: computer-aided composition involves time and high computation capabilities to produce complex musical scores, while performance and live rendering require reactive environments with precise real-time constraints. Contemporary music composers permanently challenge these established categories, using unusual objects and behaviors in their music, or considering the variations in performance as an actual part of the composition.

Dimitri's work consists in extending the properties of a computer-assisted composition environment (OpenMusic) to meet these challenges. It mostly means working on a scheduling architecture for computer-assisted composition software, but also on providing high-level tools for composers, and efficient display of potentially complex and non-deterministic data. Even if some software products embed interesting scheduling capabilities, no general model has been well established.

After a brief historical survey on the evolution of computer-assisted composition and computer music systems (and Lisp-based music systems), Dimitri will present his current work and developments.

;;#
=> [[https://ccrma.stanford.edu/events/dimitri-bouche-scheduling-time-structures-in-computer-assisted-composition|CCRMA Event info]]\\
;;#

\\
------
\\
=== Interactive Music and Notation: London'14 ===

[[http://www.afim-asso.fr|AFIM]] work-group [[http://notation.afim-asso.org|"Les nouveaux espaces de la notation musicale"]].

== Workshop @NIME'14: Interactive Music Notation and Representation ==

**Goldsmiths, University of London, 30/06/2014**

{{ :efficace:nime14.png?nolink&80|}}

The Interactive Music Notation and Representation Workshop, co-located with the [[http://www.nime2014.org/|14th International Conference on New Interfaces for Musical Expression]], gathered artists, researchers and application developers to compare views and needs in the field of music notation and representation, inspired by contemporary practices in interactive and live music, including representational forms emerging from live coding.

;;#
=> [[http://notation.afim-asso.org/doku.php/evenements/2014-06-30-nimew|Notation Workshop @NIME]]\\
NIME'14, Goldsmiths, University of London\\ 30 June 2014, 9h30-13h
;;#

== Seminar on Music Notation and Computation @ C4DM ==

**Queen Mary University of London, 30/06/2014**

{{ :efficace:qmul.png?nolink&150|}}

Hosted by the [[http://c4dm.eecs.qmul.ac.uk/|Centre for Digital Music]] at Queen Mary University of London, this seminar focused on representations and computational approaches for generating, optimizing and processing music notation in computer music systems.

;;#
=> [[http://notation.afim-asso.org/doku.php/evenements/2014-06-30-qmul|C4DM Notation Seminar]]\\
Centre for Digital Music, Queen Mary University of London\\ 30 June 2014, 15h-17h30
;;#
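The kind of scheduling architecture discussed in Dimitri Bouche's colloquium abstract above — merging statically planned actions with actions computed and inserted dynamically during rendering — can be sketched minimally with a timestamp-ordered priority queue (illustrative Python under our own assumptions, not the actual OpenMusic scheduler):

```python
# Minimal sketch of a time-ordered scheduler: actions may be registered at
# any moment (including "late", during rendering) and are still executed
# in date order. Names and the ms-based time unit are assumptions.

import heapq
import itertools

class Scheduler:
    def __init__(self):
        self.queue = []
        self.counter = itertools.count()  # tie-breaker for equal dates

    def schedule(self, date, action):
        """Insert an action at a given date (ms); may be called at any time."""
        heapq.heappush(self.queue, (date, next(self.counter), action))

    def run_until(self, end_date):
        """Pop and execute all actions due up to end_date, in time order."""
        log = []
        while self.queue and self.queue[0][0] <= end_date:
            date, _, action = heapq.heappop(self.queue)
            log.append((date, action(date)))
        return log

s = Scheduler()
s.schedule(0,   lambda t: "note-on C4")
s.schedule(500, lambda t: "note-off C4")
# a dynamically computed event inserted "late", between the two above
s.schedule(250, lambda t: "note-on E4")

print(s.run_until(1000))
# time order is respected even for late insertions:
# [(0, 'note-on C4'), (250, 'note-on E4'), (500, 'note-off C4')]
```

The priority queue is the simplest structure satisfying the constraint named in the abstract: deterministic pre-computed material and non-deterministic runtime material share a single, time-consistent execution order.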
  
------

=== Days of research seminars and workshops in Oslo, Norway ===

{{ :efficace:nmh-logo.png?nolink&260|}}
{{ :efficace:bevegelseslab-320.jpg?nolink&200|}}

**University of Oslo, Department of Musicology\\
Norwegian Academy of Music\\
24-26/11 2014**

The EFFICACe project was presented, along with Ircam's latest research and developments on spatial audio, at the Department of Musicology of the University of Oslo, during a workshop day organised at the fourMs laboratory.

Two days of courses and workshops on OpenMusic were held under the coordination of Prof. Asbjørn Schaathun at the Norwegian Academy of Music.

;;#
=> [[http://www.hf.uio.no/imv/english/research/news-and-events/events/guest-lectures-seminars/2014/ircam.html|webpage (IMV, 11/24)]]\\
;;#
\\
------
\\

=== IRCAM : Groupe de Recherche Rythme/Temps Musical ===
  
\\
------
\\

=== Lisp for music technology - Workshop at the European Lisp Symposium ELS'2014 ===

**IRCAM, 6/05/2014**

{{ :efficace:lambda.png?nolink&80|}}

The power and expressivity of Lisp make it a valuable language for musicians exploring high-level compositional processes, and this language is a fundamental support for computer-aided composition research and creation.
In this session we will present an overview of our current projects and developments, and discuss the challenges, issues and perspectives of using Lisp in new music technologies such as digital signal processing and real-time systems.

;;#
=> [[http://www.european-lisp-symposium.org/editions/2014/|7th European Lisp Symposium]]\\
IRCAM, Paris. 5-6 May, 2014
;;#

\\
------
\\

=== John MacCallum and Teoma Naccarato: Heart rate data from contemporary dancers ===

**Séminaires Recherche et Création, IRCAM\\ 13/10 2014 (12h-13h)**\\

Composer John MacCallum and choreographer Teoma Naccarato propose a collaborative project that examines the use of real-time heart rate data from contemporary dancers to drive a polytemporal composition for instrumental ensemble with live electronics.\\
In collaboration with the Musical Representations team, as part of the EFFICACe project.

;;#
[[http://medias.ircam.fr/x1ed4fe|[video]]]
;;#
  
\\
------
\\

=== Time, rhythm and arithmetics ===

**MaMuX seminar series, IRCAM, 6/12/2013**

{{ :efficace:time.png?nolink&200|}}

In the framework of the Ircam [[http://repmus.ircam.fr/mamux/|MaMuX]] seminar series, we propose this [[http://repmus.ircam.fr/mamux/saisons/saison13-2013-2014/2013-12-06|special session]] dedicated to formal and mathematical theories of time and rhythm in music composition and performance. **Invited speakers**: Philippe Riot, Alain Le Méhauté (Federal University of Kazan, Russia), Jean-Louis Giavitto (IRCAM - CNRS), Karim Haddad (Composer, IRCAM).

;;#
=> http://repmus.ircam.fr/mamux/saisons/saison13-2013-2014/2013-12-06
;;#

\\
------
\\

=== Papers at the joint ICMC-SMC Conference 2014 ===

{{ :efficace:icmc14.png?70|}}

**Athens, Greece, 14-20/09/2014**

Two EFFICACe papers will be presented at the joint [[http://www.icmc14-smc14.net/|ICMC-SMC 2014]] conference:
  * J. Garcia, P. Leroux, J. Bresson: **pOM: Linking Pen Gestures to Computer-Aided Composition Processes**.
  * D. Bouche, J. Bresson, S. Letz: **Programmation and Control of Faust Sound Processing in OpenMusic**.

;;#
[[http://www.icmc14-smc14.net/|ICMC-SMC 2014]]\\
Athens, Greece\\
14-20 Sept., 2014\\
;;#
  
  
  
=== 1° Colóquio Franco-Brasileiro de Criação e Análise Musicais com Suporte Computacional ===
**Universities of Campinas and São Paulo, Brazil, 18-29/08/2014**

{{ :efficace:coloquiobrasil.jpg?160|}}

{{:efficace:nics.png?60 |}} //The Interdisciplinary Nucleus for Sound Communication (NICS), in partnership with the School of Electrical and Computer Engineering (FEEC), the Arts Institute (IA) and the Center for Integration, Documentation and Cultural Diffusion (CIDDIC) of Unicamp, together with USP and UNESP, and with the support of FAPESP, organize the 1st French-Brazilian Colloquium on Computer-Supported Music Analysis and Creation.
The professors of the Brazilian universities mentioned above, together with researchers of the RepMus team at IRCAM, Paris (Gérard Assayag, Jean Bresson and Moreno Andreatta), present a series of lectures, round tables and classes addressing the universe of musical formalization and computational models applied to music analysis and creation.//

  * 19/08: J. Bresson - **Composing with sounds / composing sounds**
  * 20/08: J. Bresson - **Models, programs and interactions in OM: Tools for interactive creation and analysis**

;;#
=> [[http://www.nics.unicamp.br/coloquio/|French-Brazilian Colloquium on Computer Aided Music Creation and Analysis]]\\
Campinas/São Paulo, Brazil\\
Aug. 18-29th.
;;#

\\
=== Concert: Livre Digital - Fronteiras Musicais: Tecnologias que desafi(n)am os sentidos ===
**Barão Geraldo, SP, Brazil, 28/08/2014**

{{ :efficace:cartaz_concerto.jpg?130|}}

//Closing the 1st French-Brazilian Colloquium, Livre Digital was an evening dedicated to the exploration of musical frontiers: improvisation outside any pattern, exploring new sonorities and processes. Technologies that challenge (and de-tune) the senses, algorithms in dialogue with human instrumentalists. It is up to the listener to discover who is who, to stretch these frontiers, to observe and experience their limits, to open their ears and let themselves be surprised.//\\

=> Live performance of **Hamiltonian Song (M. Andreatta), with interactive real-time analysis and visualization in OpenMusic**.

;;#
[[http://www.almanaquecafe.com.br/events/livre-digital-fronteiras-musicais/|Almanaque Café]]\\
Barão Geraldo (SP), Brazil\\
28/08/2014, 9pm
;;#

{{ :efficace:concerto-hamiltonian-brasil.jpg?300 |}}

\\
------

====== Invited & Guests ======

=== OM-Darwin: composing with genetic algorithms in OpenMusic ===
  
**Geof Holbrook : Séminaires Recherche et Création, IRCAM\\
Résidence en recherche musicale IRCAM\\
22/06 2015 (12h-13h)**

OM-Darwin is an OpenMusic library for composing with genetic algorithms. The focus is on flexibility: the composer can freely combine patches that generate musical material from a set of parameters with patches that measure the suitability of potential solutions. On the generative side, the library offers a collection of built-in musical objects, as well as an intuitive system for designing customized constructs. It also implements a rule-based system familiar to users of constraint-solving systems, which can be used in conjunction with user-defined measuring functions. Multiple genetic algorithms can run in parallel inside a maquette, and will interact with each other based on their temporal relationships. This presentation will include a live performance by Anne-Elisabeth DeCologne of excerpts of a new work for contrabass and electronics, with a demonstration of the patches used to create the piece.

[[http://forumnet.ircam.fr/fr/event/seminaire-recherche-et-creation-fin-de-residence-en-recherche-musicale-de-geof-holbrook/]]
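The generate/evaluate split described above — one patch producing candidate material from parameters, another measuring its suitability — can be sketched as a toy genetic algorithm (plain Python, not OM-Darwin's actual API; the target melody, fitness function and operators are illustrative assumptions):

```python
# Toy genetic algorithm in the spirit of the generate/evaluate split above.
# Assumptions: melodies are lists of MIDI pitches; fitness is the (negated)
# distance to a target contour; selection is elitist (top half survives).

import random

random.seed(1)  # reproducible run

TARGET = [60, 62, 64, 65, 67, 69, 71, 72]   # an ascending C major scale

def generate(length=8, low=55, high=77):
    """'Generator patch': a random melody from a parameter set."""
    return [random.randint(low, high) for _ in range(length)]

def fitness(melody):
    """'Measuring patch': negative distance to the target contour."""
    return -sum(abs(a - b) for a, b in zip(melody, TARGET))

def evolve(pop_size=60, generations=200, mutation_rate=0.2):
    pop = [generate() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(a))    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mutation_rate:  # small point mutation
                i = random.randrange(len(child))
                child[i] += random.choice([-1, 1])
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
print(best, fitness(best))
```

Swapping `generate` or `fitness` for any other pair of functions leaves the evolutionary loop untouched, which is the flexibility the library description emphasizes.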
  
=== IEEE Symposium on Visual Languages and Human-Centric Computing ===

{{ :efficace:IEEE.jpg?80|}}

**Melbourne, Australia, 28/07-1/08 2014**

J. Bresson: **Reactive visual programs for computer-aided music composition**.\\
Paper presented at the IEEE Symposium on Visual Languages and Human-Centric Computing.\\

Abstract: //We present a new framework for reactive programming in OpenMusic, a visual programming language dedicated to computer-aided music composition. We highlight some characteristics of the programming and computation paradigms, and describe the implementation of a hybrid system merging demand-driven and event-driven evaluation models in this environment.//

;;#
IEEE [[https://sites.google.com/site/vlhcc2014/|VL/HCC 2014]]\\
Melbourne, Australia\\
July 28 - Aug. 1, 2014
;;#

\\
------
\\

=== Composition in the Flesh - Physical modelling in composition ===

**Anders Vinjar : Séminaires Recherche et Création, IRCAM\\
With the participation of the RNSC / MaMuX Seminar\\
27/10 2014 (12h-13h)**

Physical models used in synthesis and performance (e.g. gesture control) have proved very valuable. Some of their principal benefits - linear control of complex systems, intuitive behavior, easy one-to-many mappings - represent a large potential for composition applications as well. However, compared to the efforts in synthesis and performance, little work has gone into testing applications of physical models to composition. Some of the motivations, prerequisites and assumptions for applying physical modelling to composition tasks are discussed, and some possible gains are suggested. An implementation of a general computer-aided composition environment with physical-modelling capabilities is proposed, combining OpenMusic and ODE in a modular way, providing a powerful and flexible environment for working with physical models in composition tasks.

;;#
=> [[http://medias.ircam.fr/xe394c4|[video]]]
;;#
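As a toy illustration of the idea above (Python under our own assumptions, not Vinjar's actual OpenMusic/ODE setup): a damped mass-spring system is integrated step by step, and its trajectory is sampled and mapped to pitches, turning the model's intuitive physical behavior into a melodic gesture.

```python
# A physical model as a compositional gesture generator (illustrative sketch).
# The model, parameters and pitch mapping are assumptions, not the talk's code.

def mass_spring(x0=1.0, v0=0.0, k=4.0, damping=0.3, dt=0.05, steps=100):
    """Semi-implicit Euler integration of x'' = -k*x - damping*x'."""
    x, v = x0, v0
    trajectory = []
    for _ in range(steps):
        a = -k * x - damping * v
        v += a * dt
        x += v * dt
        trajectory.append(x)
    return trajectory

def to_pitches(trajectory, center=64, span=12):
    """Map positions in roughly [-1, 1] to MIDI pitches around a center note."""
    return [round(center + span * x) for x in trajectory]

traj = mass_spring()
pitches = to_pitches(traj)
print(pitches[:10])
```

A one-to-many mapping of the kind mentioned in the abstract is then a matter of feeding the same `traj` into several mapping functions (pitch, dynamics, spatial position, ...).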
  
\\
------
\\
  
=== A Californian Composer's Dive Into Real-Time Technology and Computer-Aided Composition ===

**Matthew Schumaker / Séminaires Recherche et Création, IRCAM\\ 26/05/2014**

Invited talk by **Matthew Schumaker**, PhD candidate in Music Composition at the University of California, Berkeley.\\

Matt Schumaker will present his work with live electronics and computer-aided composition in three recent pieces: //In threatening possibilities// (2012), for two singers, large ensemble and electronics; //Nocte Lux// (2013), for two cellos, bass and electronics; and a work-in-progress for soprano and orchestra. The first two were undertaken at UC Berkeley's Center for New Music and Audio Technologies (CNMAT), and the third is being created during an ongoing Berkeley Ladd fellowship in Paris. This work engages live electronics with real-time voice and instrument processing, using Ircam objects for granular synthesis and a variety of other signal processing. Live electronics are employed to create a hybrid instrument: a custom-designed, computer-interactive sampler keyboard that performs with continuously changing, physical-modelling sound sets created in Modalys. Computer-aided composition tools in OpenMusic are employed to investigate personal ideas of virtual thematism and interpolation between musical lines.
Seen as case studies of a Californian composer using computer music tools and programming environments, these works bring up questions concerning the prevalent composer/programmer paradigm. How does a person who is first a composer, and only secondarily and of necessity a programmer, engage fruitfully with the vast possibilities and intricacies afforded by such tools? Given the realities of a one-person development team, can a composer's vision sometimes be served by adopting a hybrid approach, combining aspects of the ideal flexibility and musicality of the "totally live" real-time approach with substantial "time-deferred," pre-recorded elements? [...]

;;#
=> [[http://www.ircam.fr/139.html?event=1307|Séminaires Recherche et Création de l'IRCAM]]\\
Salle Stravinsky\\
May 26, 2014, 12h-13h
;;#

\\
------
\\

=== Rhythm Trees at the Vienna Summer of Logic ===

{{ :efficace:vsl14.png?nolink&80|}}

**Vienna, Austria, 13/07/2014**

Presentation at the [[http://verify.rwth-aachen.de/IFIP-WG1.6/|International Federation for Information Processing (IFIP) Working Group 1.6]] on Term Rewriting, by Florent Jacquemard.

**Abstract: Rhythm Tree Rewriting (Florent Jacquemard, Jean Bresson and Pierre Donat-Bouillud)**. In traditional Western music notation, the durations of notes are expressed hierarchically by recursive subdivisions. This leads naturally to a tree representation of melodies, widespread in systems for assistance to music authoring. We will see how term rewriting techniques can be applied to computer music problems, in particular to rhythm quantization: the transcription of a sequence of dates (real time values) into a music score. Besides the question of rhythm representation, an interesting problem in this context is the design and concise description of large rewrite rule sets, and decision problems for these descriptions.

;;#
=> [[http://vsl2014.at/meetings/|Vienna Summer of Logic]]\\ 9-24 July 2014
;;#
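The recursive-subdivision encoding described in the abstract above can be sketched as follows (a minimal Python illustration; OpenMusic's actual rhythm-tree syntax differs): each node carries an integer weight, and a parent's duration is divided among its children in proportion to their weights.

```python
# Minimal rhythm-tree sketch: a node is an int leaf (a weight relative to its
# siblings) or a (weight, children) pair; durations come out of recursive
# subdivision of the parent duration. Exact fractions avoid rounding drift.

from fractions import Fraction

def weight(node):
    """Weight of a node: the int itself, or the first element of a pair."""
    return node if isinstance(node, int) else node[0]

def durations(node, total=Fraction(1, 1)):
    """Flatten a rhythm tree into the list of its leaf durations."""
    if isinstance(node, int):
        return [total]
    _, children = node
    total_weight = sum(weight(c) for c in children)
    out = []
    for c in children:
        out.extend(durations(c, total * Fraction(weight(c), total_weight)))
    return out

# a quarter-note beat divided in three equal parts (a triplet),
# the last part itself subdivided in three
beat = (1, [1, 1, (1, [1, 1, 1])])
print(durations(beat, Fraction(1, 4)))   # five durations, summing to 1/4
```

Rhythm quantization, in these terms, searches for a tree whose flattened durations best match a measured sequence of onset dates — which is where the rewrite rules over trees come in.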
  
\\
------
\\
  
=== Choreography and Composition of Internal Time ===

**Friday 12/12/2014**\\
12h00-13h00 - Studio 5, Ircam

Lecture-demonstration. **John MacCallum**, composition & **Teoma Naccarato**, choreography\\
with Bekah Edie, dance

//As part of the musical research residency at IRCAM for Fall 2014, composer John MacCallum and choreographer Teoma Naccarato are engaged in early research and development for an evening-length production. In this performance with 12 dancers and 12 musicians, ECG data from the dancers is used as an underlying clock for each musician, in order to inform a poly-temporal composition for live ensemble with electronics.\\
This lecture-demonstration will discuss the design of a facile choreographic and compositional environment for real-time interaction with biosensors, as well as questions regarding internal and external perception and interaction. MacCallum and Naccarato will outline ongoing technical and performance-based experiments that integrate biosensors with breath, movement, and environmental stimuli to intervene in and transform their collaborative creative process.\\
The session will also feature a short performance study for dancer Bekah Edie with live electronics, which explores correlations between cardiac, respiratory, and nervous activity, in order to impact intentional arcs in heart activity - and therefore musical tempo - over time.//

;;#
[[http://medias.ircam.fr/x39b837|[video]]]
;;#
  
====== Invited talks & Presentations ======

=== IRCAM @ BU CNM, April 25-29, 2016 ===

**Boston University, USA**

  * 26/04/2016 -- 11:30am:\\ **Computer-Aided Composition using OpenMusic**\\ //OpenMusic is a visual programming environment dedicated to musical data processing and generation, used by a large community of users to carry out varied aspects of their compositional processes. We will present the main features and characteristics of this environment, as well as a number of applications in contemporary music production.//

  * 27/04/2016 -- 11:00am:\\ **From symbolic music processing to spatial audio -- Research directions in computer-aided composition**\\ //Computer-aided composition processes traditionally deal with symbolic musical material, manipulated algorithmically and rendered using classical score representations and parameters (pitches, rhythms, etc.). On the other hand, sound processing and spatialization generally run in real-time interactive environments. Research and developments carried out during the past 15 years in computer-aided composition systems have aimed at bridging these different fields. This idea will be illustrated through the presentation of musical research projects carried out in the OpenMusic environment, with a particular focus on recent applications integrating the control of sound spatialization in compositional processes.//

;;#
=> http://www.bu.edu/cfa/ircam/public-presentations/
;;#

\\
------
\\

=== Séminaire Agorantic: Culture, Patrimoines, Sociétés Numériques ===

**Université d'Avignon, 13/06/2014**

{{ :efficace:agorantic.png?nolink&200|}}

[[http://blogs.univ-avignon.fr/sfr-agorantic/|Structure Fédérative de Recherche (S)FR Agorantic]]

**Programmation et représentation musicale interactive en composition assistée par ordinateur**\\ J. Bresson (IRCAM - UMR STMS).\\

//Abstract:// Using programming languages for composition favors a dual view of musical structures, in which objects are considered as processes deployed in time, and in which, conversely, processes become powerful and expressive representations of musical objects. This "programmatic" vision of musical creation does not, however, exclude user (composer/programmer) interfaces and interactions, which constitute an essential part of our current research. Beyond the interactions between machines and musical instruments, increasingly frequent and rich in contemporary musical practice, there is also a strong interaction between the composer and the computer programs he or she produces or uses, whether during the composition phase or even during the performance of a work.
Visual languages foster this interaction, and propose representations combining functional or procedural aspects with "manual" interventions on the data involved in the processes, accessible in forms familiar to musicians (scores, sound waveforms, control curves, etc.).
We will illustrate these points with a presentation of the OpenMusic environment, trying to describe its specificities in terms of the musical and computational paradigms underlying this notion of computer-aided composition, as well as our current research and developments in the ANR EFFICACe project, centered on the interactive aspects supporting creative processes in the contexts of creation, performance and musical analysis.

{{:efficace:documents:bresson-agorantic14-slides.pdf|Presentation slides [PDF]}}

;;#
=> [[http://blogs.univ-avignon.fr/sfr-agorantic/2014/04/14/seminaire-agorantic-12-et-13-juin-2014-amphi-2e08/|Full seminar programme]]\\
Université d'Avignon\\ Site Sainte Marthe, Amphi 2E08\\
June 12-13, 2014
;;#
  
------ +\\
-=== Matthew Schumaker: A Californian Composer's Dive Into Real-Time Technology and Computer-Aided Composition ===+
  
-**Séminaires Recherche et Création, IRCAM\\ 26/05/2014**  
  
-Invited talk by **Matthew Schumaker**, PhD candidate in Music Composition at the University of California, Berkeley.\\ 
  
 +=== CCRMA Guest Colloquium ===
  
-Matt Schumaker will present his work with live electronics and computer-aided composition in three recent pieces: //In threatening possibilities// (2012), for two singers, large ensemble and electronics; //Nocte Lux// (2013), for two cellos, bass and electronics; and a work-in-progress for soprano and orchestra. The first two were undertaken at UC Berkeley's Center for New Music and Audio Technologies (CNMAT) and the third is being created during an ongoing Berkeley Ladd fellowship in Paris. This work engages live-electronics with real-time voice and instrument processing using Ircam objects for granular synthesis and a variety of other signal processing.  Live electronics are employed to create a hybrid instrument: a custom-designed, computer-interactive sampler keyboard that performs with continuously changing, physical modelling sound sets created in Modalys.  Computer-aided composition tools in OpenMusic are employed to investigate personal ideas of virtual thematism and interpolation between musical lines.   +**Center for Computer Research in Music and AcousticsStanford UniversityUSA**
-Seen as case studies of a Californian composer using computer music tools and programming environmentsthese works bring up questions concerning the prevalent composer/programmer paradigm.  How does a person who is first a composer and only secondarily and of necessity a programmer, engage fruitfully with the vast possibilities and intricacies afforded by such tools?  Given the realities of a one-person development team, can a composer's vision sometimes be served by adopting a hybrid approach combining aspects of the ideal flexibility and musicality of the "totally live" real-time approach with substantial "time-deferred," pre-recorded elements? [...]+
  
-;;#
+22/02/2016 -- 5:30pm
-=> [[http://www.ircam.fr/139.html?event=1307|Séminaires Recherche et Création de l'IRCAM]]\\ +
-Salle Stravinsky\\ +
-26 mai 2014, 12h-13h +
-;;#+
  
-  * J. Bresson, P. Donat-Bouillud: **[[https://ccrma.stanford.edu/events/jean-bresson-and-pierre-donat-bouillud-current-research-composition-software-ircam|Current Research on Composition Software at Ircam]]**.\\ //This talk will be a presentation of two major projects carried out in the Music Representation research group at IRCAM: OpenMusic and Antescofo. OpenMusic is a visual programming language dedicated to music creation and processing. It allows composers to develop musical processes taking full advantage of the computational, representational and expressive power of a programming language, using a graphical patching interface and musical editors. Antescofo~ is a modular polyphonic score-following system as well as a synchronous programming language for real-time computer music composition and live performance. The score-following module allows for automatic recognition of score position and tempo from a real-time audio stream, making it possible to synchronize an instrumental performance with computer-realized elements. The synchronous language, in conjunction with its dedicated graphical editor AscoGraph, allows flexible writing of interactions. We will briefly present these two environments and discuss their most recent features and current related research.//
  
-===  Lisp for music technology - Workshop at the European Lisp Symposium ELS'2014 === +\\ 
 +\\
  
-**IRCAM, 6/05/2014**  
  
-{{ :efficace:lambda.png?nolink&80|}}
+=== JUCE Summit 2015 ===
+**London, 19-20 November 2015**
  
-The power and expressivity of Lisp make it a valuable language to musicians for exploring high-level compositional processes, and this language is a fundamental support for computer-aided composition research and creation. +/*{{:efficace:juce.png?140 |}}*/
-In this session we will present an overview of our current projects and developments, and discuss the challenges, issues and perspectives for using Lisp in new music technologies such as digital signal processing and real-time systems.+
  
-;;#
+  * **Integrating Juce-based GUI in Max/MSP, OpenMusic or other computer music environments.**\\ Thibaut Carpentier
-=> [[http://www.european-lisp-symposium.org/editions/2014/|7th  European Lisp Symposium]]\\ +
-IRCAM, Paris. 5-6 May, 2014 +
-;;#+
  
------
+=> [[http://www.juce.com/juce-summit-2015/juce-summit-list-talks-and-sessions#carpentier|Résumé]]\\
  
-=== OpenMusic @ Linux Audio Conference ===
+\\
 +\\ 
+=== Colloque international "Des outils et des méthodes innovantes pour l’enseignement de la musique et du traitement du signal" ===
  
-**ZKM -- Karlsruhe, 3/05/2014**
+**Université Jean Monnet, Saint-Etienne, 2 Nov. 2015**
  
-{{ :efficace:zkm.jpg?nolink&170|}} +  * Dimitri Bouche**De l’esquisse à la composition**
- +
-The Linux version of OpenMusic was developed by Anders Vinjar and the support of the [[http://www.bek.no/|BEK]] center (Bergen, Norway). It embeds parts of the new scheduling and external audio/MIDI rendering systems. The presentation is included in the "Music Programming" session of LAC'2014 to be held at ZKM in Karlsruhe.+
  
;;#
-=> [[http://lac.linuxaudio.org/2014/|Linux Audio Conference LAC'2014]]\\
+=> [[http://portail.univ-st-etienne.fr/bienvenue/recherche/cierec-colloque-international-des-outils-et-des-methodes-innovantes-pour-l-enseignement-de-la-musique-et-du-traitement-du-signal-571218.kjsp|Programme du Colloque]]\\
-ZKM -- Karlsruhe, Germany 1-4 May, 2014+
;;#
  
+=== Days of research seminars and workshops in Oslo, Norway ===
  
 +{{ :efficace:nmh-logo.png?nolink&260|}}
 +{{:efficace:bevegelseslab-320.jpg?nolink&300 |}}
  
 +**University of Oslo, Department of Musicology\\
 +Norwegian Academy of Music\\
 +24-26/11 2014**
  
+The EFFICACe project was presented, along with Ircam's latest research and developments on spatial audio, at the Department of Musicology of the University of Oslo during a workshop day organised at the fourMs laboratory.
  
-Two days of courses and workshops on OpenMusic were held under the coordination of prof. Asbjørn Schaathun at the Norwegian Academy of Music.
-<;hidden  onHidden=&quot;**Barão Geraldo, SP, Brasil, 28/08/2014:\\ Concert: Livre Digital -- Fronteiras Musicais Tecnologias que desafi(n)am os sentidos**" onVisible="**Barão Geraldo, SP, Brasil, 28/08/2014:\\ Concert: Livre Digital -- Fronteiras Musicais Tecnologias que desafi(n)am os sentidos**";>;+ 
 +;;# 
+=> [[http://www.hf.uio.no/imv/english/research/news-and-events/events/guest-lectures-seminars/2014/ircam.html|webpage (IMV, 11/24)]]\\
+;;#
 +\\ 
 +\\
\\
-{{ :efficace:cartaz_concerto.jpg?130|}}\\ 
  
-//Encerrando 1o Colóquio Franco-Brasileiro de Análise e Criação Musicais com Suporte Computacional, Livre Digital será uma noite dedicada à exploração de fronteiras musicais: improvisação fora de qualquer padrão, explorando novas sonoridades e processos. Tecnologias que desafi(n)am os sentidos, algoritmos que dialogam com instrumentistas humanos. Cabe ao espectador desvendar quem é quem. Dilatar essas fronteiras, espreitar e vivenciar os seus limites. Abrir seus ouvidos e deixar-se surpreender.//\\ 
  
-=> Live performance of **Hamiltonian Song (M. Andreatta) with interactive real-time analysis and visualization in OpenMusic**. 
  
-;;# 
-[[http://www.almanaquecafe.com.br/events/livre-digital-fronteiras-musicais/|Almanaque Café]]\\ 
-Barão Geraldo (SP) Brasil\\ 
-28/08/2014 às 21h  
-;;# 
  
-{{ :efficace:concerto-hamiltonian-brasil.jpg?400 |}} 
  
-</hidden> 
-\\ 
-===== ===== 
  
-<hidden  onHidden="**Universities of Campinas and São Paolo, Brasil, 18-29/08, 2014:\\ 1° Colóquio Franco-Brasileiro de Criação e Análise Musicais com Suporte Computacional**" onVisible="**Universities of Campinas and São Paolo, Brasil, 18-29/08, 2014:\\ 1° Colóquio Franco-Brasileiro de Criação e Análise Musicais com Suporte Computacional**"> 
  
-{{ :efficace:coloquiobrasil.jpg?160|}} + 
-\\
+=== 1° Colóquio Franco-Brasileiro de Criação e Análise Musicais com Suporte Computacional ===
+**Universities of Campinas and São Paulo, Brazil, 18-29/08/2014**
 + 
 +{{ :efficace:coloquiobrasil.jpg?200|}}
  
{{:efficace:nics.png?60 |}} //O Núcleo Interdisciplinar de Comunicação Sonora (NICS), em parceria com a Faculdade de Engenharia Elétrica e Computação (FEEC), o Instituto de Artes (IA) e o Centro de Integração, Documentação e Difusão Cultural (CIDDIC) da Unicamp, junto à USP e à UNESP, com apoio da FAPESP realizam o 1° Colóquio Franco‑Brasileiro de Análise e Criação Musicais com Suporte Computacional.
Ligne 404: Ligne 335:
;;#
  
-</hidden> 
\\
-===== ===== 
  
 +\\
 +\\
  
-<hidden  onHidden="**Melbourne, 28/07-1/08 2014:\\ IEEE Symposium on Visual Languages and Human-Centric Computing**" onVisible="**Melbourne, 28/07-1/08 2014:\\ IEEE Symposium on Visual Languages and Human-Centric Computing**">+=== PRISMA Meeting ===
  
-{{ :efficace:IEEE.jpg?80|}} +**IRCAM, Paris, 4-8/07/2015**
-\\+
  
-J. Bresson: **Reactive visual programs for computer-aided music composition**.\\  +  * **Computer-aided composition tools and interfaces for the control of sound spatialization**\\ Jérémie Garcia, Xavier Favory, Jean Bresson
-Paper presented at the IEEE Symposium on Visual Languages and Human-Centric Computing.\\+
  
-Abstract: //We present a new framework for reactive programming in OpenMusic, a visual programming language dedicated to computer-aided music composition. 
-We highlight some characteristics of the programming and computation paradigms, and describe the implementation of a hybrid system merging demand-driven and event-driven evaluation models in this environment.// 
  
-;;# 
-IEEE [[https://sites.google.com/site/vlhcc2014/|VL/HCC 2014]]\\ 
-Melbourne, Australia\\ 
-July 28th, Aug. 1st, 2014. 
-;;# 
-</hidden> 
\\
-===== ===== 
  
+=== CCRMA Guest Colloquium - Dimitri Bouche: Scheduling & Time Structures in Computer-Assisted Composition ===
  
-<hidden  onHidden="**Vienna Summer of Logic, 13/07/2014:\\ Rhythm Tree Rewriting**" onVisible="**Vienna Summer of Logic, 13/07/2014:\\   Rhythm Tree Rewriting**">+/*{{ :efficace:karma_logo.png?300|}}*/
  
-{{ :efficace:vsl14.png?nolink&80|}}
+**CCRMA (Center for Computer Research in Music and Acoustics)\\
-\\
+Department of Music, Stanford University.**
  
-Presentation at the [[http://verify.rwth-aachen.de/IFIP-WG1.6/|International Federation for Information Processing (IFIP) Working Group 1.6]] on Term Rewriting, by Florent Jacquemard.
+**Wed 02/4, 2015**
  
+//Music composers’ work with computers is generally divided into two distinct stages: composition and performance. Each stage requires specific methodology and software: computer-aided composition involves time and high computation capabilities to produce complex musical scores, while performance and live rendering require reactive environments with precise real-time constraints. Contemporary music composers constantly challenge these established categories, using unusual objects and behaviors in their music or considering the variations in performance as an actual part of the composition.\\
+Dimitri's work consists in extending the properties of a computer-assisted composition environment (OpenMusic) to meet these challenges. This mostly means defining a scheduling architecture for computer-assisted composition software, but also providing high-level tools for composers and efficient display of potentially complex and non-deterministic data. Even though some software products embed interesting scheduling capabilities, no general model has been well established.\\
+After a brief historical survey of the evolution of computer-assisted composition and computer music systems (including Lisp-based music systems), Dimitri will present his current work and developments.//
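The scheduling model discussed in this abstract can be illustrated with a minimal, hypothetical sketch (this is not the actual OpenMusic architecture, whose implementation is in Common Lisp): actions computed ahead of time and actions added during rendering are kept in a single date-ordered priority queue and dispatched in order.

```python
import heapq

class Scheduler:
    """Toy date-ordered dispatcher: a heap merges static and dynamic events."""

    def __init__(self):
        self._queue = []   # heap of (date, sequence-number, action)
        self._count = 0    # tie-breaker so equal dates keep insertion order

    def schedule(self, date, action):
        """Register `action` (a callable receiving the date) to fire at `date`."""
        heapq.heappush(self._queue, (date, self._count, action))
        self._count += 1

    def run_until(self, date):
        """Pop and execute, in date order, every action due up to `date`."""
        fired = []
        while self._queue and self._queue[0][0] <= date:
            t, _, action = heapq.heappop(self._queue)
            fired.append(action(t))
        return fired

s = Scheduler()
s.schedule(2.0, lambda t: "note-off @ %s" % t)
s.schedule(0.0, lambda t: "note-on @ %s" % t)
s.schedule(1.0, lambda t: "control @ %s" % t)   # added out of order, "reactively"
print(s.run_until(1.0))   # ['note-on @ 0.0', 'control @ 1.0']
```

The heap makes insertion order irrelevant, which is the property needed when non-deterministic, dynamically computed events must interleave with a pre-computed score.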
  
-**Abstract: Rhythm Tree Rewriting (Florent Jacquemard, Jean Bresson and Pierre Donat-Bouillud)**. In traditional western music notation, the durations of notes are expressed hierarchically by recursive subdivisions. This led naturally to a tree representation of melodies widespread in systems for assistance to music authoring. We will see how term rewriting techniques can be applied to computer music problems, in particular to the problem of rhythm quantization: the transcription of a sequence of dates (real time values) into a music score. Besides the question of rhythm representation, an interesting problem in this context is the design and concise description of large rewrite rule sets and decision problems for these descriptions.  
  
;;#
-=> [[http://vsl2014.at/meetings/|Vienna Summer of Logic]]\\ 9-24 July 2014
+=> [[https://ccrma.stanford.edu/events/dimitri-bouche-scheduling-time-structures-in-computer-assisted-composition|CCRMA Event info]]\\
 ;;# ;;#
  
-</hidden> 
-\\ 
-===== ===== 
  
  
  
-<hidden  onHidden="**Goldsmiths University, London, 30/06/2014:\\ Interactive Music Notation and Representation Workshop @NIME'14**" onVisible="**Goldsmiths University, London, 30/06/2014:\\  Interactive Music Notation and Representation Workshop @NIME'14**"> 
  
-{{ :efficace:nime14.png?nolink&80|}}
+=== Séminaire "Nouveaux espaces de la notation musicale" ===
-\\
+**INA-GRM - Friday 6/02/2015**
  
-[[http://www.afim-asso.fr|AFIM]] work-group [[http://notation.afim-asso.org|"Les nouveaux espaces de la notation musicale"]].
+/*{{ :efficace:omnotation.png?nolink&120|}}*/
  
-The Interactive Music Notation and Representation Workshop, co-located with the [[http://www.nime2014.org/|14th International Conference on New Interfaces for Musical Expression]], gathered artists, researchers and application developers to compare views and needs in the field of music notation and representation inspired by contemporary practices, in interactive and live music, including representational forms emerging from live coding.
+AFIM (Association Française d'Informatique Musicale) work group\\
+With Karim Barkati (Weezic), Dominique Fober (Grame), Robert Piéchaud, Filipe Lopes and Pierre Couprie (IReMus).\\
 +=> **Représentations interactives de données musicales dans OpenMusic** (JBresson). 
 + 
 +/* Nous présenterons un système introduisant la notion de réactivité dans les programmes visuels d'OpenMusic. Avec ce systèmedes évènements externes ou des éditions/actions locales de l'utilisateur peuvent entrainer des chaines de réactions dans les programmesconduisant à la mise à jour des structures et éditeurs de données qu'ils contiennentOpenMusic peut alors devenir un environnement interactif de traitement et visualisation de données musicales provenant d'entrées temps-réel. */
  
;;#
-=> [[http://notation.afim-asso.org/doku.php/evenements/2014-06-30-nimew|Notation Workshop @NIME]]\\
+=> [[http://notation.afim-asso.org/doku.php/evenements/2015-02-06-etude-notation3|Page du séminaire]]\\
-NIME'14, Goldsmiths, University of London\\ 30 June 2014, 9h30-13h+
;;#
  
-</hidden> 
-\\ 
-===== ===== 
  
-<hidden  onHidden="**Queen Mary University, London, 30/06/2014:\\ Seminar on Music Notation and Computation @ C4DM**" onVisible="**Queen Mary University, London, 30/06/2014::\\  Seminar on Music Notation and Computation @ C4DM**"> 
  
-{{ :efficace:qmul.png?nolink&150|}}
+=== John MacCallum and Teoma Naccarato: Heart rate data from contemporary dancers ===
-\\
+**Séminaires Recherche et Création, IRCAM\\ 13/10/2014 (12h-13h)**\\
  
-[[http://www.afim-asso.fr|AFIM]] work-group [[http://notation.afim-asso.org|"Les nouveaux espaces de la notation musicale"]]
+Composer John MacCallum and choreographer Teoma Naccarato propose a collaborative project that examines the use of real-time heart rate data from contemporary dancers to drive a polytemporal composition for instrumental ensemble with live electronics.\\
- +In collaboration with the Musical Representations Team as part of the EFFICACe Project.
-Hosted by the [[http://c4dm.eecs.qmul.ac.uk/|Center for Digital Music]] at Queen Mary University, this seminar focused on representations and computational approach for generating, optmizing and processing music notation in computer music systems.+
  
;;#
-=> [[http://notation.afim-asso.org/doku.php/evenements/2014-06-30-qmul|C4DM Notation Seminar]]\\
+[[http://medias.ircam.fr/x1ed4fe|[video]]]
-Center For Digital Music, Queen Mary University of London\\ 30 June 2014, 15h-17h30+
;;#
  
-</hidden> 
-\\ 
-===== ===== 
  
 +=== Rhythm Trees at the Vienna Summer of Logic ===
  
-<hidden  onHidden="**Université d'Avignon13/06/2014:\\ Séminaire Agorantic: Culture, Patrimoines, Sociétés Numériques**" onVisible="**Université d'Avignon, 13/06/2014:\\  Séminaire Agorantic: Culture, Patrimoines, Sociétés Numériques**">+**ViennaAustria, 13/07/2014**
  
-{{ :efficace:agorantic.png?nolink&200|}}
+/*{{ :efficace:vsl14.png?nolink&80|}}*/
-\\+ 
 +Presentation at the [[http://verify.rwth-aachen.de/IFIP-WG1.6/|International Federation for Information Processing (IFIP) Working Group 1.6]] on Term Rewriting, by Florent Jacquemard. 
 + 
 +**Rhythm Tree Rewriting (Florent Jacquemard, Jean Bresson and Pierre Donat-Bouillud)**. //In traditional western music notation, the durations of notes are expressed hierarchically by recursive subdivisions. This led naturally to a tree representation of melodies widespread in systems for assistance to music authoring. We will see how term rewriting techniques can be applied to computer music problems, in particular to the problem of rhythm quantization: the transcription of a sequence of dates (real time values) into a music score. Besides the question of rhythm representation, an interesting problem in this context is the design and concise description of large rewrite rule sets and decision problems for these descriptions.// 
 + 
 +;;# 
 +=> [[http://vsl2014.at/meetings/|Vienna Summer of Logic]]\\ 9-24 July 2014 
 +;;# 
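The hierarchical encoding of durations described in this abstract can be sketched in a few lines. This is a hypothetical Python illustration, not OpenMusic's actual Lisp rhythm-tree syntax: here leaves are proportional weights and a subdivided node is a `(weight, [children])` pair, so each level splits its parent's duration proportionally, as in recursive score subdivisions.

```python
# Illustrative sketch of a proportion tree (assumed encoding, not OM syntax):
# leaf = numeric weight; subdivided node = (weight, [children]).

def durations(tree, total=1.0):
    """Flatten a proportion tree into the list of its leaf durations."""
    weight = lambda t: t if isinstance(t, (int, float)) else t[0]
    whole = sum(weight(t) for t in tree)
    out = []
    for t in tree:
        share = total * weight(t) / whole      # this node's slice of the parent
        if isinstance(t, (int, float)):
            out.append(share)                  # leaf: one note duration
        else:
            out.extend(durations(t[1], share)) # recurse into the subdivision
    return out

# a beat split into two equal halves, the second half subdivided as a triplet:
# the result is one half plus three equal sixths
print(durations([1, (1, [1, 1, 1])]))
```

Rhythm quantization works in the opposite direction: it searches for a tree of this kind whose leaf durations best approximate a measured sequence of dates, which is where the rewrite-rule machinery mentioned above comes in.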
 +=== Séminaire Agorantic: Culture, Patrimoines, Sociétés Numériques ===  
 + 
 +**Université d'Avignon, 13/06/2014** 
 + 
 +/*{{ :efficace:agorantic.png?nolink&200|}}*/
  
[[http://blogs.univ-avignon.fr/sfr-agorantic/|Structure Fédérative de Recherche (S)FR Agorantic]]
Ligne 496: Ligne 422:
**Programmation et représentation musicale interactive en composition assistée par ordinateur**\\ J. Bresson (IRCAM - UMR STMS).\\
  
-La recherche en composition assistée par ordinateur menée à l’Ircam vise à lier des techniques et formalismes informatiques aux processus de composition musicale. Au delà du développement d’outils, il s’agit d’accompagner la démarche créatrice du compositeur en lui donnant accès aux ressources de calcul et de représentation offertes par l’informatique dans un objectif exploratoire et artistique. Dans cette perspective, de nombreux langages de programmation dédiés ont été développés et utilisés par les compositeurs, pour produire ou transformer des sons (//digital signal processing//), pour programmer des interactions (systèmes live « temps-réel »), ou pour construire des structures musicales suivant des démarches algorithmiques ou calculatoires (« composition assistée par ordinateur »).  +// 
- +/*La recherche en composition assistée par ordinateur menée à l’Ircam vise à lier des techniques et formalismes informatiques aux processus de composition musicale. Au delà du développement d’outils, il s’agit d’accompagner la démarche créatrice du compositeur en lui donnant accès aux ressources de calcul et de représentation offertes par l’informatique dans un objectif exploratoire et artistique. Dans cette perspective, de nombreux langages de programmation dédiés ont été développés et utilisés par les compositeurs, pour produire ou transformer des sons (//digital signal processing//), pour programmer des interactions (systèmes live « temps-réel »), ou pour construire des structures musicales suivant des démarches algorithmiques ou calculatoires (« composition assistée par ordinateur »).*/ 
-L’utilisation de langages de programmation favorise une vision duale des structures musicales, où les objets sont considérés comme des processus déployés dans le temps, et où inversement les processus deviennent des représentations puissantes et expressives des objets musicaux. Cette vision « programmatique » de la création musicale n’exclut cependant pas les interfaces et interactions utilisateur (compositeur/programmeur), qui constituent un volet primordial des recherches que nous menons actuellement dans ce domaine. Au delà des interactions entre machines et instruments de musique, qui sont de plus en plus fréquentes et riches dans les pratiques musicales contemporaines, il existe également une interaction forte entre le compositeur et les programmes informatiques qu’il produit ou utilise, que ce soit dans la phase de composition, ou même lors de la performance d’une œuvre.  +L’utilisation de langages de programmation pour la composition favorise une vision duale des structures musicales, où les objets sont considérés comme des processus déployés dans le temps, et où inversement les processus deviennent des représentations puissantes et expressives des objets musicaux. Cette vision « programmatique » de la création musicale n’exclut cependant pas les interfaces et interactions utilisateur (compositeur/programmeur), qui constituent un volet primordial des recherches que nous menons actuellement. Au delà des interactions entre machines et instruments de musique, qui sont de plus en plus fréquentes et riches dans les pratiques musicales contemporaines, il existe également une interaction forte entre le compositeur et les programmes informatiques qu’il produit ou utilise, que ce soit dans la phase de composition, ou même lors de la performance d’une œuvre. 
Les langages visuels favorisent cette interaction et proposent des représentations mêlant les aspects fonctionnels ou procéduraux, et les interventions « manuelles » sur données mises en jeu dans les processus, accessibles sous des formes familières aux musiciens (partitions, formes d’ondes sonores, courbes de contrôle, etc.)
- 
Nous illustrerons ces différents points avec la présentation de l’environnement OpenMusic en essayant de décrire les spécificités de cet environnement du point de vue des paradigmes musicaux et calculatoires sous-tendus par cette notion de composition assistée par ordinateur, ainsi que nos recherches et développements actuels dans le cadre du projet ANR EFFICACe, centrés sur les aspects interactifs mis en jeu en support des processus créatifs, dans les contextes de création, de performance ou d’analyse musicale.
- +//
  
{{:efficace:documents:bresson-agorantic14-slides.pdf|Supports de la présentation [PDF]}}
Ligne 513: Ligne 436:
12-13 juin 2014
;;#
-</hidden> 
-\\ 
-===== ===== 
  
  
  
  
-<hidden  onHidden="**IRCAM, 26/05/2014:\\ Matthew Schumaker: A Californian Composer's Dive Into Real-Time Technology and Computer-Aided Composition**" onVisible="**IRCAM, 26/05/2014:\\  Matthew Schumaker: A Californian Composer's Dive Into Real-Time Technology and Computer-Aided Composition**"> 
-{{ :efficace:ircam-logo1.png?nolink&100|}} 
-\\ 
  
-Invited talk by **Matthew Schumaker**, PhD candidate in Music Composition at the University of California, Berkeley.\\+=== IRCAM Forum Workshops 2013 - OM 6.7 presentation === 
  
 +**20/11/2013**
  
-Matt Schumaker will present his work with live electronics and computer-aided composition in three recent pieces: //In threatening possibilities// (2012), for two singers, large ensemble and electronics; //Nocte Lux// (2013), for two cellos, bass and electronics; and a work-in-progress for soprano and orchestraThe first two were undertaken at UC Berkeley's Center for New Music and Audio Technologies (CNMAT) and the third is being created during an ongoing Berkeley Ladd fellowship in ParisThis work engages live-electronics with real-time voice and instrument processing using Ircam objects for granular synthesis and a variety of other signal processing Live electronics are employed to create hybrid instrument: a custom-designed, computer-interactive sampler keyboard that performs with continuously changing, physical modelling sound sets created in Modalys.  Computer-aided composition tools in OpenMusic are employed to investigate personal ideas of virtual thematism and interpolation between musical lines.   +/*{{ :efficace:forumnet.png?nolink&160|}}*/ 
-Seen as case studies of a Californian composer using computer music tools and programming environments, these works bring up questions concerning the prevalent composer/programmer paradigm How does a person who is first a composer and only secondarily and of necessity a programmer, engage fruitfully with the vast possibilities and intricacies afforded by such tools?  Given the realities of a one-person development teamcan a composer's vision sometimes be served by adopting a hybrid approach combining aspects of the ideal flexibility and musicality of the "totally live" real-time approach with substantial "time-deferred," pre-recorded elements? [...]+ 
+[[http://forumnet.ircam.fr/shop/fr/forumnet/43-openmusic.html|OM 6.7]] includes the first prototype of a redesigned scheduling system and audio architecture. It has been ported to [[http://repmus.ircam.fr/openmusic/linux|Linux]] thanks to a collaboration with Anders Vinjar and the [[http://www.bek.no/|BEK]] center (Bergen, Norway), and coupled with several new external audio/MIDI rendering systems such as Jack, FluidSynth, MPlayer…
 +The OM-Faust library (by Dimitri Bouche, developed in the framework of the [[http://inedit.ircam.fr/|Inedit]] project), as well as new releases of the OMPrisma and [[http://sourceforge.net/projects/omsox/|OM-SoX]] libraries presented by Marlon Schumacher ([[http://www.cirmmt.mcgill.ca|CIRMMT]] / McGill University, Montréal) are compatible with this new version and controlled by this new architecture.
  
 ;;# ;;#
-=> [[http://www.ircam.fr/139.html?event=1307|Séminaires Recherche et Création de l'IRCAM]]\\
+=> http://forumnet.ircam.fr/fr/ateliers2013/
-Salle Stravinsky\\
+/* http://archiprod-externe.ircam.fr/video/VI02049100-564.mp4 */
-26 mai 2014, 12h-13h+
 ;;# ;;#
-</hidden>
 + 
 +=== Meeting at UC Berkeley Center for New Music and Audio Technologies (CNMAT) === 
 + 
 +**CNMAT, Berkeley, 11/09/2013** 
 + 
 + 
+/*{{ :efficace:cnmat.png?nolink&200|}}*/
 + 
+**The current and future state of computer-aided composition.** In this talk we discussed the fundamentals of computer-aided composition and the computational models present in OpenMusic, which set it apart from reactive, real-time environments such as Max/MSP. We pointed to some directions for future research in computer-aided composition and drew concrete objectives for the project.
\\
-===== ===== 
  
+An event co-sponsored by the UC Berkeley [[http://music.berkeley.edu/|Department of Music]], the [[http://townsendcenter.berkeley.edu/|Townsend Center for Humanities]], and [[http://cnmat.berkeley.edu/|CNMAT]].
  
 +;;#
 +=> http://cnmat.berkeley.edu/event/2013/09/11/open_music_computer_aided_composition_discussion_jean_bresson
 +;;#
  
  
-<;hidden  onHidden=&quot;**IRCAM6/05/2014:\\ Lisp for music technology Workshop at the European Lisp Symposium ELS'2014**" onVisible="**IRCAM6/05/2014:\\ Lisp for music technology Workshop at the European Lisp Symposium ELS'2014**"> + 
-{{ :efficace:lambda.png?nolink&80|}}+ 
 + 
 +====== Conferences ====== 
 + 
 +;#; 
 +**=&gtSee alo the [[.:publi|publications]] page.** 
 +;#; 
 + 
 +=== 10ème Colloque sur la Modélisation des Systèmes Réactifs (MSR 2015) === 
+**Inria Nancy-Grand Est, 18-20 November 2015**
 + 
 + 
 +  * **Articulation dynamique de structures temporelles pour l’informatique musicale**\\ Dimitri BoucheJean Bresson 
 + 
 +=> [[http://msr2015.loria.fr/|MSR 2015]]\\ 
 + 
 + 
 + 
 +=== inSonic 2015: Aesthetic concepts of spatial audio in sound, music and sound-art === 
 + 
 +{{ :efficace:insonic.png?150|}} 
 + 
 + 
 +27-28 Nov. 2015 
+ZKM - HfG, Karlsruhe, Germany
 + 
+  * **Interactive-Algorithmic Control of Sound Spatialization**\\ Jérémie Garcia, Jean Bresson, Thibaut Carpentier, Marlon Schumacher, Xavier Favory
 + 
 +;;# 
 +=> [[http://insonic2015.org/|InSonic 2015]]\\ 
 +;;# 
 \\ \\
-The power and expressivity of Lisp make it a valuable language to musicians for exploring high-level compositional processes, and this language is a fundamental support for computer-aided composition research and creation.
+=== 27ème Conférence Francophone sur l’interaction Homme-Machine (IHM 2015) ===
-In this session we will present an overview of our current projects and developmentsand discuss the challengesissues and perspectives for using Lisp in new music technologies such as digital signal processing and real-time systems.+ 
 +{{ :efficace:ihm15.png?200|}} 
 + 
+**Toulouse, 27-30 October 2015**
 + 
 +  * **Trajectoires : une application mobile pour le contrôle et l’écriture de la spatialisation sonore**\\ Xavier FavoryJérémie GarciaJean Bresson
  
 ;;# ;;#
-=> [[http://www.european-lisp-symposium.org/|7th  European Lisp Symposium]]\\
+=> [[http://ihm2015.afihm.org/programme.html|Programme complet de la conférence IHM 2015]]
-IRCAM, Paris. 5-6 May, 2014+
 ;;# ;;#
-</hidden> 
 \\ \\
-===== ===== 
  
  
 +=== OpenMusic @ Linux Audio Conference === 
 +
 +**ZKM -- Karlsruhe, 3/05/2014** 
  
-<hidden  onHidden="**ZKM -- Karlsruhe, 3/05/2014:\\ OpenMusic @ Linux Audio Conference**" onVisible="**ZKM -- Karlsruhe, 3/05/2014:\\ OpenMusic @ Linux Audio Conference**"> 
 {{ :efficace:zkm.jpg?nolink&170|}} {{ :efficace:zkm.jpg?nolink&170|}}
-\\+
The Linux version of OpenMusic was developed by Anders Vinjar with the support of the [[http://www.bek.no/|BEK]] center (Bergen, Norway). It embeds parts of the new scheduling and external audio/MIDI rendering systems. The presentation is included in the "Music Programming" session of LAC'2014 to be held at ZKM in Karlsruhe.
  
Ligne 567: Ligne 533:
ZKM -- Karlsruhe, Germany 1-4 May, 2014
;;#
-</hidden> 
-\\ 
-*/ 
-\\ 
-\\ 
  
-===== 2013 =====
+====== Pedagogical actions ======
 +;#; 
 +**Courses and workshops** 
 +;#;
  
 +{{ :efficace:pac.jpg?nolink&200 |}}
  
-{{ :efficace:pac.jpg?nolink&150|}} 
  
-=== IRCAM composition and computer music program, 9/12/2013 === 
  
-This presentation  at the IRCAM [[http://www.ircam.fr/cursus.html|composition and computer music program]] ("Cursus 1") was focused on the concepts of time and computation in computer music systems and introduced a **reactive computation model in OpenMusic**. It was completed with a demonstration by [[https://www.lri.fr/~garcia/|Jeremie Garcia]] on the use of this reactive model in his recent works on paper interfaces for computer music applications.+ 
 +=== CNMAT OpenMusic Workshops (UC Berkeley), 1-15/04/2016 === 
 + 
+J. Bresson: a practical introduction to the OpenMusic computer-aided composition environment.
  
 ;;# ;;#
-=> {{:efficace:documents:presentation-cursus-2013.pdf|Slides of the presentation [PDF]}}+http://cnmat.berkeley.edu/event/2016/04/15/cnmat_openmusic_workshops_jean_bresson_ircam
 ;;# ;;#
  
------ 
-=== Seminar: Time, rhythm and arithmetics === 
  
-**MaMuX seminar series, IRCAM, 6/12/2013**
+=== IRCAM Academie, 20/06/2015 ===
- +
-{{ :efficace:time.png?nolink&200|}} +
- +
-In the framework of the Ircam [[http://repmus.ircam.fr/mamux/|MaMuX]] seminar series we propose this [[http://repmus.ircam.fr/mamux/saisons/saison13-2013-2014/2013-12-06|special session]] dedicated to formal/mathematical theory of time and rhythm in music composition and performance. **Invited speakers**: Philippe Riot, Alain Le Méhauté (Federal University of Kazan, Russia), Jean-Louis Giavitto (IRCAM - CNRS), Karim Haddad (Composer, IRCAM).+
  
+This 3h presentation by J. Bresson and J. Garcia to the Academie attendees was part of the //Composition Workshop// of the Manifeste festival 2015.
+It was dedicated to drawing and gesture input in computer-aided composition and OpenMusic.
;;#
http://manifeste2015.ircam.fr/academie/atelier-de-composition/
;;#
  
=== IRCAM composition and computer music program, 23/02/2015 ===

This 3h presentation to the IRCAM students of the composition and computer music program focused on recent developments and applications of the Efficace project: interaction and reactive processes in OpenMusic (J. Bresson), interfaces for the control of sound spatialization (J. Garcia), and guided improvisation (J. Nika).
;;#
http://www.ircam.fr/cursus.html
;;#

=== ATIAM Master's program, Ircam / UPMC, 2 and 9/12/2014 ===
  
These two 3h courses in the ATIAM Master's program (Acoustics, Signal Processing and Computer Science applied to Music) were practical introductory sessions to OpenMusic.
;;#
http://www.atiam.ircam.fr/en
;;#
=== CIEE Program, 3-4/07/2014 ===

This 12h course was a general introduction to computer-aided composition and OpenMusic.
It was given during IRCAM Manifeste 2014 for the "Summer contemporary music creation + critique" program of the CIEE (Council on International Educational Exchange).
;;#
http://www.ciee.org/study-abroad/france/paris/summer-contemporary-music-creation-critique/
;;#
=== IRCAM professional training, Paris, 10/03/2014 ===

The last level of this series of professional training sessions was dedicated to programming libraries and integrating foreign code in the OpenMusic environment.
;;#
http://formations.ircam.fr/shop/en/openmusic/18-openmusic-case-studies.html
;;#
  
=== ACROE, Grenoble, 31/01/2014 ===

This 3h workshop was an introduction to OpenMusic for researchers from the [[http://acroe.imag.fr/|ACROE]] research center and the AST (Arts, Sciences & Technology) program of PHELMA / INP Grenoble.
  
  
=== IRCAM composition and computer music program, 9/12/2013 ===

This presentation at the IRCAM [[http://www.ircam.fr/cursus.html|composition and computer music program]] ("Cursus 1") focused on the concepts of time and computation in computer music systems and introduced a **reactive computation model in OpenMusic**. It was complemented by a demonstration by [[https://www.lri.fr/~garcia/|Jeremie Garcia]] of the use of this reactive model in his recent work on paper interfaces for computer music applications.
;;#
http://www.ircam.fr/cursus.html\\
=> {{:efficace:documents:presentation-cursus-2013.pdf|Slides of the presentation [PDF]}}
;;#
  
 


efficace/events.1429004000.txt.gz · Last modified: 2015/04/14 11:33 by Jean Bresson