MERCI : Mixed Musical Reality with Creative Instruments

ANR-19-CE33-0010

MERCI is a collaborative research and development project funded by the French National Research Agency (ANR).


Presentation

Improvisation can be seen as a major driving force in human interactions, strategic in every aspect of communication and action. In its highest form, improvisation mixes structured, planned, directed action with hardly predictable local decisions and deviations that optimize adaptation to the context, express the creative self in a unique way, and stimulate coordination and cooperation between agents. An invaluable observation deck for understanding, modeling, and promoting co-creativity in a context of distributed intelligence, improvisation is an indispensable ability that any cyber-human system should handle in an expert way. Improvisation is instantiated in its most refined form in music, where the strongest constraints govern the elaboration of highly complex multi-dimensional, multi-scale, multi-agent actions in a cooperative and timely fashion, so as to achieve creative social and cultural cooperation.


Setting up powerful and realistic human-machine environments for improvisation requires going beyond the mere software engineering of creative agents with audio-signal listening and generation capabilities, which has been the dominant approach until now. The partners, Ircam STMS Lab, EHESS Cams Lab, and the startup company HyVibe, propose to drastically renew the paradigm of human-machine improvised interaction by bridging the gap between the computing logics of co-creative musical agents and mixed-reality setups anchored in the physics of acoustic instruments.

In such setups of “physical interreality” (a mixed reality scheme in which the physical world is actively modified by human action), human subjects will be immersed and engaged in tangible actions, achieving full embodiment in the digital, physical, and social worlds thanks to a joint effort gathering experts from a broad interdisciplinary spectrum.

The main objective of this project is to create the scientific and technological conditions for mixed reality musical systems enabling human-machine improvised interaction, based on the interrelation of creative digital agents and active acoustic control in musical instruments. We call such mixed reality devices Creative Instruments. Functionally integrating creative artificial intelligence and active control of acoustics into the organological heart of the musical instrument, in order to foster plausible physical interreality situations, requires the synergy of highly interdisciplinary public and private research such as the partners bring. Such progress is likely to disrupt artistic and social practices, eventually impacting the music industry as well as amateur and professional music practices in a powerful way.


MERCI's full Scientific Proposal

MERCI project resources

Readings

Team

  • Gérard Assayag (Ircam, coord)
  • Marc Chemillier (EHESS, partner)
  • Adrien Mamou-Mani (HyVibe, partner)
  • Jérôme Nika (Ircam)
  • Joakim Borg (Ircam)
  • Mikhail Malt (Ircam)
  • Tristan Carsault (Ircam)
  • Vasiliki Zachari (Ircam)
  • Sylvie Benoit (Ircam)
  • François Beaulieu (HyVibe)
  • Louis Chouraki (HyVibe)
  • Matt Volsky (HyVibe)
  • Yuri Prado (CAMS, EHESS)
  • Associated musicians: Charles Kely Zana-Rotsy, Rémy Fox, Steve Lehman, Fred Maurin, Bernard Lubat, Lucas Lipari-Mayer, Nicolas Crosse, Samuel Favre, Orchestre National de Jazz, Ensemble Intercontemporain, Broken Roots, Thomas Dutronc, Michel Haumont, Brooke Brown, Raphaël Imbert, Justin Johnson, Kfir Ochaion
 


merci/home.txt · Last modified: 2022/02/13 18:56 by Gérard Assayag