Human-Computer Interactions in Music - Berkeley 2016

UC Berkeley, May 6th & 13th, 2016

This series of meetings and presentations about human-computer interactions applied to music will gather members of research centers from UC Berkeley, IRCAM (France), Goldsmiths, University of London (UK), McGill University and Simon Fraser University (Canada) at the Center for New Music and Audio Technologies (CNMAT) and the Berkeley Institute of Design (BiD).

Friday May 6th

CNMAT main room

CNMAT event

Baptiste Caramiaux (McGill University, Montréal / Ircam, Paris) and Frédéric Bevilacqua (Ircam, Paris)
Movement and Sound Interaction: A Research Overview from Music Performance to Motor Cognition
Abstract: We will present an overview of research on movement and sound interaction, reporting on experiments in music performance, sonic interaction design, and motor cognition. We will start by presenting early work on augmented musical instruments developed in collaboration with composers. This research then led us to explore novel interaction designs with digital sounds through tangible interfaces and participatory methodologies. In this research, we have advocated a computational design approach involving physical models, probabilistic models, and data-driven machine learning algorithms. More recently, we have been exploring the use of these methods to analyse motor learning and control in music performance, with applications in rehabilitation and pedagogy.
+Guest: David Mellis (BiD, UC Berkeley)
Machine Learning for Makers: Supporting Novice Analysis of Real-Time Sensor Data
Our ESP (Example-based Sensor Predictions) software recognizes patterns in real-time sensor data, like gestures made with an accelerometer or sounds recorded by a microphone. The machine learning algorithms that power this pattern recognition are specified in Arduino-like code, while the recording and tuning of example sensor data is done in an interactive graphical interface. We’re working on building up a library of code examples for different applications so that Arduino users can easily apply machine learning to a broad range of problems.
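As a rough illustration of the example-based idea behind this kind of tool (a minimal Python sketch, not ESP's actual code or API), each gesture class can be defined by recorded example windows of sensor data, and a new window can be labeled by its nearest recorded example:

```python
import math

def distance(a, b):
    # Euclidean distance between two equal-length feature windows
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class ExampleClassifier:
    """Toy 1-nearest-neighbor classifier over recorded sensor examples."""

    def __init__(self):
        self.examples = []  # list of (label, feature_window) pairs

    def record(self, label, window):
        # In an ESP-like workflow, this step corresponds to recording
        # example sensor data interactively for a given class.
        self.examples.append((label, list(window)))

    def predict(self, window):
        # Label an incoming window with the class of its nearest example.
        label, _ = min(self.examples, key=lambda e: distance(e[1], window))
        return label

# Hypothetical accelerometer-magnitude windows for two gestures
clf = ExampleClassifier()
clf.record("shake", [0.9, 1.1, 0.8, 1.2])
clf.record("still", [0.0, 0.1, 0.0, 0.1])
print(clf.predict([1.0, 1.0, 0.9, 1.1]))  # prints "shake"
```

Real systems use richer features and more robust classifiers, but the workflow is the same: record labeled examples, then match live sensor windows against them.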

⇒ Presentations followed by open discussions, lunch and a CNMAT guest lecture:

Marcelo M. Wanderley (IDMIL, McGill University, Montreal)
Interdisciplinary Research on Music, Science and Engineering: Applications to the Design of Digital Musical Instruments

Friday May 13th

Jacobs Institute for Design Innovation, room 210

CNMAT event

Jérémie Garcia (Goldsmiths, University of London)
Interactive Tools to Support Music Composition
Abstract: Music composition is a highly creative activity that usually involves a combination of writing on paper, instrumental playing (whether physical or digital), and interaction with computer-aided environments. These environments offer advanced computational possibilities but remain much more limited in terms of interaction. Designing computer-aided composition tools requires not only a focus on technical aspects, such as speeding up the input of musical scores or improving sound synthesis algorithms, but also support for the creative aspects of music composition. We still need models and tools tailored to the inherent complexity of music composition that can help composers iteratively design and assess their musical ideas. In this presentation, I will describe work conducted in close collaboration with composers to understand the composition process and to design new interactive tools that support their creative needs. The first part will focus on interactive paper, a technology that captures notes written with a pen on paper, as a way to extend existing computer-aided composition tools with additional space for the expression and exploration of musical ideas. The second part will focus on the production and manipulation of temporal control data for sound spatialisation, a technique that creates the illusion that sound sources come from various directions in space.
Jules Françoise (Simon Fraser University, Vancouver)
Machine Learning for User-Centered Motion-Sound Interaction Design
Abstract: Designing the relationship between motion and sound is essential to the creation of interactive systems for music, performing arts, and somatic practices. This presentation will outline the Mapping-by-Demonstration approach: a conceptual, experiential, and computational framework that allows users to craft interactive technologies through embodied demonstrations of personal associations between movement and sound. We will present the concepts, models, and applications of this framework for interactive sonification, and discuss the potential and challenges of machine learning for user-centered design rather than problem solving.
+Guest: Cumhur Erkut (Aalborg University Copenhagen)
From Ecological Sounding Artifacts Towards Sonic Artifact Ecologies
The discipline of sonic interaction design has focused on the interaction between a single user and a single artifact. This neglects one of the fundamental aspects of music: its nature as a social and interactive experience. In this talk we propose sonic artifact ecologies as a means to examine interactions between one or many users and one or many artifacts. Case studies from a recent workshop on product sound design are examined, relations to ongoing European projects are drawn, and the challenges of designing sonic interactions for networked interactive devices are highlighted.


Jean Bresson (IRCAM UMR STMS)
John MacCallum (CNMAT/BiD, UC Berkeley)

CNMAT (Center for New Music and Audio Technologies)
1750 Arch Street Berkeley, CA

Jacobs Institute for Design Innovation
2530 Ridge Rd, Berkeley, CA


efficace/events/workshop-berkeley.txt · Last modified: 2017/05/28 00:08 by Jean Bresson