Machine Learning of Personal Gesture Variation in Music Conducting
Gesture Elicitation and Interaction
/
Sarasua, Alvaro
/
Caramiaux, Baptiste
/
Tanaka, Atau
Proceedings of the ACM CHI'16 Conference on Human Factors in Computing
Systems
2016-05-07
v.1
p.3428-3432
© Copyright 2016 ACM
Summary: This note presents a system that learns expressive and idiosyncratic gesture
variations for gesture-based interaction. The system is used as an interaction
technique in a music conducting scenario where gesture variations drive music
articulation. A simple model based on Gaussian Mixture Modeling is used to
allow the user to configure the system by providing variation examples. The
system performance and the influence of user musical expertise are evaluated in
a user study, which shows that the model is able to learn idiosyncratic
variations that allow users to control articulation, with better performance
for users with musical expertise.
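Illustration: As a rough sketch of the modeling approach (not the paper's implementation), one can fit a Gaussian Mixture Model per user-provided variation class and score incoming gesture features against each model. In the minimal Python/scikit-learn sketch below, the feature vectors, class labels, and toy data are hypothetical.

    # Fit one GMM per variation class from user-provided examples,
    # then pick the class with the highest log-likelihood at runtime.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    def train_variation_models(examples_by_class, n_components=2):
        """examples_by_class: dict mapping a variation label (hypothetical,
        e.g. 'legato'/'staccato') to an (n_examples, n_features) array."""
        models = {}
        for label, X in examples_by_class.items():
            gmm = GaussianMixture(n_components=n_components, covariance_type="full")
            gmm.fit(X)
            models[label] = gmm
        return models

    def classify_variation(models, x):
        """Return the label whose model gives feature vector x the
        highest log-likelihood."""
        scores = {label: gmm.score(x.reshape(1, -1)) for label, gmm in models.items()}
        return max(scores, key=scores.get)

    # Toy usage with random stand-in data for two variation classes.
    rng = np.random.default_rng(0)
    examples = {"legato": rng.normal(0.0, 1.0, (20, 4)),
                "staccato": rng.normal(3.0, 1.0, (20, 4))}
    models = train_variation_models(examples)
    print(classify_variation(models, rng.normal(3.0, 1.0, 4)))  # likely "staccato"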
Human-Centred Machine Learning
Workshop Summaries
/
Gillies, Marco
/
Fiebrink, Rebecca
/
Tanaka, Atau
/
Garcia, Jérémie
/
Bevilacqua, Frédéric
/
Heloir, Alexis
/
Nunnari, Fabrizio
/
Mackay, Wendy
/
Amershi, Saleema
/
Lee, Bongshin
/
d'Alessandro, Nicolas
/
Tilmanne, Joëlle
/
Kulesza, Todd
/
Caramiaux, Baptiste
Extended Abstracts of the ACM CHI'16 Conference on Human Factors in
Computing Systems
2016-05-07
v.2
p.3558-3565
© Copyright 2016 ACM
Summary: Machine learning is one of the most important and successful techniques in
contemporary computer science. It involves the statistical inference of models
(such as classifiers) from data. It is often conceived in a very impersonal
way, with algorithms working autonomously on passively collected data. However,
this viewpoint hides considerable human work of tuning the algorithms,
gathering the data, and even deciding what should be modeled in the first
place. Examining machine learning from a human-centered perspective includes
explicitly recognising this human work, as well as reframing machine learning
workflows based on situated human working practices, and exploring the
co-adaptation of humans and systems. A human-centered understanding of machine
learning in human context can lead not only to more usable machine learning
tools, but to new ways of framing learning computationally. This workshop will
bring together researchers to discuss these issues and suggest future research
questions aimed at creating a human-centered approach to machine learning.
Form Follows Sound: Designing Interactions from Sonic Memories
Speech & Auditory Interfaces
/
Caramiaux, Baptiste
/
Altavilla, Alessandro
/
Pobiner, Scott G.
/
Tanaka, Atau
Proceedings of the ACM CHI'15 Conference on Human Factors in Computing
Systems
2015-04-18
v.1
p.3943-3952
© Copyright 2015 ACM
Summary: Sonic interaction is the continuous relationship between user actions and
sound, mediated by some technology. Because interaction with sound may be
task-oriented or experience-based, it is important to understand the nature of
action-sound relationships in order to design rich sonic interactions. We
propose a participatory approach to sonic interaction design that first
considers the affordances of sounds in order to imagine embodied interaction,
and based on this, generates interaction models for interaction designers
wishing to work with sound. We describe a series of workshops, called Form
Follows Sound, where participants ideate imagined sonic interactions, and then
realize working interactive sound prototypes. We introduce the Sonic Incident
technique as a way to recall memorable sound experiences. We identify three
interaction models for sonic interaction design: conducting; manipulating;
substituting. These three interaction models offer interaction designers and
developers a framework on which they can build richer sonic interactions.
Understanding Gesture Expressivity through Muscle Sensing
Special Issue on Physiological Computing for Human-Computer Interaction
/
Caramiaux, Baptiste
/
Donnarumma, Marco
/
Tanaka, Atau
ACM Transactions on Computer-Human Interaction
2015-01
v.21
n.6
p.31
© Copyright 2015 ACM
Summary: Expressivity is a visceral capacity of the human body. To understand what
makes a gesture expressive, we need to consider not only its spatial placement
and orientation but also its dynamics and the mechanisms enacting them. We
start by defining gesture and gesture expressivity, and then we present
fundamental aspects of muscle activity and ways to capture information through
electromyography and mechanomyography. We present pilot studies that inspect
the ability of users to control spatial and temporal variations of 2D shapes
and that use muscle sensing to assess expressive information in gesture
execution beyond space and time. This leads us to the design of a study that
explores the notion of gesture power in terms of control and sensing. The
results give interaction designers insights for moving beyond simplistic
gestural interaction, towards the design of interactions that draw on nuances
of expressive gesture.
Adaptive Gesture Recognition with Variation Estimation for Interactive
Systems
Special Issue on Activity Recognition for Interaction
/
Caramiaux, Baptiste
/
Montecchio, Nicola
/
Tanaka, Atau
/
Bevilacqua, Frédéric
ACM Transactions on Interactive Intelligent Systems
2015-01
v.4
n.4
p.18
© Copyright 2015 ACM
Summary: This article presents a gesture recognition/adaptation system for
human-computer interaction applications that goes beyond activity classification and
that, as a complement to gesture labeling, characterizes the movement
execution. We describe a template-based recognition method that simultaneously
aligns the input gesture to the templates using a Sequential Monte Carlo
inference technique. Contrary to standard template-based methods based on
dynamic programming, such as Dynamic Time Warping, the algorithm has an
adaptation process that tracks gesture variation in real time. The method
continuously updates, during execution of the gesture, the estimated parameters
and recognition results, which offers key advantages for continuous
human-machine interaction. The technique is evaluated in several different ways:
Recognition and early recognition are evaluated on 2D onscreen pen gestures;
adaptation is assessed on synthetic data; and both early recognition and
adaptation are evaluated in a user study involving 3D free-space gestures. The
method is robust to noise, and successfully adapts to parameter variation.
Moreover, it performs recognition as well as or better than nonadapting offline
template-based methods.
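Illustration: To make the adaptation idea concrete, the sketch below implements a generic particle filter over template alignment: each particle carries a template hypothesis, a phase (position in the template), and speed/scale variation parameters, and is weighted by how well its scaled template predicts the incoming observation. This is an assumed, simplified reconstruction in Python, not the authors' exact algorithm; the state dynamics and noise levels are placeholders.

    # Templates are 1-D sampled gesture signals; observations arrive
    # one sample at a time.
    import numpy as np

    rng = np.random.default_rng(1)

    def init_particles(n, n_templates):
        return {
            "tpl":   rng.integers(0, n_templates, n),  # template hypothesis
            "phase": np.zeros(n),                      # position in template, 0..1
            "speed": np.full(n, 1.0),                  # relative execution speed
            "scale": np.full(n, 1.0),                  # amplitude scaling
            "w":     np.full(n, 1.0 / n),              # importance weights
        }

    def step(p, templates, obs, dt=0.01, obs_std=0.2):
        n = len(p["w"])
        # Propagate variation parameters with random-walk noise (placeholder dynamics).
        p["speed"] += rng.normal(0, 0.05, n)
        p["scale"] += rng.normal(0, 0.02, n)
        p["phase"] = np.clip(p["phase"] + p["speed"] * dt, 0.0, 1.0)
        # Each particle predicts the observation from its scaled template.
        pred = np.empty(n)
        for k, tpl in enumerate(templates):
            idx = p["tpl"] == k
            pos = (p["phase"][idx] * (len(tpl) - 1)).astype(int)
            pred[idx] = p["scale"][idx] * tpl[pos]
        # Reweight by a Gaussian observation likelihood, then normalize.
        p["w"] *= np.exp(-0.5 * ((obs - pred) / obs_std) ** 2)
        p["w"] = np.maximum(p["w"], 1e-300)
        p["w"] /= p["w"].sum()
        # Resample when the effective sample size collapses.
        if 1.0 / (p["w"] ** 2).sum() < n / 2:
            keep = rng.choice(n, n, p=p["w"])
            for key in p:
                p[key] = p[key][keep]
            p["w"] = np.full(n, 1.0 / n)
        return p

    # At any time, the recognized gesture is the template with the largest
    # total weight, and the weighted means of speed and scale estimate the
    # ongoing variation, available during execution rather than after it.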
Posters
NIME 2014: New Interfaces for Musical Expression
2014-06-30
p.26
© Copyright 2014 Authors
A Gesture Detection with Guitar Pickup and Earphones
+ Suh, Sangwon
+ Lee, Jeong-seob
+ Yeo, Woon Seung
A Max/MSP Approach for Incorporating Digital Music via Laptops in Live Performances of Music Bands
+ Amo, Yehiel
+ Zissu, Gil
+ Eloul, Shaltiel
+ Shlomi, Eran
+ Schukin, Dima
+ Kalifa, Almog
A Real Time Common Chord Progression Guide on the Smartphone for Jamming Pop Song on the Music Keyboard
+ Lui, Simon
An Exploration of Peg Solitaire as a Compositional Tool
+ Keatch, Kirsty
Auraglyph: Handwritten Computer Music Composition and Design
+ Salazar, Spencer
+ Wang, Ge
Body As Instrument: Performing with Gestural Interfaces
+ Mainsbridge, Mary
+ Beilharz, Kirsty
Circle Squared and Circle Keys -- Performing on and with an unstable live algorithm for the Disklavier
+ Dahlstedt, Palle
Composing Embodied Sonic Play Experiences: Towards Acoustic Feedback Ecology
+ van Troyer, Akito
Design & Evaluation of an Accessible Hybrid Violin Platform
+ Overholt, Dan
+ Gelineck, Steven
Dynamical Interactions with Electronic Instruments
+ Mudd, Tom
+ Dalton, Nick
+ Holland, Simon
+ Mulholland, Paul
eMersion | Sensor-controlled Electronic Music Modules & Digital Data Workstation
+ Udell, Chet
+ Sain, James Paul
FingerSynth: Wearable Transducers for Exploring the Environment and Playing Music Everywhere
+ Dublon, Gershon
+ Paradiso, Joseph A.
Hand and Finger Motion-Controlled Audio Mixing Interface
+ Ratcliffe, Jarrod
How to Make Embedded Acoustic Instruments
+ Berdahl, Edgar
Interactive Parallax Scrolling Score Interface for Composed Networked Improvisation
+ Canning, Rob
Mobile Device Percussion Parade
+ Snyder, Jeff
+ Sarwate, Avneesh
+ Chen, Carolyn
+ Fishman, Noah
+ Collins, Quinn
+ Ergun, Cenk
+ Mulshine, Michael
Musical Interface to Audiovisual Corpora of Arbitrary Instruments
+ Neupert, Max
+ Goßmann, Joachim
New Open-Source Interfaces for Group Based Participatory Performance of Live Electronic Music
+ Barraclough, Timothy J
+ Murphy, Jim
+ Kapur, Ajay
Orphion: A gestural multi-touch instrument for the iPad
+ Trump, Sebastian
+ Bullock, Jamie
Pd-L2Ork Raspberry Pi Toolkit as a Comprehensive Arduino Alternative in K-12 and Production Scenarios
+ Bukvic, Ivica
PiaF: A Tool for Augmented Piano Performance Using Gesture Variation Following
+ Van Zandt-Escobar, Alejandro
+ Caramiaux, Baptiste
+ Tanaka, Atau
Pitch Canvas: Touchscreen Based Mobile Music Instrument
+ Strylowski, Bradley
+ Allison, Jesse
Reappropriating Museum Collections: Performing Geology Specimens and Meteorology Data as New Instruments for Musical Expression
+ Bowers, John
+ Shaw, Tim
Rub Synth: A Study of Implementing Intentional Physical Difficulty Into Touch Screen Music Controllers
+ Sarier, Ozan
Sound Analyser: A Plug-in for Real-Time Audio Analysis in Live Performances and Installations
+ Stark, Adam
Tangle: a Flexible Framework for Performance with Advanced Robotic Musical Instruments
+ Mathews, Paul
+ Morris, Ness
+ Murphy, Jim
+ Kapur, Ajay
+ Carnegie, Dale
The Politics of Laptop Ensembles
+ Knotts, Shelly
+ Collins, Nick
Muscular Interactions. Combining EMG and MMG sensing for musical practice
Session 2: Multimodal
/
Donnarumma, Marco
/
Caramiaux, Baptiste
/
Tanaka, Atau
NIME 2013: New Interfaces for Musical Expression
2013-05-27
p.4
Keywords: NIME, sensorimotor system, EMG, MMG, biosignal, multimodal, mapping
© Copyright 2013 Authors
Summary: We present the first combined use of the electromyogram (EMG) and
mechanomyogram (MMG), two biosignals that result from muscular activity, for
interactive music applications. We exploit differences between these two
signals, as reported in the biomedical literature, to create bi-modal
sonification and sound synthesis mappings that allow performers to distinguish
the two components in a single complex arm gesture. We study non-expert
players' ability to articulate the different modalities. Results show that
purposely designed gestures and mapping techniques enable novices to rapidly
learn to independently control the two biosignals.
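Illustration: The bi-modal mapping idea can be sketched as follows: estimate the energy envelope of each biosignal and route each envelope to a separate synthesis parameter, so the two muscle components of one arm gesture remain independently audible. A minimal Python sketch under assumed parameter choices; the cutoff and grain-density targets are hypothetical, not the paper's design.

    import numpy as np

    def rms_envelope(signal, win=256):
        """Running RMS over non-overlapping windows."""
        n = len(signal) // win
        frames = signal[:n * win].reshape(n, win)
        return np.sqrt((frames ** 2).mean(axis=1))

    def bimodal_mapping(emg, mmg):
        """Map EMG energy to a filter cutoff and MMG energy to a grain
        density, so each muscle component is heard independently."""
        e, m = rms_envelope(emg), rms_envelope(mmg)
        cutoff = 200.0 + 8000.0 * e / (e.max() + 1e-9)   # Hz
        density = 1.0 + 50.0 * m / (m.max() + 1e-9)      # grains per second
        return cutoff, density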
Machine Learning of Musical Gestures
Session 10: Gesture | Space
/
Caramiaux, Baptiste
/
Tanaka, Atau
NIME 2013: New Interfaces for Musical Expression
2013-05-27
p.32
Keywords: Machine Learning, Data mining, Musical Expression, Musical Gestures,
Analysis, Control, Gesture, Sound
© Copyright 2013 Authors
Summary: We present an overview of machine learning (ML) techniques and their
application in interactive music and new digital instrument design. We first
provide the non-specialist reader with an introduction to two ML tasks,
classification and regression, that are particularly relevant for gestural
interaction. We then present a review of the literature in current NIME
research that uses ML in musical gesture analysis and gestural sound control.
We describe the ways in which machine learning is useful for creating
expressive musical interaction, and in turn why live music performance presents
a pertinent and challenging use case for machine learning.
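Illustration: For the non-specialist reader, the two tasks can be sketched in a few lines: a classifier maps a gesture feature vector to a discrete label (e.g. to trigger a sound), while a regressor maps it to a continuous control value (e.g. a filter cutoff). A minimal scikit-learn sketch with random stand-in data; the features, labels, and target parameter are all hypothetical.

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(2)
    X = rng.normal(size=(100, 6))        # gesture feature vectors
    labels = rng.integers(0, 3, 100)     # discrete gesture classes
    control = X[:, 0] * 0.5 + 0.5        # continuous control parameter

    clf = SVC().fit(X, labels)                          # classification
    reg = MLPRegressor(max_iter=2000).fit(X, control)   # regression

    x_new = rng.normal(size=(1, 6))
    print(clf.predict(x_new), reg.predict(x_new))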
Towards Gestural Sonic Affordances
Posters (1)
/
Altavilla, Alessandro
/
Caramiaux, Baptiste
/
Tanaka, Atau
NIME 2013: New Interfaces for Musical Expression
2013-05-27
p.51
Keywords: Gestural embodiment of sound, Affordances, Mapping
© Copyright 2013 Authors
Summary: We present a study that explores the affordance evoked by sound and
sound-gesture mappings. In order to do this, we make use of a sensor system
with minimal form factor in a user study that minimizes cultural association.
The present study focuses on understanding how participants describe sounds and
gestures produced while playing designed sonic interaction mappings. This
approach seeks to move from object-centric affordance towards investigating
embodied gestural sonic affordances.
Muscular Interactions. Combining EMG and MMG sensing for musical practice
Demos (1)
/
Donnarumma, Marco
/
Caramiaux, Baptiste
/
Tanaka, Atau
NIME 2013: New Interfaces for Musical Expression
2013-05-27
p.70
Beyond recognition: using gesture variation for continuous interaction
alt.chi: Design Lessons
/
Caramiaux, Baptiste
/
Bevilacqua, Frederic
/
Tanaka, Atau
Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing
Systems
2013-04-27
v.2
p.2109-2118
© Copyright 2013 ACM
Summary: Gesture-based interaction is widespread in touch screen interfaces. The goal
of this paper is to tap the richness of expressive variation in gesture to
facilitate continuous interaction. We achieve this through novel techniques of
adaptation and estimation of gesture characteristics. We describe two
experiments. The first aims at understanding whether users can control certain
gestural characteristics and if that control depends on gesture vocabulary. The
second study uses a machine learning technique based on particle filtering to
simultaneously recognize and measure variation in a gesture. With this
technology, we create a gestural interface for a playful photo processing
application. From these two studies, we show that 1) multiple characteristics
can be varied independently in slower gestures (Study 1), and 2) users find
gesture-only interaction less pragmatic but more stimulating than traditional
menu-based systems (Study 2).
De-Mo: designing action-sound relationships with the mo interfaces
Interactivity: exploration
/
Bevilacqua, Frédéric
/
Schnell, Norbert
/
Rasamimanana, Nicolas
/
Bloit, Julien
/
Flety, Emmanuel
/
Caramiaux, Baptiste
/
Françoise, Jules
/
Boyer, Eric
Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing
Systems
2013-04-27
v.2
p.2907-2910
© Copyright 2013 ACM
Summary: The Modular Musical Objects (MO) are an ensemble of tangible interfaces and
software modules for creating novel musical instruments or for augmenting
objects with sound. In particular, the MOs allow for designing action-sound
relationships and behaviors based on the interaction with tangible objects or
free body movements.
Such interaction scenarios can be inspired by the affordances of particular
objects (e.g. a ball, a table), or by interaction metaphors based on the
playing techniques of musical instruments or games. We describe specific
examples of
action-sound relationships that are made possible by the MO software modules
and which take advantage of machine learning techniques.
MubuFunkScatShare: gestural energy and shared interactive music
Interactivity: exploration
/
Tanaka, Atau
/
Caramiaux, Baptiste
/
Schnell, Norbert
Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing
Systems
2013-04-27
v.2
p.2999-3002
© Copyright 2013 ACM
Summary: We present a ludic interactive music performance that allows live recorded
sounds to be re-rendered through the users' movements. The interaction design
makes control similar to a shaker, where motion energy drives the energy of the
music being played. The instrument has been designed for musicians as
well as non-musicians and allows for multiple players. In the MubuFunkScatShare
performance, one performer plays acoustical instruments into the system,
subsequently rendering them by shaking a smartphone. He invites participation
by volunteers from the audience, resulting in a fun musical piece that includes
layers of funk guitar, scat singing, guitar solo, and beatboxing.
Movement qualities as interaction modality
Designing for the body
/
Alaoui, Sarah Fdili
/
Caramiaux, Baptiste
/
Serrano, Marcos
/
Bevilacqua, Frédéric
Proceedings of DIS'12: Designing Interactive Systems
2012-06-11
p.761-769
© Copyright 2012 ACM
Summary: In this paper, we explore the use of movement qualities as an
interaction modality. The notion of movement qualities is widely used in dance
practice and
can be understood as how the movement is performed, independently of its
specific trajectory in space. We implemented our approach in the context of an
artistic installation called A light touch. This installation invites the
participant to interact with a moving light spot reacting to the hand movement
qualities. We conducted a user experiment that showed that such an interaction
based on movement qualities tends to enhance the user experience, favouring
explorative and expressive usage.
Gestural Embodiment of Environmental Sounds: an Experimental Study
/
Caramiaux, Baptiste
/
Susini, Patrick
/
Bianco, Tommaso
/
Bevilacqua, Frédéric
/
Houix, Olivier
/
Schnell, Norbert
/
Misdariis, Nicolas
NIME 2011: New Interfaces for Musical Expression
2011-05-30
p.144-148
Keywords: Embodiment, Environmental Sound Perception, Listening, Gesture Sound
Interaction
© Copyright 2011 Authors
Summary: In this paper we present an experimental study concerning gestural
embodiment of environmental sounds in a listening context. The presented work
is part of a project aiming at modeling movement-sound relationships, with the
end goal of proposing novel approaches for designing musical instruments and
sounding objects. The experiment is based on sound stimuli corresponding to
"causal" and "non-causal" sounds. It is divided into a performance phase and an
interview. The experiment is designed to investigate possible correlation
between the perception of the "causality" of environmental sounds and different
gesture strategies for the sound embodiment. In analogy with the perception of
the sounds' causality, we propose to distinguish gestures that "mimic" a
sound's cause and gestures that "trace" a sound's morphology following temporal
sound characteristics. Results from the interviews show that, first, our
database of causal sounds leads to consistent descriptions of the action at the
origin of the sound, and participants mimic this action. Second, non-causal
sounds lead to inconsistent metaphoric descriptions of the sound, and
participants make gestures following sound "contours". Quantitatively, the
results show that gesture variability is higher for causal sounds than for
non-causal sounds.
Sound Selection by Gestures
/
Caramiaux, Baptiste
/
Bevilacqua, Frédéric
/
Schnell, Norbert
NIME 2011: New Interfaces for Musical Expression
2011-05-30
p.329-330
Keywords: Query by Gesture, Time Series Analysis, Sonic Interaction
© Copyright 2011 Authors
Summary: This paper presents a prototypical tool for sound selection driven by users'
gestures. Sound selection by gestures is a particular case of "query by
content" in multimedia databases. Gesture-to-Sound matching is based on
computing the similarity between the temporal evolution of gesture parameters
and that of sound parameters. The tool offers three algorithms for matching a
gesture query to a sound target, and lends itself to several applications in
sound design, virtual instrument design, and interactive installation.
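Illustration: One plausible way to realize such temporal matching is dynamic time warping between the gesture's parameter curve and each candidate sound's descriptor curve; the sketch below ranks sounds by DTW distance. This is an illustrative scheme, not one of the paper's three algorithms, and the descriptor choice (e.g. a loudness envelope) is an assumption.

    import numpy as np

    def dtw(a, b):
        """Classic DTW cost between two 1-D sequences."""
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def rank_sounds(gesture_curve, sound_descriptors):
        """sound_descriptors: dict of sound name -> 1-D descriptor series.
        Returns sound names sorted from best to worst match."""
        return sorted(sound_descriptors,
                      key=lambda name: dtw(gesture_curve, sound_descriptors[name]))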
From dance to touch: movement qualities for interaction design
Works-in-progress
/
Alaoui, Sarah Fdili
/
Caramiaux, Baptiste
/
Serrano, Marcos
Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems
2011-05-07
v.2
p.1465-1470
© Copyright 2011 ACM
Summary: In this paper we address the question of extending user experience on
large-scale tactile displays. Our contribution is a non-task-oriented
interaction technique based on modern dance for the creation of aesthetically
pleasing large-scale tactile interfaces. This approach applies dance movement
qualities to touch interaction, allowing for natural gestures on large
touch displays. We used specific movements from a choreographic glossary and
developed a robust movement quality recognition process. To illustrate our
approach, we propose a media installation called A light touch, where touch is
used to control a light spot reacting to movement qualities.
Towards a Gesture-Sound Cross-Modal Analysis
Gesture Processing
/
Caramiaux, Baptiste
/
Bevilacqua, Frédéric
/
Schnell, Norbert
GW 2009: Gesture Workshop
2009-02-25
p.158-170
Keywords: Gesture analysis; Gesture-Sound Relationship; Sound Perception; Canonical
Correlation Analysis
© Copyright 2009 Springer-Verlag
Summary: This article reports on the exploration of a method based on canonical
correlation analysis (CCA) for the analysis of the relationship between gesture
and sound in the context of music performance and listening. This method is a
first step in the design of an analysis tool for gesture-sound relationships.
In this exploration we used motion capture data recorded from subjects
performing free hand movements while listening to short sound examples. We
assume that even though the relationship between gesture and sound might be
more complex, at least part of it can be revealed and quantified by linear
multivariate regression applied to the motion capture data and audio
descriptors extracted from the sound examples. After outlining the theoretical
background, the article shows how the method allows for pertinent reasoning
about the relationship between gesture and sound by analysing the data sets
recorded from multiple and individual subjects.
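Illustration: The core of the method can be sketched with scikit-learn's CCA (a stand-in; the paper predates this library): given synchronized multivariate streams of motion features and audio descriptors, CCA finds projection pairs whose time courses are maximally correlated. All data below is random stand-in for motion capture features and audio descriptors.

    import numpy as np
    from sklearn.cross_decomposition import CCA

    rng = np.random.default_rng(3)
    T = 500                                  # synchronized time frames
    motion = rng.normal(size=(T, 9))         # e.g. hand position/velocity features
    shared = motion[:, :2] @ rng.normal(size=(2, 4))
    audio = shared + 0.5 * rng.normal(size=(T, 4))  # e.g. loudness, brightness, ...

    # Find projection pairs of the two streams with maximally
    # correlated time courses.
    cca = CCA(n_components=2).fit(motion, audio)
    U, V = cca.transform(motion, audio)
    for k in range(2):
        r = np.corrcoef(U[:, k], V[:, k])[0, 1]
        print(f"canonical correlation {k}: {r:.2f}")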