| Pencil Fields: An Expressive Low-Tech Performance Interface for Analog Synthesis | | BIBAK | PDF | 275 | |
| Palle Dahlstedt | |||
| I present a novel low-tech multidimensional gestural controller, based on
the resistive properties of a 2D field of pencil markings on paper. A set of
movable electrodes (+, -, ground) made from soldered stacks of coins creates a
dynamic voltage potential field in the carbon layer, and another set of movable
electrodes tap voltages from this field. These voltages are used to control
complex sound engines in an analogue modular synthesizer. Both the voltage
field and the tap electrodes can be moved freely. The design was inspired by
previous research in complex mappings for advanced digital instruments, and
provides a similarly dynamic playing environment for analogue synthesis. The
interface is cheap to build, and provides flexible control over a large set of
parameters. It is musically satisfying to play, and allows for a wide range of
playing techniques, from wild exploration to subtle expressions. I also present
an inventory of the available playing techniques, motivated by the interface
design, musically, conceptually and theatrically. The performance aspects of
the interface are also discussed. The interface has been used in a number of
performances in Sweden and Japan in 2011, and is also used by other musicians. Keywords: gestural interface, 2d, analog synthesis, performance, improvisation | |||
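As a rough, illustrative sketch of the physics the abstract describes (not the author's actual circuit), the graphite layer can be modeled as a uniform resistive sheet: pin the electrode potentials, relax Laplace's equation over a grid, then read the potential at a tap position. The grid size, electrode positions, 9 V supply and wrap-around edge handling below are all assumptions.

```python
# Illustrative sketch only: model the pencil-graphite layer as a uniform
# resistive sheet and solve Laplace's equation by Jacobi relaxation.
# Grid size, electrode positions and the 9 V supply are assumptions.
import numpy as np

N = 60                                 # grid resolution of the paper field
V = np.zeros((N, N))                   # potential at each cell, in volts
fixed = np.zeros((N, N), dtype=bool)   # cells pinned by coin electrodes

def electrode(row, col, volts):
    """Pin a movable coin electrode at a grid cell to a fixed potential."""
    V[row, col] = volts
    fixed[row, col] = True

electrode(10, 10, 9.0)   # '+' electrode
electrode(50, 45, 0.0)   # ground electrode

for _ in range(5000):    # relax until the interior satisfies Laplace's equation
    avg = 0.25 * (np.roll(V, 1, 0) + np.roll(V, -1, 0) +
                  np.roll(V, 1, 1) + np.roll(V, -1, 1))  # edges wrap (crude)
    V = np.where(fixed, V, avg)

# A movable tap electrode reads the local potential, which becomes a
# control voltage for the modular synthesizer.
print("tap at (30, 30) reads %.2f V" % V[30, 30])
```

Moving any electrode amounts to re-pinning the boundary conditions and re-solving, which is why repositioning the coins reshapes all tapped voltages at once.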
| Left and right-hand guitar playing techniques detection | | BIBAK | PDF | 213 | |
| Loïc Reboursière; Otso Lähdeoja; Thomas Drugman; Stéphane Dupont; Cécile Picard-Limpens; Nicolas Riche | |||
| In this paper we present a series of algorithms developed to detect the
following guitar playing techniques: bend, hammer-on, pull-off, slide, palm
muting and harmonic. Detection of playing techniques can be used to control
external content (i.e. audio loops and effects, videos, light events, etc.), as
well as to write a real-time score or to assist guitar novices in their learning
process. The guitar used is a Godin Multiac with an under-saddle RMC hexaphonic
piezo pickup (one pickup per string, i.e. six mono signals). Keywords: Guitar audio analysis, playing techniques, hexaphonic pickup, controller,
augmented guitar | |||
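As a toy illustration of one cue such detectors can exploit (this is not the paper's algorithm, and its thresholds are invented): on a hexaphonic pickup each string is analyzed separately, and legato ornaments change pitch without a fresh pluck onset.

```python
# Toy per-string labeling sketch; pitch is in (possibly fractional) MIDI
# note numbers per analysis frame, `plucked` is an onset-detector flag.
def label_transition(prev_pitch, new_pitch, plucked):
    if plucked:
        return "picked note"
    step = new_pitch - prev_pitch
    if abs(step) < 0.8:
        return "bend in progress"        # continuous pitch inflection
    if step > 0:
        return "hammer-on or slide up"   # pitch jump without a new pluck
    return "pull-off or slide down"

print(label_transition(60.0, 62.0, plucked=False))  # -> hammer-on or slide up
```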
| Temporal Control In the EyeHarp Gaze-Controlled Musical Interface | | BIBAK | PDF | 215 | |
| Zacharias Vamvakousis; Rafael Ramirez | |||
| In this paper we describe the EyeHarp, a new gaze-controlled musical
instrument, and the new features we recently added to its design. In
particular, we report on the EyeHarp's new controls, the arpeggiator, the new
remote eye-tracking device, and the EyeHarp's capacity to act as a MIDI
controller for any VST plugin virtual instrument. We conducted an evaluation of
the EyeHarp's temporal accuracy by monitoring 10 users while they performed a melody
task, and comparing their gaze-control accuracy with their accuracy using a
computer keyboard. We report on the results of the evaluation. Keywords: Eye-tracking systems, music interfaces, gaze interaction | |||
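A minimal sketch of a temporal-accuracy measure for such a melody task (the paper's exact metric is not reproduced here): the mean absolute deviation of performed onsets from the target onsets, assuming a one-to-one note alignment.

```python
# Hypothetical accuracy measure: mean absolute onset deviation in ms,
# assuming performed notes align one-to-one with the target melody.
def onset_error_ms(target_onsets_s, played_onsets_s):
    pairs = zip(target_onsets_s, played_onsets_s)
    return sum(abs(t - p) for t, p in pairs) / len(target_onsets_s) * 1000.0

gaze = onset_error_ms([0.0, 0.5, 1.0], [0.02, 0.55, 1.08])
keys = onset_error_ms([0.0, 0.5, 1.0], [0.01, 0.51, 1.02])
print("gaze: %.0f ms, keyboard: %.0f ms" % (gaze, keys))
```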
| Investigation of Gesture Controlled Articulatory Vocal Synthesizer using a Bio-Mechanical Mapping Layer | | BIBAK | PDF | 291 | |
| Johnty Wang; Nicolas d'Alessandro; Sidney Fels; Robert Pritchard | |||
| We have added a dynamic bio-mechanical mapping layer that contains a model
of the human vocal tract with tongue muscle activations as input and tract
geometry as output to a real time gesture controlled voice synthesizer system
used for musical performance and speech research. Using this mapping layer, we
conducted user studies comparing control of the model's muscle activations using
a 2D set of force sensors against a position-controlled kinematic input space that
maps directly to the sound. Preliminary user evaluation suggests that force input
was more difficult to use, but the resulting output sound was more
intelligible and natural compared to the kinematic controller. This result
shows that force input is potentially feasible for browsing through a vowel
space for an articulatory voice synthesis system, although further evaluation
is required. Keywords: Gesture, Mapping, Articulatory, Speech, Singing, Synthesis | |||
| Further Developments in the Electromagnetically Sustained Rhodes Piano | | BIBAK | PDF | 284 | |
| Greg Shear; Matthew Wright | |||
| The Electromagnetically Sustained Rhodes Piano is an original Rhodes Piano
modified to provide control over the amplitude envelope of individual notes
through aftertouch pressure. Although there are many opportunities to shape the
amplitude envelope before loudspeaker amplification, they are all governed by
the ever-decaying physical vibrations of the tone generating mechanism. A
single-note proof of concept for electromagnetic control over this vibrating
mechanism was presented at NIME 2011. In the past year, virtually every aspect
of the system has been improved. We use a different vibration sensor that is
immune to electromagnetic interference, thus eliminating troublesome feedback.
For control, we both reduce cost and gain continuous position sensing
throughout the entire range of key motion in addition to aftertouch pressure.
Finally, the entire system now fits within the space constraints presented by
the original piano, allowing it to be installed on adjacent notes. Keywords: Rhodes, piano, mechanical synthesizer, electromagnetic, sustain, feedback | |||
| Augmenting human-human interaction in mobile group improvisation | | BIBAK | PDF | 108 | |
| Roberto Pugliese; Koray Tahiroglu; Callum Goddard; James Nesfield | |||
| In this paper strategies for augmenting the social dimension of
collaborative music making, in particular in the form of bodily and situated
interaction are presented. Mobile instruments are extended by means of
relational descriptors democratically controlled by the group and mapped to
sound parameters. A qualitative evaluation approach is described, and a user
test with participants playing in groups of three was conducted. The results of the
analysis show core-categories such as familiarity with instrument and
situation, shift of focus in activity, family of interactions and different
categories of the experience emerging from the interviews. Our evaluation shows
the suitability of our approach but also the need for iterating on our design
on the basis of the perspectives brought forth by the users. This latter
observation confirms the importance of conducting a thorough interview session
followed by data analysis along the lines of grounded theory. Keywords: Collaborative music making, evaluation methods, mobile music, human-human
interaction | |||
| The EMvibe: An Electromagnetically Actuated Vibraphone | | BIBAK | PDF | 101 | |
| N. Cameron Britt; Jeff Snyder; Andrew McPherson | |||
| The EMvibe is an augmented vibraphone that allows for continuous control
over the amplitude and spectrum of individual notes. The system uses
electromagnetic actuators to induce vibrations in the vibraphone's aluminum
tone bars. The tone bars and the electromagnetic actuators are coupled via
neodymium magnets affixed to each bar. The acoustic properties of the
vibraphone allowed us to develop a very simple, low-cost and powerful
amplification solution that requires no heat sinking. The physical design is
meant to be portable and robust, and the system can be easily installed on any
vibraphone without interfering with normal performance techniques. The system
supports multiple interfacing solutions, affording the performer and composer
the ability to interact with the EMvibe in different ways depending on the
musical context. Keywords: Vibraphone, augmented instrument, electromagnetic actuation | |||
| Musical Interaction with Hand Posture and Orientation: A Toolbox of Gestural Control Mechanisms | | BIBAK | PDF | 272 | |
| Thomas Mitchell; Sebastian Madgwick | |||
| This paper presents a toolbox of gestural control mechanisms which are
available when the input sensing apparatus is a pair of data gloves fitted with
orientation sensors. The toolbox was developed in advance of a live music
performance in which the mapping from gestural input to audio output was to be
developed rapidly in collaboration with the performer. The paper begins with an
introduction to the associated literature before introducing a range of
continuous, discrete and combined control mechanisms, enabling a flexible range
of mappings to be explored and modified easily. An application of the toolbox
within a live music performance is then described, along with an evaluation of the
system and ideas for future developments. Keywords: Computer Music, Gestural Control, Data Gloves | |||
| Digito: A Fine-Grain Gesturally Controlled Virtual Musical Instrument | | BIBAK | PDF | 248 | |
| Nicholas Gillian; Joseph A. Paradiso | |||
| This paper presents Digito, a gesturally controlled virtual musical
instrument. Digito is controlled through a number of intricate hand gestures,
providing both discrete and continuous control of Digito's sound engine, with
the fine-grain hand gestures captured by a 3D depth sensor and recognized using
computer vision and machine learning algorithms. We describe the design and
initial iterative development of Digito, the hand and finger tracking
algorithms and gesture recognition algorithms that drive the system, and report
the insights gained during the initial development cycles and user testing of
this gesturally controlled virtual musical instrument. Keywords: Gesture Recognition, Virtual Musical Instrument | |||
| Voicon: An Interactive Gestural Microphone For Vocal Performance | | BIBAK | PDF | 199 | |
| Yongki Park; Hoon Heo; Kyogu Lee | |||
| This paper describes an interactive gestural microphone for vocal
performance named Voicon. Voicon is a non-invasive and gesture-sensitive
microphone which allows vocal performers to use natural gestures to create
vocal augmentations and modifications by using embedded sensors in a
microphone. Through vocal augmentation and modulation, performers can
easily generate a desired amount of vibrato and achieve a wider vocal range.
These vocal enhancements enrich the vocal performance both in
its expressiveness and its dynamics. Using Voicon, singers can generate
additional vibrato, control pitch and activate customizable vocal effects with
simple and intuitive gestures in live and recording contexts. Keywords: Gesture, Microphone, Vocal Performance, Performance Interface | |||
| Towards fast multi-point force and hit detection in tabletops using mechanically intercoupled force sensing resistors | | BIBAK | PDF | 257 | |
| Mathieu Bosi; Sergi Jordà | |||
| Tangible tabletop musical interfaces allowing for a collaborative real-time
interaction in live music performances are one of the promising fields in
NIMEs. At present, such interfaces present at least some of the
following characteristics that limit their musical use: latency in the
interaction, and a partial or complete lack of responsiveness to gestures such as
tapping, scrubbing or pressing. Our current research is exploring ways of
improving the quality of interaction with such interfaces, and in
particular with the tangible tabletop instrument Reactable. In this paper we
present a system based on a circular array of mechanically intercoupled force
sensing resistors used to obtain a low-latency, affordable, and easily
embeddable hardware system able to detect surface impacts and pressures on the
tabletop perimeter. We also consider the option of complementing this detected
gestural information with the audio signal from a contact
microphone attached to the mechanical coupling layer, to control physical
modelling synthesis of percussion instruments. Keywords: tangible tabletop interfaces, force sensing resistor, mechanical coupling,
fast low-noise analog-to-digital conversion, low-latency sensing,
microcontroller, multimodal systems, complementary sensing | |||
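A minimal sketch of the hit-versus-pressure distinction described above (thresholds, sample rate and the single-channel framing are assumptions, not the paper's firmware): a fast rise in force reads as an impact, a slow sustained force as pressing.

```python
# Hypothetical single-channel detector: a steep force rise is a hit,
# a sustained level is continuous pressure. Values are assumptions.
def detect(samples, rate_hz=2000, rise_thresh=0.15, press_thresh=0.05):
    events, prev = [], 0.0
    for n, x in enumerate(samples):
        if x - prev > rise_thresh:                 # fast rise -> tap/hit
            events.append(("hit", n / rate_hz, x))
        elif x > press_thresh:                     # slow, sustained force
            events.append(("press", n / rate_hz, x))
        prev = x
    return events

print(detect([0.0, 0.02, 0.5, 0.4, 0.1, 0.08, 0.07]))
```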
| TouchKeys: Capacitive Multi-Touch Sensing on a Physical Keyboard | | BIBAK | PDF | 195 | |
| Andrew McPherson | |||
| Capacitive touch sensing is increasingly used in musical controllers,
particularly those based on multi-touch screen interfaces. However, in contrast
to the venerable piano-style keyboard, touch screen controllers lack the
tactile feedback many performers find crucial. This paper presents an
augmentation system for acoustic and electronic keyboards in which multi-touch
capacitive sensors are added to the surface of each key. Each key records the
position of fingers on the surface, and by combining this data with MIDI note
onsets and aftertouch from the host keyboard, the system functions as a
multidimensional polyphonic controller for a wide variety of synthesis
software. The paper will discuss general capacitive touch sensor design,
keyboard-specific implementation strategies, and the development of a flexible
mapping engine using OSC and MIDI. Keywords: augmented instruments, keyboard, capacitive sensing, multitouch | |||
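A minimal sketch of the kind of mapping such a touch-augmented key enables (the OSC paths and scalings are invented, not the TouchKeys mapping engine): fuse the MIDI note-on from the host keyboard with the finger position sensed on that key.

```python
# Hypothetical mapping: combine MIDI note events with per-key touch data.
def on_note_on(note, velocity, touch_y):
    """touch_y: normalized finger position along the key, 0=front, 1=back."""
    brightness = 0.2 + 0.8 * touch_y           # touch position -> timbre
    return [("/note/on", note, velocity),
            ("/note/brightness", note, brightness)]

def on_touch_move(note, touch_x):
    """Sideways finger motion after onset, e.g. mapped to vibrato depth."""
    return [("/note/vibrato", note, abs(touch_x - 0.5) * 2.0)]

print(on_note_on(60, 100, touch_y=0.75))
```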
| Wicked Problems and Design Considerations in Composing for Laptop Orchestra | | BIBAK | PDF | 259 | |
| Luke Dahl | |||
| Composing music for ensembles of computer-based instruments, such as laptop
orchestra or mobile phone orchestra, is a multi-faceted and challenging
endeavor whose parameters and criteria for success are ill-defined. In the
design community, tasks with these qualities are known as wicked problems. This
paper frames composing for computer-based ensemble as a design task, shows how
Buchanan's four domains of design are present in the task, and discusses its
wicked properties. The themes of visibility, risk, and embodiment, as
formulated by Klemmer, are shown to be implicitly present in this design task.
Composers are encouraged to address them explicitly and to take advantage of
the practices of prototyping and iteration. Keywords: Design, laptop orchestra, mobile phone orchestra, instrument design,
interaction design, composition | |||
| Collaborative composition and socially constituted instruments: Ensemble laptop performance through the lens of ethnography | | BIBAK | PDF | 136 | |
| Graham Booth; Michael Gurevich | |||
| In this paper, we argue that the design of New Interfaces for Musical
Expression has much to gain from the study of interaction in ensemble laptop
performance contexts using ethnographic techniques. Inspired by recent
third-stream research in the field of human computer interaction, we describe a
recent ethnomethodologically-informed study of the Birmingham Laptop Ensemble
(BiLE), and detail our approach to thick description of the group's working
practices. Initial formal analysis of this material sheds light on the fluidity
of composer, performer and designer roles within the ensemble and shows how
confluences of these roles constitute members' differing viewpoints. We go on
to draw out a number of strands of interaction that highlight the essentially
complex, socially constructed and value driven nature of the group's practice
and conclude by reviewing the implications of these factors on the design of
software tools for laptop ensembles. Keywords: Laptop Performance, Ethnography, Ethnomethodology, Human Computer
Interaction | |||
| Unsupervised Play: Machine Learning Toolkit for Max | | BIBAK | PDF | 68 | |
| Benjamin D. Smith; Guy E. Garnett | |||
| Machine learning models are useful and attractive tools for the interactive
computer musician, enabling a breadth of interfaces and instruments. With
current consumer hardware it becomes possible to run advanced machine learning
algorithms in demanding performance situations, yet expertise remains a
prominent entry barrier for most would-be users. Currently available
implementations predominantly employ supervised machine learning techniques,
while the adaptive, self-organizing capabilities of unsupervised models are not
generally available. We present a free, new toolbox of unsupervised machine
learning algorithms implemented in Max 5 to support real-time interactive music
and video, aimed at the non-expert computer artist. Keywords: NIME, unsupervised machine learning, adaptive resonance theory,
self-organizing maps, Max 5 | |||
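The toolbox itself is a set of Max 5 externals; purely to illustrate the kind of unsupervised update rule such objects encapsulate, here is a minimal self-organizing map in Python (map size, learning rate and neighborhood radius are arbitrary).

```python
# Minimal self-organizing map: prototypes drift toward incoming feature
# vectors, so similar gestures end up on nearby map nodes.
import numpy as np

rng = np.random.default_rng(0)
som = rng.random((8, 8, 2))            # 8x8 map of 2-D feature prototypes

def train_step(x, lr=0.2, radius=1.5):
    d = np.linalg.norm(som - x, axis=2)               # distance to every node
    bi, bj = np.unravel_index(np.argmin(d), d.shape)  # best-matching unit
    ii, jj = np.meshgrid(np.arange(8), np.arange(8), indexing="ij")
    h = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * radius ** 2))
    som[:] = som + lr * h[:, :, None] * (x - som)     # pull neighborhood to x
    return bi, bj

for _ in range(500):                   # self-organize around incoming data
    train_step(rng.random(2))
print("winner for (0.1, 0.9):", train_step(np.array([0.1, 0.9]), lr=0.0))
```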
| Exploring Reinforcement Learning for Mobile Percussive Collaboration | | BIBAK | PDF | 241 | |
| Nate Derbinsky; Georg Essl | |||
| This paper presents a system for mobile percussive collaboration. We show
that reinforcement learning can incrementally learn percussive beat patterns
played by humans and support real-time collaborative performance in the
absence of one or more performers. This work leverages an existing integration
between urMus and Soar and addresses multiple challenges involved in the
deployment of machine-learning algorithms for mobile music expression,
including the tradeoff between learning speed and quality; interface design for human
collaborators; and real-time performance and improvisation. Keywords: Mobile music, machine learning, cognitive architecture | |||
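The paper's learner runs in the Soar cognitive architecture via urMus; as a language-agnostic toy of the underlying idea only, incremental reinforcement of step/drum associations can look like this (step count, learning rate and threshold are invented):

```python
# Toy incremental pattern learner: strengthen step/drum weights when the
# human plays, decay them otherwise, and fill in when a performer drops out.
STEPS, DRUMS = 16, 3
w = [[0.0] * DRUMS for _ in range(STEPS)]   # learned pattern strengths

def observe(step, drum, hit, lr=0.3):
    target = 1.0 if hit else 0.0
    w[step][drum] += lr * (target - w[step][drum])

def fill_in(step, threshold=0.5):
    """Drums the model now believes belong on this step."""
    return [d for d in range(DRUMS) if w[step][d] > threshold]

for bar in range(8):                        # human repeats a kick on steps 0, 8
    for s in (0, 8):
        observe(s, 0, hit=True)
print(fill_in(0), fill_in(4))               # -> [0] []
```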
| Liveness and Flow in Notation Use | | BIBAK | PDF | 217 | |
| Chris Nash; Alan Blackwell | |||
| This paper presents concepts, models, and empirical findings relating to
liveness and flow in the user experience of systems mediated by notation.
Results from an extensive two-year field study of over 1,000 sequencer and
tracker users, combining interaction logging, user surveys, and a video study,
are used to illustrate the properties of notations and interfaces that
facilitate greater immersion in musical activities and domains, borrowing
concepts from programming to illustrate the role of visual and musical
feedback, from the notation and domain respectively. The Cognitive Dimensions
of Notations framework and Csikszentmihalyi's flow theory are combined to
demonstrate how non-realtime, notation-mediated interaction can support
focused, immersive, energetic, and intrinsically-rewarding musical experiences,
and to what extent they are supported in the interfaces of music production
software. Users are shown to maintain liveness through a rapid, iterative
edit-audition cycle that integrates audio and visual feedback. Keywords: notation, composition, liveness, flow, feedback, sequencers, DAWs,
soundtracking, performance, user studies, programming | |||
| Movement to emotions to music: using whole body emotional expression as an interaction for electronic music generation | | BIBAK | PDF | 180 | |
| Alexis Clay; Nadine Couture; Myriam Desainte-Catherine; Pierre-Henri Vulliard; Joseph Larralde; Elodie Decarsin | |||
| The augmented ballet project aims at gathering research from several fields
and directing them towards a same application case: adding virtual elements
(visual and acoustic) to a dance live performance, and allowing the dancer to
interact with them. In this paper, we describe a novel interaction that we used
within the framework of this project: using the dancer's movements to recognize the
emotions he expresses, and using these emotions to generate musical audio flows
evolving in real-time. The originality of this interaction is threefold. First,
it covers the whole interaction cycle from the input (the dancer's movements)
to the output (the generated music). Second, this interaction is not direct but
goes through a high level of abstraction: the dancer's emotional expression is
recognized and becomes the source of the music generation. Third, this interaction has
been designed and validated through constant collaboration with a
choreographer, culminating in an augmented ballet performance in front of a
live audience. Keywords: Interactive sonification, motion, gesture and music, interaction, live
performance, musical human-computer interaction | |||
| Comparing Motion Data from an iPod Touch to a High-End Optical Infrared Marker-Based Motion Capture System | | BIBA | PDF | 198 | |
| Kristian Nymoen; Arve Voldsund; Alexander Refsum Jensenius; Ståle A. Skogstad; Jim Torresen | |||
| The paper presents an analysis of the quality of motion data from an iPod Touch (4th gen.). Acceleration and orientation data derived from internal sensors of an iPod is compared to data from a high-end optical infrared marker-based motion capture system (Qualisys) in terms of latency, jitter, accuracy and precision. We identify some rotational drift in the iPod, and some time lag between the two systems. Still, the iPod motion data is quite reliable, especially for describing relative motion over a short period of time. | |||
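One way to estimate such a time lag between two motion streams (a plausible method, not necessarily the paper's procedure) is to resample both onto a common clock and find the lag maximizing their cross-correlation:

```python
# Cross-correlation lag estimate between two resampled motion signals.
import numpy as np

def estimate_lag_s(sig, ref, rate_hz):
    """Positive result: `sig` lags behind `ref` by that many seconds."""
    sig = (sig - sig.mean()) / sig.std()
    ref = (ref - ref.mean()) / ref.std()
    xc = np.correlate(sig, ref, mode="full")   # correlation at every lag
    lag = int(xc.argmax()) - (len(ref) - 1)
    return lag / rate_hz

t = np.linspace(0, 5, 500)                     # ~100 Hz common timeline
qualisys = np.sin(2 * np.pi * t)               # reference acceleration trace
ipod = np.sin(2 * np.pi * (t - 0.04))          # same motion, 40 ms late
print("lag: %.0f ms" % (estimate_lag_s(ipod, qualisys, 100) * 1000))
```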
| massMobile -- an Audience Participation Framework | | BIBAK | PDF | 128 | |
| Nathan Weitzner; Jason Freeman; Stephen Garrett; Yan-Ling Chen | |||
| massMobile is a client-server system for mass audience participation in live
performances using smartphones. It was designed to flexibly adapt to a variety
of participatory performance needs and to a variety of performance venues. It
allows for real time bi-directional communication between performers and
audiences utilizing existing wireless 3G, 4G, or WiFi networks. In this paper,
we discuss the goals, design, and implementation of the framework, and we
describe several projects realized with massMobile. Keywords: audience participation, network music, smartphone, performance, mobile | |||
| AuRal: A Mobile Interactive System for Geo-Locative Audio Synthesis | | BIBAK | PDF | 301 | |
| Jesse Allison; Christian Dell | |||
| Aural -- of or relating to the ear or hearing
Aura -- an invisible breath, emanation, or radiation
AR -- Augmented Reality
AuRal is an environmental audio system in which individual participants form ad hoc ensembles based on geolocation and affect the overall sound of the music associated with the location that they are in. The AuRal environment binds physical location and the choices of multiple, simultaneous performers to act as the generative force of music tied to the region. Through a mobile device interface, musical participants, or agents, have a degree of input into the generated music, essentially defining the sound of a given region. The audio landscape is superimposed onto the physical one. The resultant musical experience is not tied simply to the passage of time; through the incorporation of participants over time and spatial proximity, it becomes an aural location as much as a piece of music. As a result, walking through the same location at different times results in unique collaborative listening experiences. Keywords: AuRal, sonic environment, distributed performance system, mobile music,
android, ruby on rails, supercollider | |||
| Extracting Human Expression For Interactive Composition with the Augmented Violin | | BIBAK | PDF | 279 | |
| Mari Kimura; Nicolas Rasamimanana; Frédéric Bevilacqua; Norbert Schnell; Bruno Zamborlin; Emmanuel Fléty | |||
| As a 2010 Artist in Residence in Musical Research at IRCAM, Mari Kimura used
the Augmented Violin to develop new compositional approaches, and new ways of
creating interactive performances [1]. She contributed her empirical and
historical knowledge of violin bowing technique, working with the Real Time
Musical Interactions Team at IRCAM. Thanks to this residency, her ongoing
long-distance collaboration with the team since 2007 dramatically accelerated,
and led to solving several compositional and calibration issues of the Gesture
Follower (GF) [2]. Kimura was also the first artist to develop projects between
the two teams at IRCAM, using OMAX (Musical Representation Team) with GF. In
the past year, performance with the Augmented Violin has also been expanded into
larger-scale interactive audio/visual projects. In this paper, we
report on the various techniques developed for the Augmented Violin and
compositions by Kimura using them, offering specific examples and scores. Keywords: Augmented Violin, Gesture Follower, Interactive Performance | |||
| A Quantitative Comparison of Position Trackers for the Development of a Touch-less Musical Interface | | BIBAK | PDF | 155 | |
| Gabriel Vigliensoni; Marcelo M. Wanderley | |||
| This paper presents a comparison of three-dimensional (3D) position tracking
systems in terms of some of their performance parameters such as static
accuracy and precision, update rate, and shape of the space they sense. The
underlying concepts and characteristics of position tracking technologies are
reviewed, and four position tracking systems (Vicon, Polhemus, Kinect, and
Gametrak), based on different technologies, are empirically compared according
to their performance parameters and technical specifications. Our results show
that, overall, the Vicon was the position tracker with the best performance. Keywords: Position tracker, comparison, touch-less, gestural control | |||
| SABRe: The Augmented Bass Clarinet | | BIBAK | PDF | 193 | |
| Sébastien Schiesser; Jan C. Schacher | |||
| An augmented bass clarinet is developed in order to extend the performance
and composition potential of the instrument. Four groups of sensors are added:
key positions, inertial movement, mouth pressure and trigger switches. The
instrument communicates wirelessly with a receiver setup which produces an OSC
data stream, usable by any application on a host computer.
The SABRe project's intention is to be neither tied to its inventors nor to one single player, but to offer a reference design for a larger community of bass clarinet players and composers. For this purpose, several instruments are made available and a number of composer residencies, workshops, presentations and concerts are organized. These serve evaluation and improvement purposes, in order to build a robust and user-friendly extended musical instrument that opens new playing modalities. Keywords: augmented instrument, bass clarinet, sensors, air pressure, gesture, OSC | |||
| Direct and surrogate sensing for the Gyil African xylophone | | BIBAK | PDF | 222 | |
| Shawn Trail; Tiago Fernandes Tavares; Dan Godlovitch; George Tzanetakis | |||
| The Gyil is a pentatonic African wooden xylophone with 14-15 keys. The work
described in this paper has been motivated by three applications: computer
analysis of Gyil performance, live improvised electro-acoustic music
incorporating the Gyil, and hybrid sampling and physical modeling. In all three
of these cases, detailed information about what is played on the Gyil needs to
be digitally captured in real-time. We describe a direct sensing apparatus that
can be used to achieve this. It is based on contact microphones and is informed
by the specific characteristics of the Gyil. An alternative approach based on
indirect acquisition is to apply polyphonic transcription on the signal
acquired by a microphone without requiring the instrument to be modified. The
direct sensing apparatus we have developed can be used to acquire ground truth
for evaluating different approaches to polyphonic transcription and help create
a "surrogate" sensor. Some initial results comparing different strategies for
polyphonic transcription are presented. Keywords: hyperinstruments, indirect acquisition, surrogate sensors, computational
ethnomusicology, physical modeling, performance analysis | |||
| Musical Interaction Design with the CUI32Stem: Wireless Options and the GROVE system for prototyping new interfaces | | BIBAK | PDF | 194 | |
| Dan Overholt | |||
| The Create USB Interface is an open source microcontroller board that can be
programmed in C, BASIC, or Arduino languages. The latest version is called the
CUI32Stem, and it is designed to work 'hand-in-hand' with the GROVE prototyping
system that includes a wide range of sensors and actuators. It utilizes a
high-performance Microchip® PIC32 microcontroller unit to allow
programmable user interfaces. Its development and typical uses are described,
focusing on musical interaction design scenarios. Several options for wireless
connectivity are described as well, enabling the CUI32Stem to pair with a
smartphone and/or a normal computer. Finally, SeeedStudio's GROVE system is
explained, which provides a prototyping system comprised of various elements
that incorporate simple plugs, allowing the CUI32Stem to easily connect to the
growing collection of open source GROVE transducers. Keywords: Musical Interaction Design, NIME education, Microcontroller, Arduino
language, StickOS BASIC, Open Sound Control, Microchip PIC32, Wireless,
ZigFlea, WiFi, 802.11g, Bluetooth, CUI32, CUI32Stem | |||
| The JD-1: an Implementation of a Hybrid Keyboard/Sequencer Controller for Analog Synthesizers | | BIBAK | PDF | 187 | |
| Jeff Snyder; Andrew McPherson | |||
| This paper presents the JD-1, a digital controller for analog modular
synthesizers. The JD-1 features a capacitive touch-sensing keyboard that
responds to continuous variations in finger contact, high-accuracy polyphonic
control-voltage outputs, a built-in sequencer, and digital interfaces for
connection to MIDI and OSC devices. Design goals include interoperability with
a wide range of synthesizers, very high-resolution pitch control, and intuitive
control of the sequencer from the keyboard. Keywords: keyboard, sequencer, analog synthesizer, capacitive touch sensing | |||
| Music for Sleeping & Waking Minds | | BIBAK | PDF | 229 | |
| Gascia Ouzounian; R. Benjamin Knapp; Eric Lyon; Luke DuBois | |||
| Music for Sleeping & Waking Minds (2011-2012) is a new, overnight work in
which four performers fall asleep while wearing custom-designed EEG sensors
that monitor their brainwave activity. The data gathered from the EEG sensors
is applied in real time to different audio and image signal processing
functions, resulting in a continuously evolving multichannel sound environment
and visual projection. This material serves as an audiovisual description of
the individual and collective neurophysiological state of the ensemble.
Audiences are invited to experience the work in different states of attention:
while alert and asleep, resting and awakening. Keywords: EEG, sleep, dream, biosignals, bio art, consciousness, BCI | |||
| Techniques and Circuits for Electromagnetic Instrument Actuation | | BIBAK | PDF | 117 | |
| Andrew McPherson | |||
| There is growing interest in the field of augmented musical instruments,
which extend traditional acoustic instruments using new sensors and actuators.
Several designs use electromagnetic actuation to induce vibrations in the
acoustic mechanism, manipulating the traditional sound of the instrument
without external speakers. This paper presents techniques and guidelines for
the use of electromagnetic actuation in augmented instruments, including
actuator design and selection, interfacing with the instrument, and circuits
for driving the actuators. The material in this paper forms the basis of the
magnetic resonator piano, an electromagnetically-augmented acoustic grand piano
now in its second design iteration. In addition to discussing applications to
the piano, this paper aims to provide a toolbox to accelerate the design of new
hybrid acoustic-electronic instruments. Keywords: augmented instruments, electromagnetic actuation, circuit design, hardware | |||
| OMaxist Dialectics: Capturing, Visualizing and Expanding Improvisations | | BIBAK | PDF | 87 | |
| Benjamin Levy; Georges Bloch; Gerard Assayag | |||
| OMax is an improvisation software based on a graph representation encoding
the pattern repetitions and structures of a sequence, built incrementally and
in real-time from a live Midi or Audio source. We present in this paper a
totally rewritten version of the software. The new design led us to refine the
spectral listening of OMax and to consider different methods of building the
symbolic alphabet labeling our symbolic units. The very modular and versatile
architecture makes new musical configurations possible, and we have tried the
software in different styles and musical situations. A novel visualization is
proposed, which displays the current state of the learnt knowledge and makes it
possible to notice, both on the fly and a posteriori, points of musical interest and
higher-level structures. Keywords: OMax, Improvisation, Machine Learning, Machine Listening, Visualization,
Sequence Model, Software Architecture | |||
| An Electronic Bagpipe Chanter for Automatic Recognition of Highland Piping Ornamentation | | BIBAK | PDF | 200 | |
| Duncan Menzies; Andrew McPherson | |||
| The Highland piping tradition requires the performer to learn and accurately
reproduce a diverse array of ornaments, which can be a daunting prospect to the
novice piper. This paper presents a system which analyses a player's technique
using sensor data obtained from an electronic bagpipe chanter interface.
Automatic recognition of a broad range of piping embellishments allows
real-time visual feedback to be generated, enabling the learner to ensure that
they are practicing each movement correctly. The electronic chanter employs a
robust and responsive infrared (IR) sensing strategy, and uses audio samples
from acoustic recordings to produce a high quality bagpipe sound. Moreover, the
continuous nature of the IR sensors offers the controller a considerable degree
of flexibility, indicating significant potential for the inclusion of extended
and novel techniques for musical expression in the future. Keywords: Great Highland Bagpipe, continuous infrared sensors, ornament recognition,
practice tool, SuperCollider, OSC | |||
| Towards Speeding Audio EQ Interface Building with Transfer Learning | | BIBAK | PDF | 74 | |
| Bryan Pardo; David Little; Darren Gergle | |||
| Potential users of audio production software, such as parametric audio
equalizers, may be discouraged by the complexity of the interface. A new
approach creates a personalized on-screen slider that lets the user manipulate
the audio in terms of a descriptive term (e.g. "warm"), without the user
needing to learn or use the interface of an equalizer. This system learns
mappings by presenting a sequence of sounds to the user and correlating the
gain in each frequency band with the user's preference rating. The system
speeds learning through transfer learning. Results on a study of 35
participants show how an effective, personalized audio manipulation tool can be
automatically built after only three ratings from the user. Keywords: Human computer interaction, music, multimedia production, transfer learning | |||
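A minimal sketch of the correlation step described in the abstract (the probe sounds, band count and the transfer-learning speed-up are not modeled here): each band's gain is correlated with the user's ratings, and the resulting weights define the personalized slider.

```python
# Sketch: learn a one-knob "warm" EQ curve from per-sound ratings.
import numpy as np

rng = np.random.default_rng(1)
probe_gains = rng.uniform(-12, 12, size=(20, 5))   # 20 probes x 5 bands (dB)
true_warm = np.array([0.8, 0.5, 0.0, -0.4, -0.7])  # hidden user concept
ratings = probe_gains @ true_warm + rng.normal(0, 2, 20)

# Correlation of each band's gain with the ratings gives the slider shape.
weights = np.array([np.corrcoef(probe_gains[:, b], ratings)[0, 1]
                    for b in range(5)])

def slider_to_eq(position, max_db=12.0):
    """position in [-1, 1] -> per-band gains along the learned axis."""
    return position * max_db * weights / np.abs(weights).max()

print(np.round(slider_to_eq(0.5), 1))
```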
| Better Drumming Through Calibration: Techniques for Pre-Performance Robotic Percussion Optimization | | BIBAK | PDF | 100 | |
| Jim Murphy; Ajay Kapur; Dale Carnegie | |||
| A problem with many contemporary musical robotic percussion systems lies in
the fact that solenoids fail to respond linearly to linear increases in input
velocity. This nonlinearity forces performers to individually tailor their
compositions to specific robotic drummers. To address this problem, we
introduce a method of pre-performance calibration using metaheuristic search
techniques. A variety of such techniques are introduced and evaluated and the
results of the optimized solenoid-based percussion systems are presented and
compared with output from non-calibrated systems. Keywords: musical robotics, human-robot interaction | |||
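The paper evaluates several metaheuristics; as a generic stand-in only, a random-nudge hill climb over a per-velocity drive table looks like this (the loudness function below is a fake substitute for a real microphone measurement):

```python
# Toy calibration: adjust the drive table so measured loudness grows
# linearly with input velocity, despite a nonlinear solenoid response.
import random

def loudness(drive):                    # fake stand-in for a mic reading
    return max(0.0, drive - 0.2) ** 2 * 100

def calibrate(levels=8, iters=4000):
    table = [0.5] * levels              # drive value per input velocity step
    targets = [(i + 1) / levels for i in range(levels)]
    peak = loudness(1.0)
    def error(t):
        return sum((loudness(d) / peak - g) ** 2 for d, g in zip(t, targets))
    best = error(table)
    for _ in range(iters):              # hill climb with random nudges
        i = random.randrange(levels)
        old = table[i]
        table[i] = min(1.0, max(0.0, old + random.uniform(-0.05, 0.05)))
        e = error(table)
        if e < best:
            best = e
        else:
            table[i] = old              # revert moves that don't help
    return table

print([round(d, 2) for d in calibrate()])
```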
| An Interface for Emotional Expression in Audio-Visuals | | BIBAK | PDF | 60 | |
| Kamer Ali Yuksel; Sinan Buyukbas; Elif Ayiter | |||
| In this work, a comprehensive study is performed on the relationship between
audio, visual and emotion by applying the principles of cognitive emotion
theory into digital creation. The study is driven by an audiovisual emotion
library project that is named AVIEM, which provides an interactive interface
for experimentation and evaluation of the perception and creation processes of
audiovisuals. AVIEM primarily consists of separate audio and visual libraries
and grows with user contribution as users explore different combinations
between them. The library provides a wide range of experimentation
possibilities by allowing users to create audiovisual relations and logging
their emotional responses through its interface. Besides being a resourceful
tool of experimentation, AVIEM aims to become a source of inspiration, where
digitally created abstract virtual environments and soundscapes can elicit
target emotions at a preconscious level, by building genuine audiovisual
relations that would engage the viewer on a strong emotional stage. Lastly,
various schemes are proposed to visualize information extracted through AVIEM,
to improve the navigation and designate the trends and dependencies among
audiovisual relations. Keywords: Designing emotive audiovisuals, cognitive emotion theory, audiovisual
perception and interaction, synaesthesia | |||
| Play-A-Grill: Music To Your Teeth | | BIBAK | PDF | 48 | |
| Aisen Caro Chacin | |||
This paper is an in-depth exploration of the fashion object and device, the
Play-A-Grill. It details inspirations, socio-cultural implications, technical
function and operation, and potential applications for the Play-A-Grill system. Keywords: Digital Music Players, Hip Hop, Rap, Music Fashion, Grills, Mouth Jewelry,
Mouth Controllers, and Bone Conduction Hearing | |||
| Interactive Mobile Music Performance with Digital Compass | | BIBAK | PDF | 170 | |
| Bongjun Kim; Woon Seung Yeo | |||
| In this paper we introduce an interactive mobile music performance system
using the digital compass of mobile phones. A compass-based interface can detect
the aiming orientation of performers on stage, allowing us to obtain
information on interactions between performers and to use it for both musical
mappings and visualizations on screen for the audience. We document and discuss
the result of a compass-based mobile music performance, Where Are You Standing,
and present an algorithm for a new app to track down the performers' positions
in real-time. Keywords: Mobile music, mobile phone, smartphone, compass, magnetometer, aiming
gesture, musical mapping, musical sonification | |||
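A minimal sketch of an aiming-based mapping of the kind the abstract describes (the pitch set, sector count and tolerance are invented, not the piece's actual mapping):

```python
# Hypothetical compass mapping: quantize heading into a pitch sector and
# detect two performers aiming at each other (heading-only approximation).
def sector_note(heading_deg, scale=(60, 62, 64, 65, 67, 69, 71, 72)):
    return scale[int(heading_deg % 360 // (360 / len(scale)))]

def facing(heading_a, heading_b, tol_deg=15):
    """True when the two headings are roughly opposite."""
    diff = (heading_a - heading_b) % 360
    return abs(diff - 180) <= tol_deg

print(sector_note(95), facing(10, 190))   # -> 64 True
```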
| Multiple Pianolas in Antheil's Ballet mécanique | | BIBAK | PDF | 25 | |
| Paul Lehrman | |||
| George Antheil's notorious Ballet mécanique (1924-1925) was
originally scored for percussion ensemble, sound effects, and 16 pianolas. He
was never able to perform the piece with those forces, however, due to his
inability to synchronize multiple pianolas. Thus all performances of the piece
in his lifetime, and for decades after, were done with a single pianola or
player piano.
The author traces the origin of the concept of synchronizing multiple pianolas, and explains the attendant technological issues. He examines attempts to synchronize mechanical pianos and other time-based devices at the time of Ballet mécanique's composition, and suggests that Antheil's vision for his piece was not as farfetched as has long been thought. Keywords: Antheil, Stravinsky, player piano, pianola, mechanical instruments,
synchronization | |||
| A Component-Based Approach for Modeling Plucked-Guitar Excitation Signals | | BIBAK | PDF | 63 | |
| Raymond Migneco; Youngmoo Kim | |||
| Platforms for mobile computing and gesture recognition provide enticing
interfaces for creative expression on virtual musical instruments. However,
sound synthesis on these systems is often limited to sample-based synthesizers,
which restricts their expressive capabilities. Source-filter models are well suited to
such interfaces since they provide flexible, algorithmic sound synthesis,
especially in the case of the guitar. In this paper, we present a data-driven
approach for modeling guitar excitation signals using principal components
derived from a corpus of excitation signals. Using these components as
features, we apply nonlinear principal components analysis to derive a feature
space that describes the expressive attributes characteristic to our corpus.
Finally, we propose using the reduced dimensionality space as a control
interface for an expressive guitar synthesizer. Keywords: Source-filter models, musical instrument synthesis, PCA, touch musical
interfaces | |||
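A minimal linear-PCA sketch of the component idea (the corpus here is synthetic noise, and the paper's nonlinear PCA stage is omitted): decompose recorded excitations into components, then synthesize a new excitation from a low-dimensional control vector.

```python
# Sketch: a 3-D control space over plucked-string excitation signals.
import numpy as np

rng = np.random.default_rng(2)
corpus = rng.normal(size=(40, 256))      # 40 excitation signals (synthetic)
mean = corpus.mean(axis=0)
U, s, Vt = np.linalg.svd(corpus - mean, full_matrices=False)
components = Vt[:3]                      # keep a 3-D expressive space

def excitation_from_controls(c):
    """c: 3 control values (e.g. from a touch surface) -> excitation."""
    return mean + np.asarray(c) @ components

print(excitation_from_controls([1.5, -0.5, 0.2]).shape)  # (256,) signal
```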
| Graphic Score Grammars for End-Users | | BIBAK | PDF | 77 | |
| Alistair G. Stead; Alan F. Blackwell; Samual Aaron | |||
| We describe a system that allows non-programmers to specify the grammar for
a novel graphic score notation of their own design, defining performance
notations suitable for drawing in live situations on a surface such as a
whiteboard. The score can be interpreted via the camera of a smartphone,
interactively scanned over the whiteboard to control the parameters of
synthesisers implemented in Overtone. The visual grammar of the score, and its
correspondence to the sound parameters, can be defined by the user with a
simple visual condition-action language. This language can be edited on the
touchscreen of an Android phone, allowing the grammar to be modified live in
performance situations. Interactive scanning of the score is visible to the
audience as a performance interface, with a colour classifier and visual
feature recogniser causing the grammar-specified events to be sent using OSC
messages via Wi-Fi from the hand-held smartphone to an audio workstation. Keywords: Graphic Notation, Disposable Notation, Live Coding, Computer Vision, Mobile
Music | |||
| Mapping to musical actions in the FILTER system | | BIBAK | PDF | 235 | |
| Doug Van Nort; Jonas Braasch; Pauline Oliveros | |||
| In this paper we discuss aspects of our work in developing performance
systems that are geared towards human-machine co-performance with a particular
emphasis on improvisation. We present one particular system, FILTER, which was
created in the context of a larger project related to artificial intelligence
and performance, and has been tested in the context of our electro-acoustic
performance trio. We discuss how this timbrally rich and highly non-idiomatic
musical context has challenged the design of the system, with particular
emphasis on the mapping of machine listening parameters to higher-level
behaviors of the system in such a way that spontaneity and creativity are
encouraged while maintaining a sense of novel dialogue. Keywords: Electroacoustic Improvisation, Machine Learning, Mapping, Sonic Gestures,
Spatialization | |||
| Musician Assistance and Score Distribution (MASD) | | BIBAK | PDF | 237 | |
| Nathan Magnus; David Gerhard | |||
| The purpose of the Musician Assistance and Score Distribution (MASD) system
is to assist novice musicians with playing in an orchestra, concert band, choir
or other musical ensemble. MASD helps novice musicians in three ways. It
removes the confusion that results from page turns, aids a musician's return
to the proper location in the score after looking at the conductor,
and notifies musicians of conductor instructions. MASD is currently verified by
evaluating the time between sending beats or conductor information and this
information being rendered for the musician. Future work includes user testing
of this system.
There are three major components to the MASD system. These components are Score Distribution, Score Rendering and Information Distribution. Score Distribution passes score information to clients and is facilitated by the Internet Communication Engine (ICE). Score Rendering uses the GUIDO Library to display the musical score. Information Distribution uses ICE and the IceStorm service to pass beat and instruction information to musicians. Keywords: score distribution, score-following, score rendering, musician assistance | |||
| A Design Approach to Engage with Audience with Wearable Musical Instruments: Sound Gloves | | BIBAK | PDF | 197 | |
| Chi-Hsia Lai; Koray Tahiroglu | |||
| This paper addresses the issue of engaging the audience with new musical
instruments in a live performance context. We introduce design concerns that we
consider influential in enhancing the communication flow between the audience and
the performer. We also propose and put into practice a design approach that
considers the use of performance space as a way to engage with the audience. A
collaborative project, Sound Gloves, presented here exemplifies such a concept
by dissolving the space between performers and audience. Our approach resulted
in a continuous interaction between audience and performers, in which the
social dynamics were changed in a positive way in a live performance context of
NIMEs. Such an approach, we argue, may be considered as one way to further
engage and interact with the audience. Keywords: NIME, wearable electronics, performance, design approach | |||
| A New Keyboard-Based, Sensor-Augmented Instrument For Live Performance | | BIBAK | PDF | 211 | |
| Red Wierenga | |||
| In an attempt to utilize the expert pianist's technique and spare bandwidth,
a new keyboard-based instrument augmented by sensors suggested by the
examination of existing acoustic instruments is introduced. The complete
instrument includes a keyboard, various pedals and knee levers, several bowing
controllers, and breath and embouchure sensors connected to an Arduino
microcontroller that sends sensor data to a laptop running Max/MSP, where
custom software maps the data to synthesis algorithms. The audio is output to a
digital amplifier powering a transducer mounted on a resonator box to which
several of the sensors are attached. Careful sensor selection and mapping help
to facilitate performance mode. Keywords: Gesture, controllers, Digital Musical Instrument, keyboard | |||
| Virtual Pottery: An Interactive Audio-Visual Installation | | BIBAK | PDF | 216 | |
| Yoon Chung Han; Byeong-jun Han | |||
Virtual Pottery is an interactive audiovisual piece that uses hand gestures
to create 3D pottery objects and sound shapes. Using the OptiTrack motion
capture (Rigid Body) system at the TransLab at UCSB, performers can put on a glove
with attached trackers, move the hand along the x, y, and z axes and create their own
sound pieces. Performers can also manipulate their pottery pieces in real time
and change arrangement on the musical score interface in order to create a
continuous musical composition. In this paper we address the relationship
between body, sound and 3D shapes. We also describe the origin of Virtual
Pottery, its design process, discuss its aesthetic value and musical sound
synthesis system, and evaluate the overall experience. Keywords: Virtual Pottery, virtual musical instrument, sound synthesis, motion and
gesture, pottery, motion perception, interactive sound installation | |||
| A Survey and Thematic Analysis Approach as Input to the Design of Mobile Music GUIs | | BIBAK | PDF | 240 | |
| Atau Tanaka; Adam Parkinson; Zack Settel; Koray Tahiroglu | |||
| Mobile devices represent a growing research field within NIME, and a growing
area for commercial music software. They present unique design challenges and
opportunities, which are yet to be fully explored and exploited. In this paper,
we propose using a survey method combined with qualitative analysis to
investigate the way in which people use mobiles musically. We subsequently
present, as an area of future research, our own PDplayer, which provides a
completely self-contained end application on the mobile device, potentially
making the mobile a more viable and expressive tool for musicians. Keywords: NIME, Mobile Music, Pure Data | |||
| Ecological considerations for participatory design of DMIs | | BIBAK | PDF | 253 | |
| A. Cavan Fyans; Adnan Marquez-Borbon; Paul Stapleton; Michael Gurevich | |||
| A study is presented examining the participatory design of digital musical
interactions. The study takes into consideration the entire ecology of digital
musical interactions including the designer, performer and spectator. A new
instrument is developed through iterative participatory design involving a
group of performers. Across the study the evolution of creative practice and
skill development in an emerging community of practice is examined and a
spectator study addresses the cognition of performance and the perception of
skill with the instrument. Observations are presented regarding the cognition
of a novel interaction and evolving notions of skill. The design process of
digital musical interactions is reflected on, focusing on the involvement of the
spectator in design contexts. Keywords: participatory design, DMIs, skill, cognition, spectator | |||
| Sensor Based Measurements of Musicians' Synchronization Issues | | BIBAK | PDF | 256 | |
| Tobias Grosshauser; Victor Candia; Horst Hildebrand; Gerhard Tröster | |||
| From a technical point of view, instrumental music making involves audible,
visible and hidden playing parameters. Hidden parameters like force, pressure
and fast movements happening within milliseconds are particularly difficult to
capture. Here, we present data focusing on movement coordination parameters of
the left hand fingers with the bow hand in violinists and between two
violinists in group playing. Data was recorded with different position sensors,
a micro camcorder fixed on a violin and an acceleration sensor placed on the
bow. Sensor measurements were obtained at a high sampling rate, gathering the
data with a small microcontroller unit, connected with a laptop computer. To
capture bow's position, rotation and angle directly on the bow to string
contact point, the micro camcorder was fixed near the bridge. Main focuses of
interest were the changes of the left-hand fingers, the temporal synchronization
of the left-hand fingers with the right hand, the close-up view of the
bow-to-string contact point, and the contact of the left-hand finger and/or string with
the fingerboard. Seven violinists, from beginners to master class students,
played scales in different rhythms, speeds and bowings and music excerpts of
free choice while being recorded. One measurement with two violinists was made to
examine the time differences between the two musicians while playing together. For
simple integration of a conventional violin into electronic music environments,
left-hand sensor data were converted to MIDI and OSC as an example. Keywords: Strings, violin, coordination, left, finger, right, hand | |||
| Gest-O: Performer gestures used to expand the sounds of the saxophone | | BIBAK | PDF | 262 | |
| Jonh Melo; Daniel Gómez; Miguel Vargas | |||
| This paper describes the conceptualization and development of an open source
tool for controlling the sound of a saxophone via the gestures of its
performer. The motivation behind this work is the need for easily accessible tools to
explore, compose and perform electroacoustic music in Colombian music schools
and conservatories. This work led to the adaptation of common hardware to be
used as a sensor attached to an acoustic instrument, and to the development of
software applications to record, visualize and map performers' gesture data into
signal-processing parameters. The scope of this work suggested focusing
on a specific instrument, so the saxophone was chosen. Gestures were
selected in an iterative process with the performer, although a more ambitious
strategy for identifying the main gestures of an instrument's performance was first
defined. Detailed gesture-to-sound processing mappings are presented in the text.
An electroacoustic musical piece was successfully rehearsed and recorded using
the Gest-O system. Keywords: Electroacoustic music, saxophone, expanded instrument, gesture | |||
| The Human Skin as an Interface for Musical Expression | | BIBAK | PDF | 177 | |
| Alexander Müller; Jochen Fuchs | |||
| This paper discusses the utilization of human skin as a tangible interface
for musical expression and collaborative performance. We present an overview of
different existing instrument designs that include the skin as the main input.
As a further development of a previous exploration [16] we outline the setup
and interaction methods of 'Skintimacy', an instrument that appropriates the
skin for low voltage power transmission in multi-player interaction.
Observations deriving from proof-of-concept exploration and performances using
the instrument are brought into the reflection and discussion concerning the
capabilities and limitations of skin as an input surface. Keywords: Skin-based instruments, skin conductivity, collaborative interfaces,
embodiment, intimacy, multi-player performance | |||
| Making Sound Synthesis Accessible for Children | | BIBAK | PDF | 181 | |
| Christoph Trappe | |||
| In this paper we present our project to make sound synthesis and music
controller construction accessible to children in a technology design workshop.
We present the work we have carried out to develop a graphical user interface,
and give an account of the workshop we conducted in collaboration with a local
primary school. Our results indicate that the production of audio events by
means of digital synthesis and algorithmic composition provides a rich and
interesting field to be discovered for pedagogical workshops taking a
Constructionist approach. Keywords: Child Computer Interaction, Constructionism, Sound and Music Computing,
Human-Computer Interface Design, Music Composition and Generation, Interactive
Audio Systems, Technology Design Activities | |||
| Developing the Dance Jockey System for Musical Interaction with the Xsens MVN Suit | | BIBA | PDF | 182 | |
| Ståle A. Skogstad; Kristian Nymoen; Yago de Quay; Alexander Refsum Jensenius | |||
| In this paper we present the Dance Jockey System, a system developed for using a full body inertial motion capture suit (Xsens MVN) in music/dance performances. We present different strategies for extracting relevant postures and actions from the continuous data, and how these postures and actions can be used to control sonic and musical features. The system has been used in several public performances, and we believe it has great potential for further exploration. However, to overcome the current practical and technical challenges when working with the system, it is important to further refine tools and software in order to facilitate making of new performance pieces. | |||
| Introducing CrossMapper: Another Tool for Mapping Musical Control Parameters | | BIBAK | PDF | 189 | |
| Liam O'Sullivan; Dermot Furlong; Frank Boland | |||
| Development of new musical interfaces often requires experimentation with
the mapping of available controller inputs to output parameters. Useful
mappings for a particular application may be complex in nature, with one or
more inputs being linked to one or more outputs. Existing development
environments are commonly used to program such mappings, while code libraries
provide powerful data-stream manipulation. However, room exists for a
standalone application with a simpler graphical user interface for dynamically
patching between inputs and outputs. This paper presents an early prototype
version of a software tool that allows the user to route control signals in
real time, using various messaging formats. It is cross-platform and runs as a
standalone application in desktop and Android OS versions. The latter allows
the users of mobile devices to experiment with mapping signals to and from
physical computing components using the inbuilt multi-touch screen. Potential
uses therefore include real-time mapping during performance in a more
expressive manner than facilitated by existing tools. Keywords: Mapping, Software Tools, Android | |||
| Music for Flesh II: informing interactive music performance with the viscerality of the body system | | BIBAK | PDF | 133 | |
| Marco Donnarumma | |||
Performing music with a computer and loudspeakers always represents a
challenge. The lack of a traditional instrument requires the performer to study
idiomatic strategies by which musicianship becomes apparent. On the other hand,
the audience needs to decode those strategies so as to achieve an understanding
and appreciation of the music being played. The issue is particularly relevant
to the performance of music that results from the mediation between biological
signals of the human body and physical performance.
The present article tackles this concern by demonstrating a new model of musical performance: what I define as biophysical music. This is music generated and played in real time by amplifying and processing the acoustic sound of a performer's muscle contractions. The model relies on an original and open-source technology made of custom biosensors and a related software framework. The successful application of these tools is discussed in the practical context of a solo piece for sensors, laptop and loudspeakers. Finally, the compositional strategies that characterize the piece are discussed, along with a systematic description of the relevant mapping techniques and their sonic outcome. Keywords: Muscle sounds, biophysical music, augmented body, real-time performance,
human-computer interaction, embodiment | |||
| Simpletones: A System of Collaborative Physical Controllers for Novices | | BIBAK | PDF | 258 | |
| Francisco Zamorano | |||
| This paper introduces Simpletones, an interactive sound system that enables
a sense of musical collaboration for non-musicians. Participants can easily
create simple sound compositions in real time by collaboratively operating
physical artifacts as sound controllers. The physical configuration of the
artifacts requires coordinated actions between participants to control sound
(thus requiring, and emphasizing collaboration).
Simpletones encourages playful human-to-human interaction by introducing a simple interface and a set of basic rules [1]. This enables novices to focus on the collaborative aspects of making music as a group (such as synchronization and making collective decisions through non-verbal communication) to ultimately engage in a state of group flow [2]. This project is relevant to the contemporary discourse on musical expression because it allows novices to experience the social aspects of group music making, something that is usually reserved only for trained performers [3]. Keywords: Collaboration, Artifacts, Computer Vision, Color Tracking, State of Flow | |||
| SONIK SPRING | | BIBAK | PDF | 20 | |
| Tomas Henriques | |||
| The Sonik Spring is a portable and wireless digital instrument, created for
real-time synthesis and control of sound. It brings together different types of
sensory input, linking gestural motion and kinesthetic feedback to the
production of sound.
The interface consists of a 15-inch spring with unique flexibility, which allows multiple degrees of variation in its shape and length. The design of the instrument is described and its features discussed. Three performance modes are detailed highlighting the instrument's expressive potential and wide range of functionality. Keywords: Interface for sound and music, Gestural control of sound, Kinesthetic and
visual feedback | |||
| DrumTop: Playing with Everyday Objects | | BIBAK | PDF | 70 | |
| Akito van Troyer | |||
| We introduce a prototype of a new tangible step sequencer that transforms
everyday objects into percussive musical instruments. DrumTop adapts our
everyday task-oriented hand gestures with everyday objects as the basis of
musical interaction, resulting in an easily graspable musical interface for
musical novices. The sound, tactile, and visual feedback comes directly from
everyday objects as the players program drum patterns and rearrange the objects
on the tabletop interface. DrumTop encourages players to explore the
musical potential of their surroundings and be musically creative through
rhythmic interactions with everyday objects. The interface consists of
transducers that trigger a hit, causing the objects themselves to produce sound
when they are in close contact with the transducers. We discuss how we designed
and implemented our current DrumTop prototype and describe how players interact
with the interface. We then highlight the players' experience with DrumTop and
our plans for future work in the fields of music education and performance. Keywords: Tangible User Interfaces, Playful Experience, Percussion, Step Sequencer,
Transducers, Everyday Objects | |||
| The 'Interactive Music Awareness Program' (IMAP) for Cochlear Implant Users | | BIBAK | PDF | 109 | |
| Benjamin R. Oliver; Rachel M. van Besouw; David R. Nicholls | |||
| There is some evidence that structured training can benefit cochlear implant
(CI) users' appraisal of music as well as their music perception abilities.
There are currently very limited music training resources available for CI
users to explore. This demonstration will introduce delegates to the
'Interactive Music Awareness Program' (IMAP) for cochlear implant users, which
was developed in response to the need for a client-centered, structured,
interactive, creative, open-ended, educational and challenging music
(re)habilitation resource. Keywords: music, cochlear implants, perception, rehabilitation, auditory training,
interactive learning, client-centred software | |||
| The Sound Space as Musical Instrument: Playing Corpus-Based Concatenative Synthesis | | BIBAK | PDF | 120 | |
| Diemo Schwarz | |||
| Corpus-based concatenative synthesis is a fairly recent sound synthesis
method, based on descriptor analysis of any number of existing or live-recorded
sounds, and synthesis by selection of sound segments from the database matching
given sound characteristics. It is well described in the literature, but has
been rarely examined for its capacity as a new interface for musical
expression. The interesting outcome of such an examination is that the actual
instrument is the space of sound characteristics, through which the performer
navigates with gestures captured by various input devices. We will take a look
at different types of interaction modes and controllers (positional, inertial,
audio analysis) and the gestures they afford, and provide a critical assessment
of their musical and expressive capabilities, based on several years of musical
experience, performing with the CataRT system for real-time CBCS. Keywords: CataRT, corpus-based concatenative synthesis, gesture | |||
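The abstract describes synthesis by selecting, from a database, the sound segments that match given sound characteristics. The toy nearest-neighbor selection below makes that concrete; the data layout and descriptor tuples are assumptions for illustration, not CataRT itself.

```python
# Corpus-based concatenative synthesis in miniature (a sketch, not CataRT):
# each corpus segment carries a descriptor vector, and playback selects the
# segment closest to a target point that the performer moves through the
# descriptor space with a gestural controller.
import math

corpus = [
    {"file": "a.wav", "onset": 0.00, "descr": (0.2, 0.8)},  # (loudness, brightness)
    {"file": "a.wav", "onset": 0.25, "descr": (0.7, 0.3)},
    {"file": "b.wav", "onset": 1.10, "descr": (0.5, 0.5)},
]

def select_segment(target):
    """Return the corpus segment whose descriptors are nearest to `target`."""
    return min(corpus, key=lambda seg: math.dist(seg["descr"], target))

print(select_segment((0.6, 0.4)))  # controller position in descriptor space
```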
| SenSynth: a Mobile Application for Dynamic Sensor to Sound Mapping | | BIBAK | PDF | 149 | |
| Ryan McGee; Daniel Ashbrook; Sean White | |||
| SenSynth is an open-source mobile application that allows for arbitrary,
dynamic mapping between several sensors and sound synthesis parameters. In
addition to synthesis techniques commonly found on mobile devices, SenSynth
includes a scanned synthesis source for the audification of sensor data. Using
SenSynth, we present a novel instrument based on the audification of
accelerometer data and introduce a new means of mobile synthesis control via a
wearable magnetic ring. SenSynth also employs a global pitch quantizer so one
may adjust the level of virtuosity required to play any instruments created via
mapping. Keywords: mobile music, sonification, audification, mobile sensors | |||
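The global pitch quantizer mentioned above maps naturally onto a small function. The sketch below is one plausible reading, with an `amount` parameter standing in for the adjustable "level of virtuosity"; the names and scale choice are assumptions, not SenSynth internals.

```python
# Hypothetical pitch quantizer: snap a raw sensor-driven frequency toward
# the nearest note of a chosen scale by an adjustable amount.
import math

MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within an octave

def quantize(freq_hz, scale=MAJOR, amount=1.0):
    """amount=0 leaves pitch untouched; amount=1 snaps fully to the scale."""
    midi = 69 + 12 * math.log2(freq_hz / 440.0)
    octave, pc = divmod(midi, 12)
    nearest = min(scale + [scale[0] + 12], key=lambda s: abs(s - pc))
    snapped = octave * 12 + nearest
    out_midi = midi + amount * (snapped - midi)
    return 440.0 * 2 ** ((out_midi - 69) / 12)

print(round(quantize(450.0), 1))  # pulled down to A4 = 440.0 Hz
```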
| The electrumpet, additions and revisions | | BIBAK | PDF | 271 | |
| Hans Leeuw | |||
| This short paper follows an earlier NIME paper [1] describing the invention
and construction of the Electrumpet. Revisions and playing experience are both
part of the current paper. The Electrumpet can be heard in the performance
given by Hans Leeuw and Diemo Schwarz at this NIME conference. Keywords: NIME, Electrumpet, live-electronics, hybrid instruments | |||
| Borderlands -- An Audiovisual Interface for Granular Synthesis | | BIBAK | PDF | 152 | |
| Chris Carlson; Ge Wang | |||
| Borderlands is a new interface for composing and performing with granular
synthesis. The software enables flexible, real-time improvisation and is
designed to allow users to engage with sonic material on a fundamental level,
breaking free of traditional paradigms for interaction with this technique. The
user is envisioned as an organizer of sound, simultaneously assuming the roles
of curator, performer, and listener. This paper places the software within the
context of painterly interfaces and describes the user interaction design and
synthesis methodology. Keywords: Granular synthesis, painterly interfaces, improvisation, organized sound,
NIME, CCRMA | |||
| A Voice Interface for Sound Generators: adaptive and automatic mapping of gestures to sound | | BIBAK | PDF | 57 | |
| Stefano Fasciani; Lonce Wyse | |||
| Sound generators and synthesis engines expose a large set of parameters,
allowing run-time timbre morphing and exploration of sonic space. However,
control over these high-dimensional interfaces is constrained by the physical
limitations of performers. In this paper we propose the exploitation of vocal
gesture as an extension or alternative to traditional physical controllers. The
approach uses dynamic aspects of vocal sound to control variations in the
timbre of the synthesized sound. The mapping from vocal to synthesis parameters
is automatically adapted to information extracted from vocal examples as well
as to the relationship between parameters and timbre within the synthesizer.
The mapping strategy aims to maximize the breadth of the explorable perceptual
sonic space over a set of the synthesizer's real-valued parameters, indirectly
driven by the voice-controlled interface. Keywords: Voice Control, Adaptive Interface, Automatic Mapping, Timbre Morphing, Sonic
Space Exploration | |||
| The Dual-Analog Gamepad as a Practical Platform for Live Electronics Instrument and Interface Design | | BIBAK | PDF | 73 | |
| Christopher Ariza | |||
| This paper demonstrates the practical benefits and performance opportunities
of using the dual-analog gamepad as a controller for real-time live
electronics. Numerous diverse instruments and interfaces, as well as detailed
control mappings, are described. Approaches to instrument and preset switching
are also presented. While all of the instrument implementations presented are
made available through the Martingale Pd library, resources for other synthesis
languages are also described. Keywords: Controllers, live electronics, dual-analog, gamepad, joystick, computer
music, instrument, interface | |||
| MuDI -- Multimedia Digital Instrument for Composing and Performing Digital Music for Films in Real-time | | BIBAK | PDF | 64 | |
| Pedro Patrício | |||
| This article proposes a wireless handheld multimedia digital instrument,
which allows one to compose and perform digital music for films in real-time.
Not only does it allow the performer and the audience to follow the film images
in question, but also the relationship between the gestures performed and the
sound generated. Furthermore, it allows one to have an effective control over
the sound, and consequently achieve great musical expression. In addition, we
present a method for calibrating the multimedia digital instrument, devised to
overcome the accelerometer's lack of a reliable reference point, together with
a process for obtaining a video score. This instrument has been used in a
number of concerts in Portugal and Brazil to test its robustness. Keywords: Digital musical instrument, mobile music performance, real-time musical
composition, digital sound synthesis | |||
| The body as mediator of music in the Emotion Light | | BIBAK | PDF | 167 | |
| Adinda Rosa van 't Klooster | |||
| This paper describes the development of the Emotion Light, an interactive
biofeedback artwork where the user listens to a piece of electronic music
whilst holding a semi-transparent sculpture that tracks his/her bodily
responses and translates these into changing light patterns that emerge from
the sculpture. The context of this work is briefly described and the questions
it poses are derived from interviews held with audience members. Keywords: Interactive biofeedback artwork, music and emotion, novel interfaces,
practice based research, bodily response, heart rate, biosignals, affective
computing, aesthetic interaction, mediating body, biology inspired system | |||
| Studying Aesthetics in a Musical Interface Design Process Through 'Aesthetic Experience Prism' | | BIBAK | PDF | 226 | |
| Matti Luhtala; Ilkka Niemeläinen; Johan Plomp; Markku Turunen; Julius Tuomisto | |||
| This paper introduces 'The Aesthetic Experience Prism', a framework for
studying how components of aesthetic experience materialize in the models of
interaction of novel musical interfaces, as well as how the role of aesthetics
could be made more explicit in the process of designing interaction for
musical technologies. The Aesthetic Experience Prism makes use of Arthur
Danto's framework of aesthetic experience that consists of three conceptual
entities: (1) metaphor; (2) expression; and (3) style. In this paper we present
key questions driving the research, theoretical background, artistic research
approach and user research activities.
In the DIYSE project a proof-of-concept music creation system prototype was developed in a collaborative design setting. The prototype provides means for the performer to create music with minimum effort while allowing for versatile interaction. We argue that by using an artistic research approach specifically targeting design for aesthetic experience, we were able to transform the knowledge from early design ideas into resulting technology products in which models of interaction, metaphors, expression and style play an apparent role. Keywords: Aesthetics, Interaction Design, Artistic Research, Exploration | |||
| Sinkapater -- An Untethered Beat Sequencer | | BIBAK | PDF | 308 | |
| Jiffer Harriman | |||
| This paper provides an overview of a new method for approaching beat
sequencing. As we have come to know them, drum machines provide the means to
loop rhythmic patterns over a certain interval, usually with the option to
specify different beat divisions. What I developed and propose for
consideration is a rethinking of the traditional drum machine's confines. The
Sinkapater is an untethered beat sequencer in that the beat division and the
loop length can be arbitrarily modified for each track. The result is the
capability to create complex syncopated patterns which evolve over time as
different tracks follow their own loop rates. To keep cohesion, all channels
can be locked to a master channel, forcing each loop to be an integer number
of "Master Beats". Further, a visualization mode enables exploring the
patterns in another new way: using synchronized OpenGL, a 3-dimensional
environment visualizes the beats as droplets falling from faucets whose
heights are determined by the loop lengths. Waves form in the bottom as beats
splash into the virtual "sink". By combining compelling visuals with a new
approach to sequencing, a new way of exploring beats and experiencing music
has been created. Keywords: NIME, proceedings, drum machine, sequencer, visualization | |||
| LoopJam: turning the dance floor into a collaborative instrumental map | | BIBAK | PDF | 260 | |
| Christian Frisson; Stéphane Dupont; Julien Leroy; Alexis Moinet; Thierry Ravet; Xavier Siebert; Thierry Dutoit | |||
| This paper presents the LoopJam installation which allows participants to
interact with a sound map using a 3D computer vision tracking system. The sound
map results from similarity-based clustering of sounds. The playback of these
sounds is controlled by the positions or gestures of participants tracked with
a Kinect depth-sensing camera. The beat-inclined bodily movements of
participants in the installation are mapped to the tempo of played sounds,
while the playback speed is synchronized by default among all sounds. We
presented and tested an early version of the installation at three exhibitions
in Belgium, Italy and France. The reactions of participants ranged from
curiosity to amusement. Keywords: Interactive music systems and retrieval, user interaction and interfaces,
audio similarity, depth sensors | |||
| PocoPoco: A Kinetic Musical Interface With Electro-Magnetic Levitation Units | | BIBAK | PDF | 232 | |
| Yuya Kikukawa; Takaharu Kanai; Tatsuhiko Suzuki; Toshiki Yoshiike; Tetsuaki Baba; Kumiko Kushiyama | |||
| We developed original solenoid actuator units with several built-in sensors,
and produced a box-shaped musical interface, "PocoPoco", using 16 of these
units as a universal input/output device. We apply the up-and-down movement of
the solenoid units and the user's intuitive input to the musical interface.
Using transformation of the physical interface, we can apply the movement of
the units to new interaction designs. At the same time, we intend to suggest a
new interface whose movement itself can attract the user. Keywords: musical interface, interaction design, tactile, moving, kinetic | |||
| Augmented Piano Performance using a Depth Camera | | BIBAK | PDF | 203 | |
| Qi Yang; Georg Essl | |||
| We augment the piano keyboard with a 3D gesture space using Microsoft Kinect
for sensing and top-down projection for visual feedback. This interface
provides multi-axial gesture controls to enable continuous adjustments to
multiple acoustic parameters, such as those found on typical digital synthesizers.
We believe that using gesture control is more visceral and aesthetically
pleasing, especially during concert performance where the visibility of the
performer's action is important. Our system can also be used for other types of
gesture interaction as well as for pedagogical applications. Keywords: NIME, piano, depth camera, musical instrument, gesture, tabletop projection | |||
| TC-11: A Programmable Multi-Touch Synthesizer for the iPad | | BIBAK | PDF | 230 | |
| Kevin Schlei | |||
| This paper describes the design and realization of TC-11, a software
instrument based on programmable multi-point controllers. TC-11 is a modular
synthesizer for the iPad that uses multi-touch and device motion sensors for
control. It has a robust patch programming interface that centers around
multi-point controllers, providing powerful flexibility. This paper details the
origin, design principles, programming implementation, and performance results
of TC-11. Keywords: TC-11, iPad, multi-touch, multi-point, controller mapping, synthesis
programming | |||
| The Planetarium as a Musical Instrument | | BIBAK | PDF | 47 | |
| Dale Parson; Phillip Reed | |||
| With the advent of high resolution digital video projection and high quality
spatial sound systems in modern planetariums, the planetarium can become the
basis for a unique set of virtual musical instrument capabilities that go well
beyond packaged multimedia shows. The dome, circular speaker and circular
seating arrangements provide means for skilled composers and performers to
create a virtual reality in which attendees are immersed in the composite
instrument.
This initial foray into designing an audio-visual computer-based instrument for improvisational performance in a planetarium builds on prior, successful work in mapping the rules and state of two-dimensional computer board games to improvised computer music. The unique visual and audio geometries of the planetarium present challenges and opportunities. The game tessellates the dome in mobile, colored hexagons that emulate both atoms and musical scale intervals in an expanding universe. Spatial activity in the game maps to spatial locale and instrument voices in the speakers, in essence creating a virtual orchestra with a string section, percussion section, etc. on the dome. Future work includes distribution of game play via mobile devices to permit attendees to participate in a performance. This environment is open-ended, with great educational and aesthetic potential. Keywords: aleatory music, algorithmic improvisation, computer game, planetarium | |||
| SoundStrand: a Tangible Interface for Composing Music with Limited Degrees of Freedom | | BIBAK | PDF | 125 | |
| Eyal Shahar | |||
| SoundStrand is a tangible music composition tool. It demonstrates a paradigm
developed to enable music composition through the use of tangible interfaces.
This paradigm attempts to overcome the contrast between the relatively small of
amount degrees of freedom usually demonstrated by tangible interfaces and the
vast number of possibilities that musical composition presents.
SoundStrand is comprised of a set of physical objects called cells, each representing a musical phrase. Cells can be sequentially connected to each other to create a musical theme. Cells can also be physically manipulated to access a wide range of melodic, rhythmic and harmonic variations. The SoundStrand software assures that as the cells are manipulated, the melodic flow, harmonic transitions and rhythmic patterns of the theme remain musically plausible while preserving the user's intentions. Keywords: Tangible, algorithmic, composition, computer assisted | |||
| The Music Ball Project: Concept, Design, Development, Performance | | BIBAK | PDF | 161 | |
| Alexander Refsum Jensenius; Arve Voldsund | |||
| We report on the Music Ball Project, a long-term, exploratory project focused
on creating novel instruments/controllers with a spherical shape as the common
denominator. Besides a simple and attractive geometrical shape, balls afford
many different types of use, including play. This has made our music balls
popular among widely different groups of people, from toddlers to seniors,
including those that would not otherwise engage with a musical instrument. The
paper summarises our experience of designing, constructing and using a number
of music balls of various sizes and with different types of sound-producing
elements. Keywords: music balls, instruments, controllers, inexpensive | |||
| Many-Person Instruments for Computer Music Performance | | BIBAK | PDF | 171 | |
| Michael Rotondo; Nick Kruge; Ge Wang | |||
| In this paper we explore the concept of instruments which are played by more
than one person, and present two case studies. We designed, built and performed
with Feedbørk, a two-player instrument comprising two iPads which form a
video feedback loop, and Barrel, a nine-player instrument made up of eight
Gametrak controllers fastened to a steel industrial barrel. By splitting the
control of these instruments into distinct but interdependent roles, we allow
each individual to easily play a part while retaining a rich complexity of
output for the whole system. We found that the relationships between those
roles had a significant effect on how the players communicated with each other,
and on how the performance was perceived by the audience. Keywords: Many person musical instruments, cooperative music, asymmetric interfaces,
transmodal feedback | |||
| Kritaanjli: A Robotic Harmonium for Performance, Pedagogy and Research | | BIBAK | PDF | 99 | |
| Ajay Kapur; Jim Murphy; Dale Carnegie | |||
| In this paper, we introduce Kritaanjli, a robotic harmonium. Details
concerning the design, construction, and use of Kritaanjli are discussed. After
an examination of related work, quantitative research concerning the hardware
chosen in the construction of the instrument is shown, as is a thorough
exposition of the design process and use of CAD/CAM techniques in the design
lifecycle of the instrument. Additionally, avenues for future work and
compositional practices are focused upon, with particular emphasis placed on
human/robot interaction, pedagogical techniques afforded by the robotic
instrument, and compositional avenues made accessible through the use of
Kritaanjli. Keywords: Musical Robotics, pedagogy, North Indian Classical Music, augmented
instruments | |||
| Bubble Drum-agog-ing: Polyrhythm Games & Other Inter Activities | | BIBAK | PDF | 8 | |
| Jay Alan Jackson | |||
| This paper describes the bubble drum set, along with several polyrhythm
games and interactive music activities that have been developed to show its
potential for use as an input controller. The bubble drum set combines various
sizes of colorful exercise balls, held in place or suspended with conventional
drum hardware and thus creating a trap kit configuration in which the spherical
surfaces can be struck and stroked from varying angles using sticks, brushes,
or even by hands alone. The acoustic properties of these fitness balls are
surprisingly rich, capable of producing subtle differences in timbre while
being responsive over a wide dynamic range. The entire set has been
purposefully designed to provide a player with the means to achieve a rigorous
and healthy physical workout, in addition to the beneficial cognitive
and sensory stimulation that comes from playing music with a sensitive and
expressive instrument. Keywords: Bubble Drums, WaveMachine Lab's Drumagog, Polyrhythms | |||
| The 'Interface' in Site-Specific Sound Installation | | BIBAK | PDF | 175 | |
| Kirsty Beilharz; Aengus Martin | |||
| In site-specific installation or situated media, a significant part of the
"I" in NIME is the environment, the site and the implicit features of site such
as humans, weather, materials, natural acoustics, etc. These could be viewed as
design constraints, as features, or even as agents determining the outcome of
responsive sound installation works. This paper discusses the notion of the
interface in public (especially outdoor) installation, taking the authors'
recent Sculpture by the Sea work Windtraces as its launch-pad, with reference
to ways in which others have approached the question through a brief traverse
of related sensor-based, weather-activated outdoor installations, e.g. works
by Garth Paine, James Bulley and Daniel Jones, and David Bowen. This is a
dialogical paper on the topic of the interface and 'site' as the aetiology of
interaction/interface/instrument and its type of response (e.g. to
environment and audience). While the focus here is on outdoor
factors (particularly the climatic environment), indoor site-specific
installation also experiences the effects of ambient noise, acoustic context,
and audience as integral agents in the interface and perception of the work,
its musical expression. The way in which features of the situation are
integrated has relevance for others in the NIME community in the design of
responsive spaces, art installation, and large-scale or installed instruments
in which users, participants, acoustics play a significant role. Keywords: NIME, site-specific installation, outdoor sound installation | |||
| Non-invasive sensing and gesture control for pitched percussion hyper-instruments using the Kinect | | BIBA | PDF | 297 | |
| Shawn Trail; Michael Dean; Gabrielle Odowichuk; Tiago Fernandes Tavares; Peter Driessen; W. Andrew Schloss; George Tzanetakis | |||
| Hyper-instruments extend traditional acoustic instruments with sensing technologies that digitally capture subtle and sophisticated aspects of human performance. They leverage the long training and skills of performers while simultaneously providing rich possibilities for digital control. Many existing hyper-instruments suffer from being one-of-a-kind instruments that require invasive modifications to the underlying acoustic instrument. In this paper we focus on the pitched percussion family and describe a non-invasive sensing approach for extending them to hyper-instruments. Our primary concern is to retain the technical integrity of the acoustic instrument and its sound production methods while still being able to interface intuitively with the computer. This is accomplished by utilizing the Kinect sensor to track the position of the mallets without any modification to the instrument, which enables easy and cheap replication of the proposed hyper-instrument extensions. In addition we describe two approaches to higher-level gesture control that remove the need for additional control devices such as foot pedals and fader boxes that are frequently used in electro-acoustic performance. This gesture control integrates more organically with the natural flow of playing the instrument, providing user-selectable control over filter parameters, synthesis, sampling, sequencing, and improvisation using a commercially available low-cost sensing apparatus. | |||
| Real-time Modification of Music with Dancer's Respiration Pattern | | BIBAK | PDF | 309 | |
| Jeong-seob Lee; Woon Seung Yeo | |||
| This research aims to improve the correspondence between music and dance,
and explores the use of the human respiration pattern for musical applications,
focusing on the motional aspect of breathing. While respiration is frequently
considered an indicator of the metabolic state of the human body that contains
meaningful information for medicine or psychology, the motional aspect of
respiration has remained relatively unnoticed in spite of its strong
correlation with muscles and the brain. This paper introduces an interactive
system that controls music playback for dance performances based on the
respiration pattern of the dancer. A wireless wearable sensor device detects
the dancer's respiration, which is then used to modify the dynamics of the music. Two
different respiration-dynamic mappings were designed and evaluated through
public performances and private tests by professional choreographers. Results
from this research suggest a new conceptual approach to musical applications of
respiration based on the technical characteristics of music and dance. Keywords: Music, dance, respiration, correspondence, wireless interface, interactive
performance | |||
| Performing experimental music by physical simulation | | BIBAK | PDF | 30 | |
| Julien Castet | |||
| This paper presents ongoing work on methods dedicated to relations between
composers and performers in the context of experimental music. The computer
music community has over the last decade shown a strong interest in various
kinds of gestural interfaces for controlling sound synthesis processes. The
mapping between gesture and sound parameters has especially been investigated
in order to design the most relevant schemes of sonic interaction. In fact,
this relevance results in an aesthetic choice that encroaches on the process
of composition. This work proposes to examine the relations between composers
and performers in the context of new interfaces for musical expression. It
aims to define a theoretical and methodological framework clarifying these
relations. Within this project, this paper presents a first experimental study
of the use of physical models as gestural maps for the production of textural
sounds. Keywords: Simulation, Interaction, Sonic textures | |||
| Wireless Interactive Sensor Platform for Real-Time Audio-Visual Experience | | BIBAK | PDF | 98 | |
| Jia-Liang Lu; Da-Lei Fang; Yi Qin; Jiu-Qiang Tang | |||
| The WIS platform is a wireless interactive sensor platform designed to
support dynamic and interactive applications. The platform consists of a
capture system which includes multiple on-body ZigBee-compatible motion
sensors, a processing unit and an audio-visual display control unit. It has a
completely open architecture and provides interfaces for interacting with
other user-designed applications, making the WIS platform highly extensible.
Through gesture recognition by on-body sensor nodes and data processing, the
WIS platform can offer real-time audio and visual experiences to the users.
Based on this platform, we set up a multimedia installation that presents a
new interaction model between the participants and the audio-visual
environment. Furthermore, we are also trying to apply the WIS platform to
other installations and performances. Keywords: Interactive, Audio-visual experience | |||
| The Gesturally Extended Piano | | BIBAK | PDF | 102 | |
| William Brent | |||
| This paper introduces the Gesturally Extended Piano -- an augmented instrument
controller that relies on information drawn from performer motion tracking in
order to control real-time audiovisual processing and synthesis. Specifically,
the positions, heights, velocities, and relative distances and angles of points
on the hands and forearms are followed. Technical details and installation of
the tracking system are covered, as well as strategies for interpreting and
mapping the resulting data in relation to synthesis parameters. Design factors
surrounding mapping choices and the interrelation between mapped parameters are
also considered. Keywords: Augmented instruments, controllers, motion tracking, mapping | |||
| Electric Slide Organistrum | | BIBAK | PDF | 114 | |
| Martin Piñeyro | |||
| The Electric Slide Organistrum (Figure 1) is an acoustic stringed instrument
played through a video capture system. The vibration of the instrument string
is generated electromagnetically and the pitch variation is achieved by
movements carried out by the player in front of a video camera. This instrument
results from integrating an ancient sound-production technique -- the vibration
of a string over a soundbox -- with current human-computer interaction
technology such as motion detection. Keywords: Gestural Interface, eBow, Pickup, Bowed string, Electromagnetic actuation | |||
| NIME Education at the HKU, Emphasizing performance | | BIBAK | PDF | 247 | |
| Hans Leeuw; Jorrit Tamminga | |||
| This position paper stresses the role and importance of performance-based
education in NIME-like subjects. It describes the 'klankontwerp' learning
line at the 'school of the arts Utrecht' in its Music Technology department.
Our educational system also reflects the way that we could treat performance
in the NIME community as a whole: performing with our instruments, other than
in the form of a mere demonstration, should get more emphasis. Keywords: NIME, education, position paper, live electronics, performance | |||
| First Life -- Imagining the Chemical Origins of Life | | BIB | - | |
| Steve Everett | |||
| Granular Learning Objects for Instrument Design and Collaborative Performance in K-12 Education | | BIBAK | PDF | 315 | |
| Ivica Bukvic; Liesl Baum; Bennett Layman; Kendall Woodard | |||
| In the following paper we propose a new tiered granularity approach to
developing modules or abstractions in the PdL2Ork visual multimedia programming
environment with the specific goal of devising creative environments that scale
their educational scope and difficulty to encompass several stages within the
context of primary and secondary (K-12) education. As part of a preliminary
study, the team designed modules targeting 4th and 5th grade students, the
primary focus being exploration of creativity and collaborative learning. The
resulting environment infrastructure -- coupled with the Boys & Girls Club
of Southwest Virginia Satellite Linux Laptop Orchestra -- offers opportunities
for students to design and build original instruments, master them through a
series of rehearsals, and ultimately utilize them as part of an ensemble in a
performance of a predetermined piece whose parameters are coordinated by the
instructor through an embedded networked module. The ensuing model will serve
for the assessment and development of a stronger connection with content-area
standards and the development of creative thinking and collaboration skills. Keywords: Granular, Learning Objects, K-12, Education, L2Ork, PdL2Ork | |||
| Kinetic Light Drums / Community Beacons | | BIB | - | |
| Matthew McCormack; Jenn Figg | |||
| DIRTI -- Dirty Tangible Interfaces | | BIBAK | PDF | 212 | |
| Matthieu Savary; Diemo Schwarz; Denis Pellerin | |||
| Dirty Tangible Interfaces (DIRTI) are a new concept in interface design that
forgoes the dogma of repeatability in favor of a richer and more complex
experience, constantly evolving, never reversible, and infinitely modifiable.
We built a prototype based on granular or liquid interaction material placed in
a glass dish, that is analyzed by video tracking for its 3D relief. This
relief, and the dynamic changes applied to it by the user, are interpreted as
activation profiles to drive corpus-based concatenative sound synthesis,
allowing one or more players to mold sonic landscapes and to plow through them
in an inherently collaborative, expressive, and dynamic experience. Keywords: Tangible interface, Corpus-based concatenative synthesis, Nonstandard
interaction | |||
| Tweet Harp: Laser Harp Generating Voice and Text of Real-time Tweets in Twitter | | BIBAK | PDF | 66 | |
| Ayaka Endo; Takuma Moriyama; Yasuo Kuhara | |||
| Tweet Harp is a musical instrument using Twitter and a laser harp. This
instrument features the use of the human voice speaking tweets from Twitter as
the sound material for music. It is played by touching the harp's six strings
of laser beams. Tweet Harp fetches the latest tweets from Twitter in real
time, and it creates music like a song with unexpected words. It also creates
animation displaying the texts at the same time, so the audience can visually
enjoy the performance through sounds synchronized with animation. If audience
members have a Twitter account, they can participate in the performance by
tweeting. Keywords: Twitter, laser harp, text, speech, voice, AppleScript, Quartz Composer,
Max/MSP, TTS, Arduino | |||
| MAGE -- A Platform for Tangible Speech Synthesis | | BIBAK | PDF | 164 | |
| Maria Astrinaki; Nicolas d'Alessandro; Thierry Dutoit | |||
| In this paper, we describe our pioneering work in developing speech
synthesis beyond the Text-To-Speech paradigm. We introduce tangible speech
synthesis as an alternate way of envisioning how artificial speech content can
be produced. Tangible speech synthesis refers to the ability, for a given
system, to provide some physicality and interactivity to important speech
production parameters. We present MAGE, our new software platform for
high-quality reactive speech synthesis, based on statistical parametric
modeling and more particularly hidden Markov models. We also introduce a new
HandSketch-based musical instrument. This instrument brings pen and posture
based interaction on the top of MAGE, and demonstrates a first proof of
concept. Keywords: speech synthesis, Hidden Markov Models, tangible interaction, software
library, MAGE, HTS, performative | |||
| A Digital Mobile Choir: Joining Two Interfaces towards Composing and Performing Collaborative Mobile Music | | BIBAK | PDF | 310 | |
| Nicolas d'Alessandro; Aura Pon; Johnty Wang; David Eagle; Ehud Sharlin; Sidney Fels | |||
| We present the integration of two musical interfaces into a new music-making
system that seeks to capture the experience of a choir and bring it into the
mobile space. This system relies on three pervasive technologies that each
support a different part of the musical experience. First, the mobile device
application for performing with an artificial voice, called ChoirMob. Then, a
central composing and conducting application running on a local interactive
display, called Vuzik. Finally, a network protocol to synchronize the two.
ChoirMob musicians can perform music together at any location where they can
connect to a Vuzik central conducting device displaying a composed piece of
music. We explored this system by creating a chamber choir of ChoirMob
performers, consisting of both experienced musicians and novices, that
performed in rehearsals and live concert scenarios with music composed using
the Vuzik interface. Keywords: singing synthesis, mobile music, interactive display, interface design, OSC,
ChoirMob, Vuzik, social music, choir | |||
| Real-Time Music Notation, Collaborative Improvisation, and Laptop Ensembles | | BIBAK | PDF | 62 | |
| Sang Won Lee; Jason Freeman; Andrew Collela | |||
| This paper describes recent extensions to LOLC, a text-based environment for
collaborative improvisation for laptop ensembles, which integrate acoustic
instrumental musicians into the environment. Laptop musicians author short
commands to create, transform, and share pre-composed musical fragments, and
the resulting notation is digitally displayed, in real time, to instrumental
musicians to sight-read in performance. The paper describes the background and
motivations of the project, outlines the design of the original LOLC
environment and describes its new real-time notation components in detail, and
explains the use of these new components in a musical composition, SGLC, by one
of the authors. Keywords: Real-time Music Notation, Live Coding, Laptop Orchestra | |||
| Drum Stroke Computing: Multimodal Signal Processing for Drum Stroke Identification and Performance Metrics | | BIBAK | PDF | 82 | |
| Jordan Hochenbaum; Ajay Kapur | |||
| In this paper we present a multimodal system for analyzing drum performance.
In the first example we perform automatic drum hand recognition utilizing a
technique for automatic labeling of training data using direct sensors, and
only indirect sensors (e.g. a microphone) for testing. Left/Right drum hand
recognition is achieved with an average accuracy of 84.95% for two performers.
Secondly we provide a study investigating multimodality dependent performance
metrics analysis. Keywords: Multimodality, Drum stroke identification, surrogate sensors, surrogate data
training, machine learning, music information retrieval, performance metrics | |||
| A Comparative User Study of Two Methods of Control on a Multi-Touch Surface for Musical Expression | | BIBAK | PDF | 94 | |
| Blake Johnston; Owen Vallis; Ajay Kapur | |||
| Mapping between musical interfaces, and sound engines, is integral to the
nature of an interface [3]. Traditionally, musical applications for touch
surfaces have directly mapped touch coordinates to control parameters. However,
recent work [9] is looking at new methods of control that use relational
multi-point analysis. Instead of directly using touch coordinates, which are
related to a global screen space, an initial touch is used as an 'anchor' to
create a local coordinate space in which subsequent touches can be located and
compared. This local coordinate space frees touches from being locked to one
single relationship, and allows for more complex interaction between touch
events. So far, this method has only been implemented on the small capacitive
touchpads of Apple computers. Additionally, there has yet to be a user study that
directly compares [9] against mappings of touch events within global coordinate
spaces. With this in mind, we have developed and evaluated two interfaces with
the aim of determining and quantifying some of these differences within the
context of our custom large multitouch surfaces [1]. Keywords: Multi-Touch, User Study, Relational-point interface | |||
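The anchor-based local coordinate space described in this abstract can be sketched in a few lines. The function below is a generic illustration with hypothetical names, not the studied interface's code.

```python
# Relational multi-point sketch: the first touch becomes an anchor defining
# a local coordinate space, and later touches are expressed as distance and
# angle relative to it instead of as absolute screen coordinates.
import math

def relational_params(anchor, touch):
    """Return (distance, angle in radians) of `touch` relative to `anchor`."""
    dx, dy = touch[0] - anchor[0], touch[1] - anchor[1]
    return math.hypot(dx, dy), math.atan2(dy, dx)

anchor = (0.40, 0.55)  # first touch, normalized screen coordinates
dist, angle = relational_params(anchor, (0.70, 0.75))
print(round(dist, 3), round(math.degrees(angle), 1))  # -> 0.361 33.7
```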
| Tok!: A Collaborative Acoustic Instrument using Mobile Phones | | BIBAK | PDF | 61 | |
| Sang Won Lee; Ajay Srinivasamurthy; Gregoire Tronel; Weibin Shen; Jason Freeman | |||
| Tok! is a collaborative acoustic instrument application for iOS devices
aimed at real time percussive music making in a colocated setup. It utilizes
the mobility of hand-held devices and transforms them into drumsticks to tap on
flat surfaces and produce acoustic music. Tok! is also networked and consists
of a shared interactive music score to which the players tap their phones,
creating a percussion ensemble. Through their social interaction and real-time
modifications to the music score, and through their creative selection of
tapping surfaces, the players can collaborate and dynamically create
interesting rhythmic music with a variety of timbres. Keywords: Mobile Phones, Collaboration, Social Interaction, Acoustic Musical
Instrument | |||
| A Reactive Environment for Dynamic Volume Control | | BIBA | PDF | 88 | |
| Dalia El-Shimy; Thomas Hermann; Jeremy Cooperstock | |||
| In this paper, we discuss the design and testing of a reactive environment for musical performance. Driven by the interpersonal interactions amongst musicians, our system gives users, i.e., several musicians playing together in a band, real-time control over certain aspects of their performance, enabling them to change volume levels dynamically simply by moving around. It differs most notably from the majority of ventures into the design of novel musical interfaces and installations in its multidisciplinary approach, drawing on techniques from Human-Computer Interaction, social sciences and ludology. Our User-Centered Design methodology was central to producing an interactive environment that enhances traditional performance with novel functionalities. During a formal experiment, musicians reported finding our system exciting and enjoyable. We also introduce some additional interactions that can further enhance the interactivity of our reactive environment. In describing the particular challenges of working with such a unique and creative user as the musician, we hope that our approach can be of guidance to interface developers working on applications of a creative nature. | |||
| Palm-area sensitivity to vibrotactile stimuli above 1 kHz | | BIBAK | PDF | 105 | |
| Lonce Wyse; Suranga Nanayakkara; Paul Seekings; Sim Heng Ong; Elizabeth Taylor | |||
| The upper limit of frequency sensitivity for vibrotactile stimulation of the
fingers and hand is commonly accepted as 1 kHz. However, during the course of
our research to develop a full-hand vibrotactile musical communication device
for the hearing-impaired, we repeatedly found evidence suggesting sensitivity
to higher frequencies. Most of the studies on which accepted limits of
vibrotactile sensitivity are based have been conducted using sine tones delivered by point-contact
actuators. The current study was designed to investigate vibrotactile
sensitivity using complex signals and full, open-hand contact with a flat
vibrating surface representing more natural environmental conditions.
Sensitivity to frequencies considerably higher than previously reported was
demonstrated for all the signal types tested. Furthermore, complex signals seem
to be more easily detected than sine tones, especially at low frequencies. Our
findings are applicable to a general understanding of sensory physiology, and
to the development of new vibrotactile display devices for music and other
applications. Keywords: Haptic Sensitivity, Hearing-impaired, Vibrotactile Threshold | |||
| WLAN trilateration for musical echolocation in the installation 'The Network Is A Blind Space' | | BIBAK | PDF | 142 | |
| Stelios Manousakis | |||
| This paper presents the system and technology developed for the distributed,
micro-telematic, interactive sound art installation, The Network Is A Blind
Space. The piece uses sound to explore the physical yet invisible
electromagnetic spaces created by Wireless Local Area Networks (WLANs). To this
end, the author created a framework for indoor WiFi localization, providing a
variety of control data for various types of 'musical echolocation'. This data,
generated mostly by visitors exploring the installation while holding
WiFi-enabled devices, is used to convey the hidden properties of wireless
networks as dynamic spaces through an artistic experience. Keywords: Network music, mobile music, distributed music, interactivity, sound art
installation, collaborative instrument, site-specific, electromagnetic signals,
WiFi, trilateration, traceroute, echolocation, SuperCollider, Pure Data, RjDj,
mapping | |||
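The indoor WiFi localization this abstract describes rests on trilateration. The following is a textbook 2D version under idealized assumptions (exact distances, which RSSI-derived estimates never are); it is not the installation's actual framework.

```python
# 2D trilateration sketch: given three access-point positions and estimated
# distances, subtract the circle equations pairwise to get a linear system
# and solve it for the visitor's position.
def trilaterate(p1, d1, p2, d2, p3, d3):
    """Solve for (x, y) from three circles |p - pi| = di (2D, exact case)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = d2**2 - d3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Access points at known positions; distances measured from a point at (2, 1):
print(trilaterate((0, 0), 5**0.5, (4, 0), 5**0.5, (0, 3), 8**0.5))  # (2.0, 1.0)
```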
| Strategies for Engagement in Computer-Mediated Musical Performance | | BIBAK | PDF | 162 | |
| James Nesfield | |||
| A general strategy for encouraging embodied engagement within musical
interface design is introduced. A pair of example implementations of this
strategy are described, one tangible and one graphical. As part of a
potentially larger set within our general approach, two separate relationships
are described termed 'decay and contribution' and 'instability and adjustment',
which are heavily dependent on the action requirements and timeliness of the
interaction. By suggesting that this process occurs on a timescale of less
than one second, it is hoped that attentiveness and engagement can be
encouraged, to the possible benefit of future developments in digital musical instrument design. Keywords: engagement, embodiment, flow, decay, instability, design, NIME | |||
| EnActor: A Blueprint for a Whole Body Interaction Design Software Platform | | BIBAK | PDF | 169 | |
| Vangelis Lympouridis | |||
| Through a series of collaborative research projects using Orient, a
wireless, inertial sensor-based motion capture system, I have studied the
requirements of musicians, dancers, performers and choreographers and
identified various design strategies for the realization of Whole Body
Interactive (WBI) performance systems. The acquired experience and knowledge
led to the design and development of EnActor, a prototype Whole Body
Interaction Design software package. The software has been realized as a
collection of modules that have proven valuable for the design of interactive
performance systems that are directly controlled by the body.
This paper presents EnActor's layout as a blueprint for the design and development of more sophisticated descendants. A complete video archive of my research projects in WBI performance systems is available at: http://www.inter-axions.com Keywords: Whole Body Interaction, Motion Capture, Interactive Performance Systems,
Interaction Design, Software Prototype | |||
| Considering Audience's View Towards an Evaluation Methodology for Digital Musical Instruments | | BIBAK | PDF | 174 | |
| Jerônimo Barbosa; Filipe Calegario; Verônica Teichrieb; Geber Ramalho; Patrick McGlynn | |||
| The authors propose the development of a more complete Digital Music
Instrument (DMI) evaluation methodology, which provides structured tools for
the incremental development of prototypes based on user feedback. This paper
emphasizes an important but often ignored stakeholder present in the context of
musical performance: the audience. We demonstrate the practical application of
an audience focused methodology through a case study ('Illusio'), discuss the
obtained results and possible improvements for future works. Keywords: Empirical methods, quantitative, usability testing and evaluation, digital
musical instruments, evaluation methodology, Illusio | |||
| Development and Evaluation of a ZigFlea-based Wireless Transceiver Board for CUI32 | | BIBAK | PDF | 205 | |
| Jim Torresen; Øyvind N. Hauback; Dan Overholt; Alexander Refsum Jensenius | |||
| We present a new wireless transceiver board for the CUI32 sensor interface,
aimed at creating a solution that is flexible, reliable, and low in power
consumption. Communication with the board is based on the ZigFlea protocol and
it has been evaluated on a CUI32 using the StickOS operating system.
Experiments show that the total sensor data collection time is linearly
increasing with the number of sensor samples used. A data rate of 0.8 kbit/s is
achieved for wirelessly transmitting three axes of a 3D accelerometer. Although
this data rate is low compared to other systems, our solution benefits from
ease-of-use and stability, and is useful for applications that are not
time-critical. Keywords: wireless sensing, CUI32, StickOS, ZigBee, ZigFlea | |||
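The reported 0.8 kbit/s figure invites a quick sanity check. The arithmetic below assumes 16-bit samples, which the abstract does not state, so the resulting frame rate is an estimate only.

```python
# Back-of-the-envelope reading of the reported 0.8 kbit/s accelerometer
# throughput (the 16-bit payload size is our assumption, not a detail from
# the paper): three axes at 16 bits per sample gives roughly 16-17 full
# accelerometer frames per second.
data_rate_bps = 800        # 0.8 kbit/s, as reported
axes = 3
bits_per_sample = 16       # assumed resolution
frames_per_second = data_rate_bps / (axes * bits_per_sample)
print(frames_per_second)   # -> ~16.7 Hz, plausible for non-time-critical use
```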
| Perfect Take: Experience design and new interfaces for musical expression | | BIBAK | PDF | 208 | |
| Nicolas Makelberge; Álvaro Barbosa; André Perrotta; Luís Sarmento Ferreira | |||
| "Perfect Take" is a public installation out of networked acoustic
instruments that let composers from all over the world exhibit their MIDI-works
by means of the Internet. The primary aim of this system is to offer composers
a way to have works exhibited and recorded in venues and with technologies not
accessible to him/her under normal circumstances. The Secondary aim of this
research is to highlight experience design as a complement to interaction
design, and a shift of focus from functionality of a specific gestural
controller, towards the environments, events and processes that they are part
of. Keywords: NIME, Networked Music, MIDI, Disklavier, music collaboration, creativity | |||
| A Customizable Sensate Surface for Music Control | | BIBAK | PDF | 201 | |
| Nan-Wei Gong; Nan Zhao; Joseph Paradiso | |||
| This paper describes a novel sensate surface for music control, which
enables the integration of any musical instrument with a versatile,
customizable, and cost-effective user interface. This sensate surface is based on
conductive inkjet printing technology which allows capacitive sensor electrodes
and connections between electronics components to be printed onto a large roll
of flexible substrate that is unrestricted in length. The high dynamic range
capacitive sensing electrodes can not only infer touch, but near-range,
non-contact gestural nuance in a music performance. With this sensate surface,
users can "cut" out their desired shapes, "paste" the number of inputs, and
customize their controller interface, which can then send signals wirelessly to
effects or software synthesizers. We seek to find a solution for integrating
the form factor of traditional music controllers seamlessly on top of one's
music instrument and meanwhile adding expressiveness to the music performance
by sensing and incorporating movements and gestures to manipulate the musical
output. We present an example of implementation on an electric ukulele and
provide several design examples to demonstrate the versatile capabilities of
this system. Keywords: Sensate surface, music controller skin, customizable controller surface,
flexible electronics | |||
| LOLbot: Machine Musicianship in Laptop Ensembles | | BIBAK | PDF | 119 | |
| Sidharth Subramanian; Jason Freeman; Scott McCoid | |||
| This paper describes a recent addition to LOLC, a text-based environment for
collaborative improvisation for laptop ensembles, incorporating a machine
musician that plays along with human performers. The machine musician LOLbot
analyses the patterns created by human performers and the composite music they
create as they are layered in performance. Based on user specified settings,
LOLbot chooses appropriate patterns to play with the ensemble, either to add
contrast to the existing performance or to be coherent with the rhythmic
structure of the performance. The paper describes the background and
motivations of the project, outlines the design of the original LOLC
environment and describes the architecture and implementation of LOLbot. Keywords: Machine Musicianship, Live Coding, Laptop Orchestra | |||
| Kugelschwung -- a Pendulum-based Musical Instrument | | BIBAK | PDF | 131 | |
| Jamie Henson; Benjamin Collins; Alexander Giles; Kathryn Webb; Matthew Livingston; Thomas Mortensson | |||
| This paper introduces the concept of Kugelschwung, a digital musical
instrument based around the use of pendulums and lasers to create
unique and highly interactive electronic ambient soundscapes. Here, we explore
the underlying design and physical construction of the instrument, as well as
its implementation and feasibility as an instrument in the real world. To
conclude, we outline potential expansions to the instrument, describing how its
range of applications can be extended to accommodate a variety of musical
styles. Keywords: laser, pendulums, instrument design, electronic, sampler, soundscape,
expressive performance | |||
| A Dimension Space for Evaluating Collaborative Musical Performance Systems | | BIBAK | PDF | 150 | |
| Ian Hattwick; Marcelo M. Wanderley | |||
| The configurability and networking abilities of digital musical instruments
increase the possibilities for collaboration in musical performances. Computer
music ensembles such as laptop orchestras are becoming increasingly common and
provide laboratories for the exploration of these possibilities. However, much
of the literature regarding the creation of DMIs has been focused on individual
expressivity, and their potential for collaborative performance has been
under-utilized. This paper makes the case for the benefits of an approach to
digital musical instrument design that begins with their collaborative
potential, examines several frameworks and sets of principles for the creation
of digital musical instruments, and proposes a dimension space representation
of collaborative approaches which can be used to evaluate and guide future DMI
creation. Several examples of DMIs and compositions are then evaluated and
discussed in the context of this dimension space. Keywords: dimension space, collaborative, digital musical instrument, dmi, digital
music ensemble, dme | |||
| Using a seeing/blindfolded paradigm to study audience experiences of live-electronic performances with voice | | BIBAK | PDF | 168 | |
| Andreas Bergsland; Tone Åse | |||
| As a part of the research project Voice Meetings, a solo live-electronic
vocal performance was presented for 63 students. Through a mixed method
approach applying both written and oral response, feedback from one blindfolded
and one seeing audience group was collected and analyzed. There were marked
differences between the groups regarding focus, in that the participants in the
blindfolded group tended to focus on fewer aspects, have a heightened focus and
be less distracted than the seeing group. The seeing group, for its part,
focused more on the technological instruments applied in the performance, the
performer herself and her actions. This study also shows that there were only
minor differences between the groups regarding the experience of skill and
control, and argues that this observation can be explained by earlier research
on skill in NIMEs. Keywords: Performance, audience reception, acousmatic listening, live-electronics,
voice, qualitative research | |||
| Exploring audio and tactile qualities of instrumentality with bowed string simulations | | BIBAK | PDF | 243 | |
| Olivier Tache; Stephen Sinclair; Jean-Loup Florens; Marcelo Wanderley | |||
| Force-feedback and physical modeling technologies now make it possible to
achieve the same kind of relationship with virtual instruments as with
acoustic instruments, but the design of such elaborate models needs guidelines based on the study of
the human sensory-motor system and behaviour. This article presents a
qualitative study of a simulated instrumental interaction in the case of the
virtual bowed string, using both waveguide and mass-interaction models.
Subjects were invited to explore the possibilities of the simulations and to
express themselves verbally at the same time, allowing us to identify key
qualities of the proposed systems that determine the construction of an
intimate and rich relationship with the users. Keywords: Instrumental interaction, presence, force-feedback, physical modeling,
simulation, haptics, bowed string | |||
| Optoelectronic Acquisition and Control Board for Musical Applications | | BIBAK | PDF | 228 | |
| Avrum Hollinger; Marcelo M. Wanderley | |||
| A modular and reconfigurable hardware platform for analog optoelectronic
signal acquisition is presented. Its intended application is fiber optic
sensing in electronic musical interfaces; however, the flexible design enables
its use with a wide range of analog and digital sensors. Multiple gain and
multiplexing stages as well as programmable analog and digital hardware blocks
allow for the acquisition, processing, and communication of single-ended and
differential signals. Along with a hub board, multiple acquisition boards can
be connected to modularly extend the system's capabilities to suit the needs of
the application. Fiber optic sensors and their application in DMIs are briefly
discussed, as well as the use of the hardware platform with specific musical
interfaces. Keywords: fiber optic sensing, analog signal acquisition, musical interface,
MRI-compatible | |||
| Bowing a vibration-enhanced force feedback device | | BIBAK | PDF | 37 | |
| Marcello Giordano; Stephen Sinclair; Marcelo M. Wanderley | |||
| Force-feedback devices can provide haptic feedback during interaction with
physical models for sound synthesis. However, low-end devices may not always
provide high-fidelity display of the acoustic characteristics of the model.
This article describes an enhanced handle for the Phantom Omni containing a
vibration actuator intended to display the high-frequency portion of the
synthesized forces. Measurements are provided to show that this approach
achieves a more faithful representation of the acoustic signal, overcoming
limitations in the device control and dynamics. Keywords: Haptics, force feedback, bowing, audio, interaction | |||
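As a rough illustration of the approach described above, the following sketch splits a synthesized force signal with a simple crossover, sending the low band to the force-feedback motors and the high band to a vibration actuator. The sample rate, cutoff frequency, and function names are assumptions for illustration, not values from the paper.

```python
# Sketch: split a synthesized force signal into a low band (force-feedback
# motors) and a high band (vibration actuator) with a Butterworth crossover.
import numpy as np
from scipy.signal import butter, lfilter

FS = 1000          # haptic update rate in Hz (assumed)
CUTOFF = 40.0      # crossover frequency in Hz (assumed)

b_lo, a_lo = butter(2, CUTOFF / (FS / 2), btype="low")
b_hi, a_hi = butter(2, CUTOFF / (FS / 2), btype="high")

def split_force(force: np.ndarray):
    """Return (motor_band, actuator_band) for a block of force samples."""
    motor_band = lfilter(b_lo, a_lo, force)      # slowly varying forces
    actuator_band = lfilter(b_hi, a_hi, force)   # acoustic-rate detail
    return motor_band, actuator_band

# Example: a slow bowing force plus a high-frequency stick-slip component.
t = np.arange(FS) / FS
force = 0.5 * np.sin(2 * np.pi * 2 * t) + 0.1 * np.sin(2 * np.pi * 220 * t)
low, high = split_force(force)
```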
| DIY Hybrid Analog/Digital Modular Synthesis | | BIBAK | PDF | 9 | |
| Greg Surges | |||
| This paper describes three hardware devices for integrating modular
synthesizers with computers, each with a different approach to the relationship
between hardware and software. The devices discussed are the USB-Octomod, an
8-channel OSC-compatible computer-controlled control-voltage generator, the
tabulaRasa, a hardware table-lookup oscillator synthesis module with
corresponding waveform design software, and the pucktronix.snake.corral, a dual
8x8 computer-controlled analog signal routing matrix. The devices make use of
open-source hardware and software, and are designed around affordable
micro-controllers and integrated circuits. Keywords: modular synthesis, interface, diy, open-source | |||
| Patchwerk: Multi-User Network Control of a Massive Modular Synthesizer | | BIBAK | PDF | 293 | |
| Brian Mayton; Gershon Dublon; Nicholas Joliat; Joseph A. Paradiso | |||
| We present Patchwerk, a networked synthesizer module with tightly coupled
web browser and tangible interfaces. Patchwerk connects to a pre-existing
modular synthesizer using the emerging cross-platform HTML5 WebSocket standard
to enable low-latency, high-bandwidth, concurrent control of analog signals by
multiple users. Online users control physical outputs on a custom-designed
cabinet that reflects their activity through a combination of motorized knobs
and LEDs, and streams the resultant audio. In a typical installation, a
composer creates a complex physical patch on the modular synth that exposes a
set of analog and digital parameters (knobs, buttons, toggles, and triggers) to
the web-enabled cabinet. Both physically present and online audiences can
control those parameters, simultaneously seeing and hearing the results of each
other's actions. By enabling collaborative interaction with a massive analog
synthesizer, Patchwerk brings a broad audience closer to a rare and
historically important instrument. Patchwerk is available online at
http://synth.media.mit.edu. Keywords: Modular synthesizer, HTML5, tangible interface, collaborative musical
instrument | |||
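A minimal sketch of what concurrent WebSocket control of a synthesizer parameter could look like from a client's side is given below. The endpoint URI and JSON message schema are invented for illustration; Patchwerk's actual protocol is not documented here.

```python
# Sketch: a concurrent-control client in the spirit of Patchwerk's
# HTML5 WebSocket interface. Endpoint and schema are hypothetical.
import asyncio
import json
import websockets  # pip install websockets

async def turn_knob(knob_id: int, value: float):
    # Hypothetical endpoint; the real URI and message format may differ.
    async with websockets.connect("ws://synth.example.org/patchwerk") as ws:
        await ws.send(json.dumps({"type": "knob", "id": knob_id,
                                  "value": max(0.0, min(1.0, value))}))
        ack = await ws.recv()      # server echoes state to all clients
        print("server state:", ack)

asyncio.run(turn_knob(3, 0.72))
```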
| The Emotion in Motion Experiment: Using an Interactive Installation as a Means for Understanding Emotional Response to Music | | BIBAK | PDF | 254 | |
| Javier Jaimovich; Miguel Ortiz; Niall Coghlan; R. Benjamin Knapp | |||
| In order to further understand our emotional reaction to music, a
museum-based installation was designed to collect physiological and self-report
data from people listening to music. This demo will describe the technical
implementation of this installation as a tool for collecting large samples of
data in public spaces. The Emotion in Motion terminal is built upon a standard
desktop computer running Max/MSP, with Arduino-connected sensors that measure
physiological indicators of emotion. The terminal has been installed in museums
and galleries in Europe and the USA, helping to create the largest database of
physiological and self-report data recorded while listening to music. Keywords: Biosignals, EDA, SC, GSR, HR, POX, Self-Report, Database, Physiological
Signals, Max/MSP, FTM, SAM, GEMS | |||
| Recontextualizing the Multi-touch Surface | | BIBAK | PDF | 132 | |
| Patrick McGlynn; Victor Lazzarini; Gordon Delap; Xiaoyu Chen | |||
| This paper contends that the development of expressive performance
interfaces using multi-touch technology has been hindered by an over-reliance
upon GUI paradigms. Although the technology offers rich and robust data output
and multiple ways to interpret it, approaches to using multi-touch in
digital musical instrument design have been markedly conservative, showing a
strong tendency towards modeling existing hardware. This not only negates many
of the benefits of multi-touch technology but also creates specific
difficulties in the context of live music performance. A case study of two
other interface types that have seen considerable musical use -- the XY pad and
button grid -- illustrates the manner in which the implicit characteristics of
a device determine the conditions under which it will perform favorably.
Accordingly, this paper proposes an alternative approach to multi-touch which
emphasizes the implicit strengths of the technology and establishes a
philosophy of design around them. Finally, we introduce two toolkits currently
being used to assess the validity of this approach. Keywords: Multi-touch, controllers, mapping, gesture, GUIs, physical interfaces,
perceptual & cognitive issues | |||
| TedStick: A Tangible Electrophonic Drumstick | | BIBAK | PDF | 96 | |
| Cory Levinson | |||
| TedStick is a new wireless musical instrument that processes acoustic sounds
resonating within its wooden body and manipulates them via gestural movements.
The sounds are transduced by a piezoelectric sensor inside the wooden body, so
any tactile contact with TedStick is transmitted as audio and further processed
by a computer. The main method for performing with TedStick focuses on
extracting diverse sounds from the resonant body of TedStick
itself. This is done by holding TedStick in one hand and a standard drumstick
in the opposite hand while tapping, rubbing, or scraping the two against each
other. Gestural movements of TedStick are then mapped to parameters for several
sound effects including pitch shift, delay, reverb and low/high pass filters.
Using this technique the hand holding the drumstick can control the acoustic
sounds/interaction between the sticks while the hand holding TedStick can focus
purely on controlling the sound manipulation and effects parameters. Keywords: tangible user interface, piezoelectric sensors, gestural performance,
digital sound manipulation | |||
| Approaches to Interaction in a Digital Music Ensemble | | BIBAK | PDF | 153 | |
| Ian Hattwick; Kojiro Umezaki | |||
| The Physical Computing Ensemble was created in order to determine the
viability of an approach to musical performance which focuses on the
relationships and interactions of the performers. Three performance systems
utilizing gestural controllers were designed and implemented, each with a
different strategy for performer interaction. These strategies took advantage
of the opportunities for collaborative performance inherent in digital musical
instruments due to their networking abilities and reconfigurable software.
These characteristics allow for the easy implementation of varying approaches
to collaborative performance. Ensembles that utilize digital musical instruments
provide a fertile environment for the design, testing, and utilization of
collaborative performance systems. The three strategies discussed in this paper
are the parameterization of musical elements, turn-based collaborative control
of sound, and the interaction of musical systems created by multiple
performers. Design principles, implementation, and a performance using these
strategies are discussed, and the conclusion is drawn that performer
interaction and collaboration as a primary focus for system design,
composition, and performance is viable. Keywords: Collaborative performance, interaction, digital musical instruments,
gestural controller, digital music ensemble, Wii | |||
| Two Shared Rapid Turn Taking Sound Interfaces for Novices | | BIBAK | PDF | 123 | |
| Anne-Marie Hansen; Hans Jørgen Andersen; Pirkko Raudaskoski | |||
| This paper presents the results of user interaction with two explorative
music environments (sound systems A and B) that were inspired by the Banda
Linda music tradition in two different ways. The sound systems adapted to how a
team of two players improvised and made a melody together in an interleaved
fashion: systems A and B used a fuzzy logic algorithm and pattern recognition
to respond with modifications of a background rhythm. In an experiment with a
pen tablet interface as the music instrument, users aged 10-13 were asked to tap
tones and continue each other's melody. The sound systems rewarded users
sonically, if they managed to add tones to their mutual melody in a rapid turn
taking manner with rhythmical patterns. Videos of experiment sessions show that
user teams contributed to a melody in ways that resemble conversation.
Interaction data show that each sound system made player teams play in
different ways, but players in general had a hard time adjusting to a
non-Western music tradition. The paper concludes with a comparison and
evaluation of the two sound systems. Finally it proposes a new approach to the
design of collaborative and shared music environments that is based on
"listening applications". Keywords: Music improvisation, novices, social learning, interaction studies,
interaction design | |||
| Mobile Controls On-The-Fly: An Abstraction for Distributed NIMEs | | BIBAK | PDF | 303 | |
| Charles Roberts; Graham Wakefield; Matt Wright | |||
| Designing mobile interfaces for computer-based musical performance is
generally a time-consuming task that can be exasperating for performers.
Instead of being able to experiment freely with physical interfaces'
affordances, performers must spend time and attention on non-musical tasks
including network configuration, development environments for the mobile
devices, defining OSC address spaces, and handling the receipt of OSC in the
environment that will control and produce sound. Our research seeks to overcome
such obstacles by minimizing the code needed to both generate and read the
output of interfaces on mobile devices. For iOS and Android devices, our
implementation extends the application Control to use a simple set of OSC
messages to define interfaces and automatically route output. On the desktop,
our implementations in Max/MSP/Jitter, LuaAV, and SuperCollider allow users to
create mobile widgets mapped to sonic parameters with a single line of code. We
believe the fluidity of our approach will encourage users to incorporate mobile
devices into their everyday performance practice. Keywords: NIME, OSC, Zeroconf, iOS, Android, Max/MSP/Jitter, LuaAV, SuperCollider,
Mobile | |||
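The sketch below illustrates the general pattern described above: a single OSC message defines a widget on the mobile device, and its output is routed back over OSC. It uses the python-osc library; the /control/addWidget address and its arguments are hypothetical stand-ins, not the documented message set of Control.

```python
# Sketch: define a mobile widget over OSC and read its output.
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

phone = SimpleUDPClient("192.168.1.50", 8080)   # address of the mobile device
# Ask the device to create a slider and route its output back to us.
# (Hypothetical message; the app's real address space may differ.)
phone.send_message("/control/addWidget", ["slider", "cutoff", 0.0, 1.0])

def on_cutoff(address, value):
    print(f"{address} -> {value}")   # map to a synth parameter here

dispatcher = Dispatcher()
dispatcher.map("/cutoff", on_cutoff)
BlockingOSCUDPServer(("0.0.0.0", 9000), dispatcher).serve_forever()
```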
| Musician Maker: Play expressive music without practice | | BIBAK | PDF | 36 | |
| John Buschert | |||
| Musician Maker is a system that allows novice players to create
expressive improvisational music. While the system plays an accompaniment
background chord progression, each participant plays some kind of controller to
make music through the system. The program takes the signals from the
controllers and adjusts the pitches somewhat so that the players are limited to
notes which fit the chord progression. The various controllers are designed to
be very easy and intuitive so anyone can pick one up and quickly be able to
play it. Since the computer is making sure that wrong notes are avoided, even
inexperienced players can immediately make music and enjoy focusing on some of
the more expressive elements and thus become musicians. Keywords: Musical Instrument, Electronic, Computer Music, Novice, Controller | |||
| FutureGrab: A wearable subtractive synthesizer using hand gesture | | BIBAK | PDF | 209 | |
| Yoonchang Han; Jinsoo Na; Kyogu Lee | |||
| FutureGrab is a new wearable musical instrument for live performance that is
highly intuitive while still generating an interesting sound by subtractive
synthesis. Its sound effects resemble human vowel pronunciation and are
mapped to hand gestures that mimic the mouth shapes used to pronounce the
corresponding vowels. FutureGrab also provides the features needed in a lead
instrument, such as pitch control, triggering, glissando and key adjustment.
In addition, a pitch indicator gives visual feedback to the performer, which
can reduce mistakes during live performances. This
paper describes the motivation, system design, mapping strategy and
implementation of FutureGrab, and evaluates the overall experience. Keywords: Wearable musical instrument, Pure Data, gestural synthesis, formant
synthesis, data-glove, visual feedback, subtractive synthesis | |||
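The vowel-like subtractive idea can be sketched as a sawtooth source filtered through two band-pass "formant" filters whose centre frequencies move with hand openness. The formant values below are textbook approximations and the openness mapping is an assumption, not FutureGrab's actual implementation.

```python
# Sketch: two-formant subtractive vowel tone driven by "hand openness".
import numpy as np
from scipy.signal import butter, lfilter, sawtooth

SR = 44100
VOWELS = {"i": (300.0, 2300.0), "a": (700.0, 1100.0)}  # (F1, F2) in Hz

def vowel_tone(openness: float, f0=110.0, dur=0.5):
    """openness 0.0 (closed hand, /i/) .. 1.0 (open hand, /a/)."""
    f1 = np.interp(openness, [0, 1], [VOWELS["i"][0], VOWELS["a"][0]])
    f2 = np.interp(openness, [0, 1], [VOWELS["i"][1], VOWELS["a"][1]])
    t = np.arange(int(SR * dur)) / SR
    src = sawtooth(2 * np.pi * f0 * t)          # harmonically rich source
    out = np.zeros_like(src)
    for fc in (f1, f2):                         # sum two formant bands
        b, a = butter(2, [fc * 0.8 / (SR / 2), fc * 1.2 / (SR / 2)],
                      "bandpass")
        out += lfilter(b, a, src)
    return out
```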
| Designing for Cumulative Interactivity: The _derivations System | | BIBAK | PDF | 292 | |
| Benjamin Carey | |||
| This paper presents the author's _derivations system, an interactive
performance system for a solo improvising instrumentalist. The system makes use
of a combination of real-time audio analysis, live sampling and spectral
re-synthesis to build a vocabulary of possible performative responses to live
instrumental input throughout an improvisatory performance. A form of timbral
matching is employed to form a link between the live performer and an expanding
database of musical materials. In addition, the system takes into account the
unique nature of the rehearsal/practice space in musical performance by
incorporating performer-configurable cumulative rehearsal databases into the
final design. This paper discusses the system in detail with reference
to related work in the field, making specific reference to the system's
interactive potential both inside and outside of a real-time performance
context. Keywords: Interactivity, performance systems, improvisation | |||
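One plausible reading of the timbral matching step is a nearest-neighbour search over spectral features, sketched below with MFCCs via librosa. The feature choice and distance metric are assumptions for illustration, not necessarily those used in _derivations.

```python
# Sketch: pick the stored sound whose spectral features are nearest
# to the live input (one reading of "timbral matching").
import numpy as np
import librosa

def features(audio: np.ndarray, sr: int) -> np.ndarray:
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)            # one summary vector per sound

def best_match(live: np.ndarray, database: list[np.ndarray], sr: int) -> int:
    target = features(live, sr)
    dists = [np.linalg.norm(features(d, sr) - target) for d in database]
    return int(np.argmin(dists))        # index of closest stored response
```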
| Crossole: A Gestural Interface for Composition, Improvisation and Performance using Kinect | | BIBAK | PDF | 185 | |
| Sertan Sentürk; Sang Won Lee; Avinash Sastry; Anosh Daruwalla; Gil Weinberg | |||
| Crossole, its name suggesting a crossword of sound, is a musical
meta-instrument in which the music is visualized as a set of virtual blocks
resembling a crossword puzzle, each block representing a chord in a
progression. With the aid of Kinect sensing technology, a performer
controls the music by manipulating the crossword blocks with hand movements.
The performer can build chords at the high level, traverse the blocks, step
down to the low level to control chord arpeggiations note by note, loop a
chord progression or map gestures to various processing algorithms to enhance
the timbral scenery. Keywords: Kinect, meta-instrument, chord progression, body gesture | |||
| From the Eyes to the Ears | | BIB | - | |
| Zacharias Vamvakousis | |||
| Designing Mappings for Musical Interfaces Using Preset Interpolation | | BIBAK | PDF | 159 | |
| Martin Marier | |||
| A new method for interpolating between presets is described. The
interpolation algorithm called Intersecting N-Spheres Interpolation is simple
to compute and its generalization to higher dimensions is straightforward. The
current implementation in the SuperCollider environment is presented as a tool
that eases the design of many-to-many mappings for musical interfaces. Examples
of its uses, including such mappings in conjunction with a musical interface
called the sponge, are given and discussed. Keywords: Mapping, Preset, Interpolation, Sponge, SuperCollider | |||
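For readers unfamiliar with preset interpolation, the sketch below shows the interface shape of such a mapping tool using plain inverse-distance weighting. Note that this is not Marier's Intersecting N-Spheres algorithm, only a simpler stand-in with the same inputs and outputs.

```python
# Sketch: many-to-many preset interpolation by inverse-distance weighting.
import numpy as np

def interpolate(cursor, points, presets, power=2.0, eps=1e-9):
    """cursor: (d,) control position; points: (n, d) preset locations;
    presets: (n, m) parameter vectors. Returns an (m,) blended vector."""
    d = np.linalg.norm(points - cursor, axis=1)
    if d.min() < eps:                       # exactly on a preset
        return presets[np.argmin(d)]
    w = 1.0 / d**power
    return (w[:, None] * presets).sum(axis=0) / w.sum()

# Two presets in a 2-D control space, each with three synth parameters.
pts = np.array([[0.0, 0.0], [1.0, 0.0]])
pre = np.array([[100.0, 0.2, 0.0], [400.0, 0.8, 1.0]])
print(interpolate(np.array([0.5, 0.0]), pts, pre))  # halfway blend
```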
| Concept Tahoe: Microphone Midi Control | | BIBAK | PDF | 202 | |
| Dan Moses Schlessinger | |||
| We have developed a prototype wireless microphone that provides vocalists
with control over their vocal effects directly from the body of the microphone.
A wireless microphone has been augmented with six momentary switches, one
fader, and three axes of motion and position sensors, all of which provide MIDI
output from the wireless receiver. The MIDI data is used to control external
vocal effects units such as live loopers, reverbs, distortion pedals, etc. The
goal was to provide dramatically increased expressive control to vocal
performances, and address some of the shortcomings of pedal-controlled effects.
The addition of gestural controls from the motion sensors opens up new
performance possibilities such as panning the voice simply by pointing the
microphone in one direction or another. The result is a hybrid
microphone-musical instrument that has received extremely positive responses
from vocalists in numerous informal workshops. Keywords: NIME, Sennheiser, Concept Tahoe, MIDI, control, microphone | |||
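The panning-by-pointing idea maps naturally onto MIDI CC 10 (pan). A minimal sketch using the mido library follows; the yaw range and port name are assumptions rather than details of the prototype.

```python
# Sketch: map a pointing angle from the microphone's motion sensors
# to MIDI CC 10 (pan) on an external effects unit.
import mido

out = mido.open_output("Vocal FX")          # name of your MIDI device

def yaw_to_pan(yaw_degrees: float) -> int:
    """Map a -90..+90 degree pointing angle to MIDI pan 0..127."""
    clamped = max(-90.0, min(90.0, yaw_degrees))
    return int(round((clamped + 90.0) / 180.0 * 127))

out.send(mido.Message("control_change", control=10, value=yaw_to_pan(45.0)))
```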
| The Deckle Project: A Sketch of Three Sensors | | BIBAK | PDF | 214 | |
| Hongchan Choi; John Granzow; Joel Sadler | |||
| The Deckle Group is an ensemble that designs, builds and performs on
electroacoustic drawing boards. These drawing surfaces are augmented with
Satellite CCRMA BeagleBoards and Arduinos.[1] Piezo microphones are used in
conjunction with other sensors to produce sounds that are coupled tightly to
mark-making gestures. Position tracking is achieved with infra-red object
tracking, conductive fabric and a magnetometer. Keywords: Deckle, BeagleBoard, Drawing, Sonification, Performance, Audiovisual,
Gestural Interface | |||
| Instant Instrument Anywhere: A Self-Contained Capacitive Synthesizer | | BIBAK | PDF | 223 | |
| David Gerhard; Brett Park | |||
| The Instant Instrument Anywhere (IIA) is a small device which can be
attached to any metal object to create an electronic instrument. The device
uses capacitive sensing to detect proximity of the player's body to the metal
object, and sound is generated through a surface transducer which can be
attached to any flat surface. Because the capacitive sensor can be any shape or
size, absolute capacitive thresholding is not possible since the baseline
capacitance will change. Instead, we use a differential-based moving sum
threshold which can rapidly adjust to changes in the environment or be
re-calibrated to a new metal object. We show that this dynamic threshold is
effective in rejecting environmental noise and rapidly adapting to new objects.
We also present details for constructing Instant Instruments Anywhere,
including the use of a smartphone as the synthesis engine and power supply. Keywords: Capacitive Sensing, Arduino | |||
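A differential moving-sum threshold of the kind described can be sketched as follows: track sample-to-sample differences and fire when their windowed sum exceeds a bound, so slow baseline drift cancels out while an approaching hand accumulates a run of same-sign differences. Window size and threshold are assumed values.

```python
# Sketch: dynamic touch detection from raw capacitance readings.
from collections import deque

class DynamicThreshold:
    def __init__(self, window=16, threshold=40.0):
        self.diffs = deque(maxlen=window)   # recent first differences
        self.prev = None
        self.threshold = threshold

    def update(self, reading: float) -> bool:
        """Feed one raw sensor reading; return True on a touch event."""
        if self.prev is not None:
            self.diffs.append(reading - self.prev)
        self.prev = reading
        # Baseline drift produces small diffs that cancel; an approaching
        # hand produces a run of same-sign diffs whose sum grows quickly.
        return sum(self.diffs) > self.threshold

detector = DynamicThreshold()
for r in [100, 101, 100, 120, 160, 210, 230]:   # simulated readings
    print(detector.update(r))                   # fires on the rapid rise
```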
| Node and Message Management with the JunctionBox Interaction Toolkit | | BIBAK | PDF | 299 | |
| Lawrence Fyfe; Adam Tindale; Sheelagh Carpendale | |||
| Message mapping between control interfaces and sound engines is an important
task that could benefit from tools that streamline development. A new Open
Sound Control (OSC) namespace called Nexus Data Exchange Format (NDEF)
streamlines message mapping by offering developers the ability to manage sound
engines as network nodes and to query those nodes for the messages in their OSC
address spaces. By using NDEF, developers will have an easier time managing
nodes and their messages, especially for scenarios in which a single
application or interface controls multiple sound engines. NDEF is currently
implemented in the JunctionBox interaction toolkit but could easily be
implemented in other toolkits. Keywords: OSC, namespace, interaction, node | |||
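The node/query pattern might look like the following from a controller's perspective, using python-osc. The /ndef/query and /ndef/reply addresses are invented placeholders; the actual NDEF namespace is not reproduced here.

```python
# Sketch: ask a sound-engine node which OSC messages it accepts.
import threading
from pythonosc.udp_client import SimpleUDPClient
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer

def on_reply(address, *messages):
    # The node answers with the messages in its OSC address space.
    print("node offers:", messages)

dispatcher = Dispatcher()
dispatcher.map("/ndef/reply", on_reply)       # placeholder address
server = BlockingOSCUDPServer(("0.0.0.0", 9001), dispatcher)
threading.Thread(target=server.serve_forever, daemon=True).start()

engine = SimpleUDPClient("192.168.1.60", 57120)   # a sound-engine node
engine.send_message("/ndef/query", 9001)          # ask it to reply to us
```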
| Empathetic Interactive Music Video Experience | | BIBAK | PDF | 179 | |
| Myunghee Lee; Youngsun Kim; Gerard Kim | |||
| Empatheater is a video playing system that is controlled by multimodal
interaction. As the video is played, the user must interact and emulate
predefined "events" for the video to continue on. The user is given the
illusion of playing an active role in the unraveling video content and can
empathize with the performer. In this paper, we report about user experiences
with Empatheater when applied to musical video contents. Keywords: Music video, Empathy, Interactive video, Musical event, Multimodal
interaction | |||
| The Fingerphone: a Case Study of Sustainable Instrument Redesign | | BIBAK | PDF | 264 | |
| Adrian Freed | |||
| The Fingerphone, a reworking of the Stylophone in conductive paper, is
presented as an example of new design approaches for sustainability and
playability of electronic musical instruments. Keywords: Stylophone, Conductive Paper, Pressure Sensing, Touch Sensing, Capacitive
Sensing, Plurifunctionality, Fingerphone, Sustainable Design | |||