| The Overtone Fiddle: an Actuated Acoustic Instrument | | BIBAK | PDF | 4-7 | |
| Dan Overholt | |||
| The Overtone Fiddle is a new violin-family instrument that incorporates
electronic sensors, integrated DSP, and physical actuation of the acoustic
body. An embedded tactile sound transducer creates extra vibrations in the body
of the Overtone Fiddle, allowing performer control and sensation via both
traditional violin techniques and extended playing techniques that
incorporate shared man/machine control of the resulting sound. A magnetic
pickup system is mounted to the end of the fiddle's fingerboard in order to
detect the signals from the vibrating strings, deliberately not capturing
vibrations from the full body of the instrument. This focused sensing approach
allows less restrained use of DSP-generated feedback signals, as there is very
little direct leakage from the actuator embedded in the body of the instrument
back to the pickup. Keywords: Actuated Musical Instruments, Hybrid Instruments, Active Acoustics,
Electronic Violin | |||
| A Low-Cost, Low-Latency Multi-Touch Table with Haptic Feedback for Musical Applications | | BIBAK | PDF | 8-13 | |
| Matthew Montag; Stefan Sullivan; Scott Dickey; Colby Leider | |||
| During the past decade, multi-touch surfaces have emerged as valuable tools
for collaboration, display, interaction, and musical expression. Unfortunately,
they tend to be costly and often suffer from two drawbacks for music
performance: (1) relatively high latency owing to their sensing mechanism, and
(2) lack of haptic feedback. We analyze the latency present in several current
multi-touch platforms, and we describe a new custom system that reduces latency
to an average of 30 ms while providing programmable haptic feedback to the
user. The paper concludes with a description of ongoing and future work. Keywords: multi-touch, haptics, frustrated total internal reflection, music
performance, music composition, latency, DIY | |||
| The Electromagnetically Sustained Rhodes Piano | | BIBAK | PDF | 14-17 | |
| Greg Shear; Matthew Wright | |||
| The Electromagnetically Sustained Rhodes Piano is an augmentation of the
original instrument with additional control over the amplitude envelope of
individual notes. This includes slow attacks and infinite sustain while
preserving the familiar spectral qualities of this classic electromechanical
piano. These additional parameters are controlled with aftertouch on the
existing keyboard, extending standard piano technique. Two sustain methods were
investigated, driving the actuator first with a pure sine wave, and second with
the output signal of the sensor. A special isolation method effectively
decouples the sensors from the actuators and tames unruly feedback in the
high-gain signal path. Keywords: Rhodes, keyboard, electromagnetic, sustain, augmented instrument, feedback,
aftertouch | |||
| Gamelan Elektrika: An Electronic Balinese Gamelan | | BIBAK | PDF | 18-23 | |
| Laurel S. Pardue; Andrew Boch; Matt Boch; Christine Southworth; Alex Rigopulos | |||
| This paper describes the motivation and construction of Gamelan Elektrika, a
new electronic gamelan modeled after a Balinese Gong Kebyar. The first of its
kind, Elektrika consists of seven instruments acting as MIDI controllers
accompanied by traditional percussion and played by 11 or more performers
following Balinese performance practice. Three main percussive instrument
designs were executed using a combination of force sensitive resistors, piezos,
and capacitive sensing. While the instrument interfaces are designed to play
interchangeably with the original, the sound and travel possibilities they
enable are tremendous. MIDI enables a massive new sound palette, with new scales
beyond the quirky traditional tuning as well as non-traditional sounds. It also allows
simplified transcription for an aurally taught tradition. Significantly, it
reduces the transportation challenges of a previously large and heavy ensemble,
creating opportunities for wider audiences to experience Gong Kebyar's
enchanting sound. True to the spirit of oneness in Balinese music, as one of
the first large all-MIDI ensembles, Elektrika challenges performers to trust
silent instruments and develop an understanding of highly intricate and
interlocking music not through the sound of the individual, but through the
sound of the whole. Keywords: Bali, gamelan, musical instrument design, MIDI ensemble | |||
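One way such a MIDI ensemble could realize scales outside equal temperament is per-note pitch bend; the Python sketch below illustrates the idea. The cent values, base note, and bend range are illustrative assumptions, not Elektrika's actual tuning tables.

```python
# Hypothetical sketch: approximating a non-equal-tempered scale over MIDI by
# pairing each note-on with a pitch-bend message. Cent values below are
# placeholders, not the tuning used by Gamelan Elektrika.
SCALE_CENTS = [0, 120, 270, 540, 670, 790, 1080]  # illustrative scale degrees
BASE_NOTE = 60          # arbitrary reference note (middle C)
BEND_RANGE_CENTS = 200  # assume a +/- 2 semitone pitch-bend range

def degree_to_midi(degree: int) -> tuple[int, int]:
    """Map a scale degree to (MIDI note, 14-bit pitch-bend value)."""
    octave, step = divmod(degree, len(SCALE_CENTS))
    cents = octave * 1200 + SCALE_CENTS[step]
    semis, remainder = divmod(cents, 100)
    if remainder > 50:  # bend down from the next semitone instead of up
        semis, remainder = semis + 1, remainder - 100
    bend = 8192 + round(remainder / BEND_RANGE_CENTS * 8192)
    return BASE_NOTE + semis, bend

for degree in range(7):
    print(degree_to_midi(degree))
```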
| Sonicstrument: A Musical Interface with Stereotypical Acoustic Transducers | | BIBAK | PDF | 24-27 | |
| Jeong-seob Lee; Woon Seung Yeo | |||
| This paper introduces Sonicstrument, a sound-based interface that traces the
user's hand motions. Sonicstrument utilizes stereotypical acoustic transducers
(i.e., a pair of earphones and a microphone) for transmission and reception of
acoustic signals whose frequencies lie at the very top of the human hearing
range and are thus barely perceptible to most listeners. Being simpler in structure
and easier to implement than typical ultrasonic motion detectors with special
transducers, this system is robust and offers precise results without
introducing any undesired sonic disturbance to users. We describe the design
and implementation of Sonicstrument, evaluate its performance, and present two
practical applications of the system in music and interactive performance. Keywords: Stereotypical transducers, audible sound, Doppler effect, hands-free
interface, musical instrument, interactive performance | |||
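The Doppler principle behind this kind of sensing can be sketched in a few lines: play a near-inaudible carrier through the earphones, record it with the microphone, and convert the observed frequency shift of the reflection into velocity. The carrier frequency, frame length, and sample rate below are assumptions for illustration, not the authors' settings.

```python
# Minimal Doppler-shift velocity estimate from one analysis frame.
import numpy as np

FS = 44100      # sample rate (Hz), assumed
F0 = 20000.0    # carrier near the top of the audible range (Hz), assumed
C = 343.0       # speed of sound (m/s)

def doppler_velocity(frame: np.ndarray) -> float:
    """Estimate reflector velocity (m/s) from the shift of the carrier."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / FS)
    band = (freqs > F0 - 500) & (freqs < F0 + 500)  # search near the carrier
    f_observed = freqs[band][np.argmax(spectrum[band])]
    return C * (f_observed - F0) / F0  # first-order Doppler relation

# Synthetic test: a carrier shifted by +58 Hz (about 1 m/s toward the mic).
t = np.arange(4096) / FS
print(round(doppler_velocity(np.sin(2 * np.pi * (F0 + 58) * t)), 2))
```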
| Solar Sound Arts: Creating Instruments and Devices Powered by Photovoltaic Technologies | | BIBAK | PDF | 28-31 | |
| Scott Smallwood | |||
| This paper describes recent developments in the creation of sound-making
instruments and devices powered by photovoltaic (PV) technologies. With the rise
of more efficient PV products in diverse packages, the possibilities for
creating solar-powered musical instruments, sound installations, and
loudspeakers are becoming increasingly realizable. This paper surveys past and
recent developments in this area, including several projects by the author, and
demonstrates how the use of PV technologies can influence the creative process
in unique ways. In addition, this paper discusses how solar sound arts can
enhance the aesthetic direction taken by recent work in soundscape studies and
acoustic ecology. Finally, this paper will point towards future directions and
possibilities as PV technologies continue to evolve and improve in terms of
performance, and become more affordable. Keywords: Solar Sound Arts, Circuit Bending, Hardware Hacking, Human-Computer
Interface Design, Acoustic Ecology, Sound Art, Electroacoustics, Laptop
Orchestra, PV Technology | |||
| An Approach to Collaborative Music Composition | | BIBAK | PDF | 32-35 | |
| Niklas Klügel; Marc René Frieß; Georg Groh; Florian Echtler | |||
| This paper discusses how the solely IT-based composition and performance of
electronic music can be supported in real time
with a collaborative application on a tabletop interface, mediating between
single-user style music composition tools and co-located collaborative music
improvisation. After having elaborated on the theoretical backgrounds of
prerequisites of co-located collaborative tabletop applications as well as the
common paradigms in music composition/notation, we will review related work on
novel IT approaches to music composition and improvisation. Subsequently, we
will present our prototypical implementation and the results. Keywords: Tabletop Interface, Collaborative Music Composition, Creativity Support | |||
| A Reference Architecture and Score Representation for Popular Music Human-Computer Music Performance Systems | | BIBAK | PDF | 36-39 | |
| Nicolas E. Gold; Roger B. Dannenberg | |||
| Popular music (characterized by improvised instrumental parts, beat and
measure-level organization, and steady tempo) poses challenges for
human-computer music performance (HCMP). Pieces of music are typically
rearrangeable on-the-fly and involve a high degree of variation from ensemble
to ensemble, and even between rehearsal and performance. Computer systems
aiming to participate in such ensembles must therefore cope with a dynamic
high-level structure in addition to the more traditional problems of
beat-tracking, score-following, and machine improvisation. There are many
approaches to integrating the components required to implement dynamic
human-computer music performance systems. This paper presents a reference
architecture designed to allow the typical sub-components (e.g. beat-tracking,
tempo prediction, improvisation) to be integrated in a consistent way, allowing
them to be combined and/or compared systematically. In addition, the paper
presents a dynamic score representation particularly suited to the demands of
popular music performance by computer. Keywords: Live Performance, Software Design, Popular Music | |||
| V'OCT (Ritual): An Interactive Vocal Work for Bodycoder System and 8 Channel Spatialization | | BIBAK | PDF | 40-43 | |
| Mark A. Bokowiec | |||
| V'OCT(Ritual) is a work for solo vocalist/performer and Bodycoder System,
composed in residency at Dartington College of Arts (UK) Easter 2010. This
paper looks at the technical and compositional methodologies used in the
realization of the work, in particular, the choices made with regard to the
mapping of sensor elements to various spatialization functions. Kinaesonics
will be discussed in relation to the coding of real-time one-to-one mapping of
sound to gesture and its expression in terms of hardware and software design.
Four forms of expressivity arising out of interactive work with the Bodycoder
system will be identified. How sonic (electro-acoustic), programmed, gestural
(kinaesonic), and vocal expressivities are constructed in V'Oct(Ritual) as
pragmatic and tangible elements within the compositional practice will be
discussed, and the consequent importance of collaboration with
a performer will be exposed. Keywords: Bodycoder, Kinaesonics, Expressivity, Gestural Control, Interactive
Performance Mechanisms, Collaboration | |||
| First Person Shooters as Collaborative Multiprocess Instruments | | BIBAK | PDF | 44-47 | |
| Florent Berthaut; Haruhiro Katayose; Hironori Wakama; Naoyuki Totani; Yuichi Sato | |||
| First Person Shooters are among the most played computer video games. They
combine navigation, interaction and collaboration in 3D virtual environments
using simple input devices, i.e. mouse and keyboard. In this paper, we study
the possibilities brought by these games for musical interaction. We present
the Couacs, a collaborative multiprocess instrument which relies on interaction
techniques used in FPS together with new techniques adding the expressiveness
required for musical interaction. In particular, the Faders For All game mode
allows musicians to perform pattern-based electronic compositions. Keywords: the couacs, fps, first person shooters, collaborative, 3D interaction,
multiprocess instrument | |||
| Studying Interdependencies in Music Performance: An Interactive Tool | | BIBAK | PDF | 48-51 | |
| Tilo Hähnel; Axel Berndt | |||
| Musicians tend to model different performance parameters intuitively and
listeners seem to perceive them, to a certain degree, unconsciously. This is a
problem for the development of synthetic performance models, for they are built
upon detailed assumptions about several parameters like timing, loudness, and
duration, and about their interdependencies as well. This paper describes an
interactive performance synthesis tool that allows listeners' preferences for
multiple performance features to be analysed. Using the tool in a study of
eighth-note inégalité, a relationship between timing and loudness was found. Keywords: Synthetic Performance, Notes Inégales, Timing, Articulation,
Duration, Loudness, Dynamics | |||
| 1city1001vibrations: Development of an Interactive Sound Installation with Robotic Instrument Performance | | BIBAK | PDF | 52-55 | |
| Sinan Bökesoy; Patrick Adler | |||
| "1city1001vibrations" is a sound installation project of Sinan Bökesoy.
It continuously analyzes live sounds captured by microphones installed at
significant places along the Bosphorus in Istanbul. The transmitted sounds are
accompanied by an algorithmic composition derived from this content analysis,
which controls two Kuka industrial robot arms playing the percussion
instruments installed around them, creating a metaphor through an intelligent
composition/performance system. This paper focuses on the programming
strategies taken for developing a musical instrument out of an industrial
robot. Keywords: Sound installation, robotic music, interactive systems | |||
| The Medium is the Message: Composing Instruments and Performing Mappings | | BIBAK | PDF | 56-59 | |
| Tim Murray-Browne; Di Mainstone; Nick Bryan-Kinns; Mark D. Plumbley | |||
| Many performers of novel musical instruments find it difficult to engage
audiences beyond those in the field. Previous research points to a failure to
balance complexity with usability, and a loss of transparency due to the
detachment of the controller and sound generator. The issue is often
exacerbated by an audience's lack of prior exposure to the instrument and its
workings. However, we argue that there is a conflict underlying many novel
musical instruments in that they are intended to be both a tool for creative
expression and a creative work of art in themselves, resulting in incompatible
requirements. By considering the instrument, the composition and the
performance together as a whole with careful consideration of the rate of
learning demanded of the audience, we propose that a lack of transparency can
become an asset rather than a hindrance. Our approach calls for not only
controller and sound generator to be designed in sympathy with each other, but
composition, performance and physical form too. Identifying three design
principles, we illustrate this approach with the Serendiptichord, a wearable
instrument for dancers created by the authors. Keywords: Performance, composed instrument, transparency, constraint | |||
| Clothesline as a Metaphor for a Musical Interface | | BIBAK | PDF | 60-63 | |
| Seunghun Kim; Luke K. Kim; Songhee Jeong; Woon Seung Yeo | |||
| In this paper, we discuss the use of the clothesline as a metaphor for
designing a musical interface called Airer Choir. This interactive installation
is based on the function of an ordinary object that is not a traditional
instrument, and hanging articles of clothing is literally the gesture used to
play the interface. Based on this metaphor, a musical interface with high
transparency was designed. Using the metaphor, we explored the possibilities
for recognizing input gestures and creating sonic events by mapping data to
sound. Thus, four different types of Airer Choir were developed. By classifying
the interfaces, we concluded that various musical expressions are possible by
using the same metaphor. Keywords: musical interface, metaphor, clothesline installation | |||
| EGGS in Action | | BIBAK | PDF | 64-67 | |
| Pietro Polotti; Maurizio Goina | |||
| In this paper, we discuss the results obtained by means of the EGGS
(Elementary Gestalts for Gesture Sonification) system in terms of artistic
realizations. EGGS was introduced in a previous edition of this conference. The
works presented include interactive installations in the form of public art and
interactive onstage performances. In all of the works, the EGGS principles of
simplicity based on the correspondence between elementary sonic and movement
units, and of organicity between sound and gesture are applied. Indeed, we
study both sound as a means for gesture representation and gesture as
embodiment of sound. These principles constitute our guidelines for the
investigation of the bidirectional relationship between sound and body
expression with various strategies involving both educated and non-educated
executors. Keywords: Gesture sonification, Interactive performance, Public art | |||
| A Reverberation Instrument Based on Perceptual Mapping | | BIBAK | PDF | 68-71 | |
| Berit Janssen | |||
| The present article describes a reverberation instrument which is based on
cognitive categorization of reverberating spaces. Different techniques for
artificial reverberation will be covered. A multidimensional scaling experiment
was conducted on impulse responses in order to determine how humans
acoustically perceive spatiality. This research seems to indicate that the
perceptual dimensions are related to early energy decay and timbral qualities.
These results are applied to a reverberation instrument based on delay lines.
It can be contended that such an instrument can be controlled more intuitively
than other delay line reverberation tools which often provide a confusing range
of parameters which have a physical rather than perceptual meaning. Keywords: Reverberation, perception, multidimensional scaling, mapping | |||
| Vibrotactile Feedback-Assisted Performance | | BIBAK | PDF | 72-75 | |
| Lauren Hayes | |||
| When performing digital music it is important to be able to acquire a
comparable level of sensitivity and control to what can be achieved with
acoustic instruments. By examining the links between sound and touch, new
compositional and performance strategies start to emerge for performers using
digital instruments. These involve technological implementations utilizing the
haptic information channels, offering insight into how our tacit knowledge of
the physical world can be introduced to the digital domain, reinforcing the view
that sound is a 'species of touch'. This document illustrates reasons why
vibrotactile interfaces, which offer physical feedback to the performer, may be
viewed as an important approach in addressing the limitations of current
physical dynamic systems used to mediate the digital performer's control of
various sorts of musical information. It will examine one such method used for
performing in two different settings: with piano and live electronics, and
laptop alone, where in both cases, feedback is artificially introduced to the
performer's hands offering different information about what is occurring
musically. The successes of this heuristic research will be assessed, along
with a discussion of future directions of experimentation. Keywords: Vibrotactile feedback, human-computer interfaces, digital composition,
real-time performance, augmented instruments | |||
| Improving User-Interface of Interactive EC for Composition-Aid by means of Shopping Basket Procedure | | BIBAK | PDF | 76-79 | |
| Daichi Ando | |||
| The use of Interactive Evolutionary Computation (IEC) is well suited to the
development of art-creation aid systems for beginners. This is because of
important features of IEC, such as the ability to optimize with ambiguous
evaluation measures and the absence of any requirement for special knowledge
about art creation. With the popularity of Consumer Generated Media, many
beginners in art creation are interested in creating their own original art
works. Thus, developing a useful IEC system for musical creation is an urgent
task. However, the user-assist functions for IEC proposed in past works
decrease the possibility of obtaining good unexpected results, which is an
important feature of art creation with IEC. In this paper, the author proposes
a new IEC evaluation process named the "Shopping Basket" procedure. In this
procedure, a user-assist function called Similarity-Based Reasoning allows for
natural evaluation by the user. The function reduces the user's burden without
reducing the possibility of unexpected results. The author performs an
experiment in which subjects use the new interface to validate it. Based on the
results, the author concludes that the new interface motivates users to
compose with the IEC system better than the old interface does. Keywords: Interactive Evolutionary Computation, User-Interface, Composition Aid | |||
| BioRhythm: a Biologically-inspired Audio-Visual Installation | | BIBAK | PDF | 80-83 | |
| Ryan Mcgee; Yuan-Yi Fan; Reza Ali | |||
| BioRhythm is an interactive bio-feedback installation controlled by the
cardiovascular system. Data from a photoplethysmograph (PPG) sensor controls
sonification and visualization parameters in real-time. Biological signals are
obtained using the techniques of Resonance Theory in Hemodynamics and mapped to
audiovisual cues via the Five Element Philosophy. The result is a new media
interface utilizing sound synthesis and spatialization with advanced graphics
rendering. BioRhythm serves as an artistic exploration of the harmonic spectra
of pulse waves. Keywords: bio-feedback, bio-sensing, sonification, spatial audio, spatialization, FM
synthesis, Open Sound Control, visualization, parallel computing | |||
| Vibration, Volts and Sonic Art: A Practice and Theory of Electromechanical Sound | | BIBAK | PDF | 84-87 | |
| Jon Pigott | |||
| This paper explores the creative appropriation of loudspeakers and other
electromechanical devices in sonic arts practice. It is proposed that this is
an identifiable area of sonic art worthy of its own historical and theoretical
account. A case study of an original work by the author titled Infinite Spring
is presented within a context of works by other practitioners from the 1960s
to the present day. The notion of the 'prepared speaker' is explored alongside
theories of media archeology, cracked media and acoustic ecology. Keywords: Electromechanical sonic art, kinetic sound art, prepared speakers, Infinite
Spring | |||
| Automatic Rhythmic Performance in Max/MSP: the kin.rhythmicator | | BIBAK | PDF | 88-91 | |
| George Sioros; Carlos Guedes | |||
| We introduce a novel algorithm for automatically generating rhythms in real
time in a certain meter. The generated rhythms are "generic" in the sense that
they are characteristic of each time signature without belonging to a specific
musical style. The algorithm is based on a stochastic model in which various
aspects and qualities of the generated rhythm can be controlled intuitively and
in real time. Such qualities are the density of the generated events per bar,
the amount of variation in generation, the amount of syncopation, the metrical
strength, and of course the meter itself. The kin.rhythmicator software
application was developed to implement this algorithm. During a performance
with the kin.rhythmicator the user can control all aspects of the performance
through descriptive and intuitive graphic controls. Keywords: automatic music generation, generative, stochastic, metric indispensability,
syncopation, Max/MSP, Max4Live | |||
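The stochastic core the abstract describes can be illustrated with a toy version: each pulse of the bar fires with a probability shaped by its metrical weight, scaled by a global density control. The weight table and mapping below are illustrative assumptions, not the kin.rhythmicator's actual indispensability computation.

```python
# Toy weighted-random rhythm generator for a 16-pulse 4/4 bar.
import random

WEIGHTS = [16, 1, 5, 2, 9, 3, 6, 4, 13, 1, 5, 2, 9, 3, 6, 4]  # illustrative

def generate_bar(density: float, seed: int | None = None) -> list[int]:
    """Return a 16-step onset pattern; density in [0, 1] scales event count."""
    rng = random.Random(seed)
    peak = max(WEIGHTS)
    return [1 if rng.random() < density * w / peak else 0 for w in WEIGHTS]

print(generate_bar(density=0.6, seed=42))
```

Raising density fills in weaker metrical positions proportionally; a syncopation control could be added by shifting probability mass from strong to weak pulses.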
| Towards a Voltage-Controlled Computer Control and Interaction Beyond an Embedded System | | BIBAK | PDF | 92-95 | |
| André Gonçalves | |||
| The importance of embedded devices to the field of voltage-controlled
synthesizers is examined, and the Voltage-Controlled Computer is introduced as
a new paradigm. Specifications for hardware interfacing and programming
techniques are described based on real prototypes, and implementations and
successful results are reported. Keywords: Voltage-controlled synthesizer, embedded systems, voltage-controlled
computer, computer driven control voltage generation | |||
| Polyhymnia: An Automatic Piano Performance System with Statistical Modeling of Polyphonic Expression and Musical Symbol Interpretation | | BIBAK | PDF | 96-99 | |
| Tae Hun Kim; Satoru Fukayama; Takuya Nishimoto; Shigeki Sagayama | |||
| We developed an automatic piano performance system called Polyhymnia that is
able to generate expressive polyphonic piano performances from music scores,
so that it can be used as a computer-based tool for expressive performance. The
system automatically renders expressive piano music by means of automatic
musical symbol interpretation and statistical models of structure-expression
relations regarding polyphonic features of piano performance. Experimental
results indicate that the generated performances of various piano pieces with
diverse trained models had polyphonic expression and sounded expressive. In
addition, the models trained with different performance styles reflected the
styles observed in the training performances, and they were well
distinguishable by human listeners. Polyhymnia won the first prize in the
autonomous section of the Performance Rendering Contest for Computer Systems
(Rencon) 2010. Keywords: performance rendering, polyphonic expression, statistical modeling,
conditional random fields | |||
| Multitouch Interface for Audio Mixing | | BIBAK | PDF | 100-103 | |
| Juan P. Carrascal; Sergi Jordà | |||
| Audio mixing is the adjustment of relative volumes, panning and other
parameters corresponding to different sound sources, in order to create a
technically and aesthetically adequate sound sum. To do this, audio engineers
employ "panpots" and faders, the standard controls in audio mixers. The design
of such devices has remained practically unchanged for decades since their
introduction. At the time, no usability studies seem to have been conducted on
such devices, so one could question if they are really optimized for the task
they are meant for. This paper proposes a new set of controls that might be
used to simplify and/or improve the performance of audio mixing tasks, taking
into account the spatial characteristics of modern mixing technologies such as
surround and 3D audio and making use of multitouch interface technologies. A
preliminary usability test has shown promising results. Keywords: audio mixing, multitouch, control surface, touchscreen | |||
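For context on what the panpot half of such an interface computes, the sketch below shows the standard constant-power stereo pan law that a touch position could drive; the paper's actual controls target richer surround and 3D placement.

```python
# Constant-power stereo panning: total power stays 1 across the pan range.
import math

def constant_power_pan(x: float) -> tuple[float, float]:
    """x in [0, 1] (left..right) -> (left_gain, right_gain)."""
    theta = x * math.pi / 2
    return math.cos(theta), math.sin(theta)

for x in (0.0, 0.5, 1.0):
    l, r = constant_power_pan(x)
    print(f"x={x:.1f}  L={l:.3f}  R={r:.3f}  power={l*l + r*r:.3f}")
```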
| Cognitive Architecture in Mobile Music Interactions | | BIBAK | PDF | 104-107 | |
| Nate Derbinsky; Georg Essl | |||
| This paper explores how a general cognitive architecture can pragmatically
facilitate the development and exploration of interactive music interfaces on a
mobile platform. To this end we integrated the Soar cognitive architecture into
the mobile music meta-environment urMus. We develop and demonstrate four
artificial agents which use diverse learning mechanisms within two mobile music
interfaces. We also include details of the computational performance of these
agents, evincing that the architecture can support real-time interactivity on
modern commodity hardware. Keywords: mobile music, machine learning, cognitive architecture | |||
| The Self-Supervising Machine | | BIBAK | PDF | 108-111 | |
| Benjamin D. Smith; Guy E. Garnett | |||
| Supervised machine learning enables complex many-to-many mappings and
control schemes needed in interactive performance systems. One of the
persistent problems in these applications is generating, identifying and
choosing input-output pairings for training. This poses problems of scope
(limiting the realm of potential control inputs), effort (requiring significant
pre-performance training time), and cognitive load (forcing the performer to
learn and remember the control areas). We discuss the creation and
implementation of an automatic "supervisor," using unsupervised machine
learning algorithms to train a supervised neural network on the fly. This
hierarchical arrangement enables network training in real time based on the
musical or gestural control inputs employed in a performance, aiming at freeing
the performer to operate in a creative, intuitive realm, making the machine
control transparent and automatic. Three implementations of this
self-supervised model, driven by iPod, iPad, and acoustic violin, are described. Keywords: NIME, machine learning, interactive computer music, machine listening,
improvisation, adaptive resonance theory | |||
| Beatscape, a Mixed Virtual-Physical Environment for Musical Ensembles | | BIBAK | PDF | 112-115 | |
| Aaron Albin; Sertan Sentürk; Akito Van Troyer; Brian Blosser; Oliver Jan; Gil Weinberg | |||
| A mixed media tool was created that promotes ensemble virtuosity through
tight coordination and interdependence in musical performance. Two different types
of performers interact with a virtual space using Wii remotes and tangible
interfaces based on the reacTIVision toolkit [11]. One group of performers uses a
tangible tabletop interface to place and move sound objects in a virtual
environment. The sound objects are represented by visual avatars and have audio
samples associated with them. A second set of performers make use of Wii
remotes to create triggering waves that can collide with those sound objects.
Sound is only produced upon collision of the waves with the sound objects. What
results is a performance in which users must negotiate through a physical and
virtual space and are positioned to work together to create musical pieces. Keywords: reacTIVision, processing, ensemble, mixed media, virtualization, tangible,
sample | |||
| MoodifierLive: Interactive and Collaborative Expressive Music Performance on Mobile Devices | | BIBAK | PDF | 116-119 | |
| Marco Fabiani; Gaël Dubus; Roberto Bresin | |||
| This paper presents MoodifierLive, a mobile phone application for
interactive control of rule-based automatic music performance. Five different
interaction modes are available, of which one allows for collaborative
performances with up to four participants, and two let the user control the
expressive performance using expressive hand gestures. Evaluations indicate
that the application is interesting, fun to use, and that the gesture modes,
especially the one based on data from free expressive gestures, allow for
performances whose emotional content matches that of the gesture that produced
them. Keywords: Expressive performance, gesture, collaborative performance, mobile phone | |||
| A Physically Based Sound Space for Procedural Agents | | BIBAK | PDF | 120-123 | |
| Benjamin Schroeder; Marc Ainger; Richard Parent | |||
| Physically based sound models have a "natural" setting in dimensional space:
a physical model has a shape and an extent and can be given a position relative
to other models. In our experimental system, we place procedurally animated
agents in a world of spatially situated physical models. The agents move in the
same space as the models and can interact with them, playing the models and
changing their configuration. The result is an ever-varying audiovisual
landscape. This can be seen as purely generative -- as a method for creating
algorithmic music -- or as a way to create instruments that change autonomously
as a human plays them. A third perspective is in between these two: agents and
humans can cooperate or compete to produce a gamelike or interactive
experience. Keywords: Physically based sound, behavioral animation, agents | |||
| Acquisition and Study of Blowing Pressure Profiles in Recorder Playing | | BIBAK | PDF | 124-127 | |
| Francisco García; Leny Vinceslas; Josep Tubau; Esteban Maestre | |||
| This paper presents a study of blowing pressure profiles acquired from
recorder playing. Blowing pressure signals are captured from real performance
by means of a low-intrusiveness acquisition system constructed around
commercial pressure sensors based on piezoelectric transducers. An alto
recorder was mechanically modified by a luthier to allow the measurement and
connection of sensors while preserving playability and minimizing intrusiveness. A
multi-modal database including aligned blowing pressure and sound signals is
constructed from real practice, covering the performance space by considering
different fundamental frequencies, dynamics, articulations and note durations.
Once signals were pre-processed and segmented, a set of temporal envelope
features were defined as a basis for studying and constructing a simplified
model of blowing pressure profiles in different performance contexts. Keywords: Instrumental gesture, recorder, wind instrument, blowing pressure,
multi-modal data | |||
| Experiences from Video-Controlled Sound Installations | | BIBAK | PDF | 128-131 | |
| Anders Friberg; Anna Källblad | |||
| This is an overview of the three installations Hoppsa Universum, CLOSE and
Flying Carpet. They were all designed as choreographed sound and music
installations controlled by the visitors' movements. The perspective is that
of an artistic goal/vision in combination with the technical challenges and
possibilities. All three installations were realized with video cameras in the
ceiling registering the users' position or movement. The video analysis then
controlled different types of interactive audio-player software.
Different aspects like narrativity, user control, and technical limitations are
discussed. Keywords: Gestures, dance, choreography, music installation, interactive music | |||
| ROOM #81 -- Agent-Based Instrument for Experiencing Architectural and Vocal Cues | | BIBAK | PDF | 132-135 | |
| Nicolas d'Alessandro; Roberto Calderon; Stefanie Müller | |||
| ROOM#81 is a digital art installation which explores how visitors can
interact with architectural and vocal cues to intimately collaborate. The main
space is split into two distinct areas separated by a soft wall, i.e. a large
piece of fabric tensed vertically. Movement within these spaces and interaction
with the soft wall is captured by various kinds of sensors. An agent
constantly uses people's activity to predict their actions, applying machine
learning to incrementally modify the nature of the light in the room and some
laryngeal aspects of synthesized vocal spasms. The
combination of people closely collaborating together, light changes and vocal
responses creates an intimate experience of touch, space and sound. Keywords: Installation, instrument, architecture, interactive fabric, motion, light,
voice synthesis, agent, collaboration | |||
| Kinetic Particles Synthesizer Using Multi-Touch Screen Interface of Mobile Devices | | BIBAK | PDF | 136-137 | |
| Yasuo Kuhara; Daiki Kobayashi | |||
| We developed a kinetic particles synthesizer for mobile devices with a
multi-touch screen, such as tablet PCs and smartphones. This synthesizer
generates music based on the kinetics of particles under a two-dimensional
physics engine. The particles move on the screen to synthesize sounds according
to their own physical properties, which are shape, size, mass, linear and
angular velocity, friction, restitution, etc. If a particle collides with
others, a percussive sound is generated. A player can play music by the simple
operation of touching or dragging on the screen of the device. Using a
three-axis acceleration sensor, a player can perform music by shuffling or
tilting the device. Each particle sounds just a simple tone; however, a large
number of varied particles play attractive music by aggregating their sounds.
This concept has been inspired by natural sounds made from an assembly of
simple components, for example, rustling leaves or falling rain. For a novice
who has no experience of playing a musical instrument, it is easy to learn how
to play instantly and enjoy performing music with intuitive operation. Our
system can serve as a musical instrument for interactive music entertainment. Keywords: Particle, Tablet PC, iPhone, iPod touch, iPad, Smart phone, Kinetics, Touch
screen, Physics engine | |||
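A hypothetical sketch of the collision-to-sound mapping such a system might use is given below: loudness follows the collision impulse and pitch falls with particle size. The formulas and constants are illustrative assumptions, not the authors' implementation.

```python
# Map a particle collision to a percussive event (amplitude, frequency).
def collision_to_note(mass_kg: float, rel_speed: float, radius_m: float):
    impulse = mass_kg * rel_speed        # simple momentum-based estimate
    amplitude = min(1.0, impulse / 0.5)  # clip at an assumed ceiling
    frequency = 2.2 / radius_m           # bigger particle -> lower pitch
    return amplitude, frequency

# A 20 g particle hitting at 3 m/s with a 1 cm radius -> (0.12, 220.0)
print(collision_to_note(mass_kg=0.02, rel_speed=3.0, radius_m=0.01))
```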
| The Sound Flinger: A Haptic Spatializer | | BIBAK | PDF | 138-139 | |
| Chris Carlson; Eli Marschner; Hunter Mccurry | |||
| The Sound Flinger is an interactive sound spatialization instrument that
allows users to touch and move sound. Users record audio loops from an mp3
player or other external source. By manipulating four motorized faders, users
can control the locations of two virtual "sound objects" around a circle
corresponding to the perimeter of a quadraphonic sound field. Physical models
that simulate a spring-like interaction between each fader and the virtual
sound objects generate haptic and aural feedback, allowing users to literally
touch, wiggle, and fling sound around the room. Keywords: NIME, CCRMA, haptics, force feedback, sound spatialization, multi-channel
audio, linux audio, jack, Arduino, BeagleBoard, Pure Data (Pd), Satellite CCRMA | |||
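The spring-like coupling the abstract mentions can be sketched as a damped spring between each fader and its virtual sound object; the same force value could drive the motorized fader for haptic feedback and modulate the audio. Constants here are illustrative assumptions.

```python
# Damped-spring force between a motorized fader and a virtual sound object.
K = 4.0  # spring stiffness (assumed units)
C = 0.3  # damping coefficient (assumed)

def fader_force(fader_pos: float, obj_pos: float, fader_vel: float) -> float:
    """Restoring force applied to the fader (and felt by the user)."""
    return -K * (fader_pos - obj_pos) - C * fader_vel

# Fader displaced 0.2 units from the object, at rest -> force of -0.8
print(fader_force(fader_pos=0.7, obj_pos=0.5, fader_vel=0.0))
```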
| Daft Datum -- An Interface for Producing Music Through Foot-based Interaction | | BIBAK | PDF | 140-141 | |
| Ravi Kondapalli; Ben-Zhen Sung | |||
| Daft Datum is an autonomous new media artefact that takes input from
movement of the feet (i.e. tapping/stomping/stamping) on a wooden surface,
underneath which is a sensor sheet. The sensors in the sheet are mapped to
various sound samples and synthesized sounds. Attributes of the synthesized
sound, such as pitch and octave, can be controlled using the Nintendo Wii
Remote. It also facilitates switching between modes of sound and
recording/playing back a segment of audio. The result is music generated by
dancing on the device that is further modulated by a hand-held controller. Keywords: Daft Datum, Wii, Dance Pad, Feet, Controller, Bluetooth, Musical Interface,
Dance, Sensor Sheet | |||
| Strike on Stage: a Percussion and Media Performance | | BIBAK | PDF | 142-143 | |
| Charles Martin; Chi-Hsia Lai | |||
| This paper describes Strike on Stage, an interface and corresponding
audio-visual performance work developed and performed in 2010 by percussionists
and media artists Chi-Hsia Lai and Charles Martin. The concept of Strike on
Stage is to integrate computer visuals and sound into an improvised percussion
performance. A large projection surface is positioned directly behind the
performers, while a computer vision system tracks their movements. The setup
allows computer visualisation and sonification to be directly responsive and
unified with the performers' gestures. Keywords: percussion, media performance, computer vision | |||
| Gestural Embodiment of Environmental Sounds: an Experimental Study | | BIBAK | PDF | 144-148 | |
| Baptiste Caramiaux; Patrick Susini; Tommaso Bianco; Frédéric Bevilacqua; Olivier Houix; Norbert Schnell; Nicolas Misdariis | |||
| In this paper we present an experimental study concerning gestural
embodiment of environmental sounds in a listening context. The presented work
is part of a project aiming at modeling movement-sound relationships, with the
end goal of proposing novel approaches for designing musical instruments and
sounding objects. The experiment is based on sound stimuli corresponding to
"causal" and "non-causal" sounds. It is divided into a performance phase and an
interview. The experiment is designed to investigate possible correlation
between the perception of the "causality" of environmental sounds and different
gesture strategies for the sound embodiment. In analogy with the perception of
the sounds' causality, we propose to distinguish gestures that "mimic" a
sound's cause and gestures that "trace" a sound's morphology following temporal
sound characteristics. Results from the interviews show that, first, our causal
sounds database leads to consistent descriptions of the action at the origin of
the sound, and participants mimic this action. Second, non-causal sounds lead to
inconsistent metaphoric descriptions of the sound, and participants make
gestures following sound "contours". Quantitatively, the results show that
gesture variability is higher for causal sounds than for non-causal sounds. Keywords: Embodiment, Environmental Sound Perception, Listening, Gesture Sound
Interaction | |||
| Listening to Your Brain: Implicit Interaction in Collaborative Music Performances | | BIBAK | PDF | 149-154 | |
| Sebastián Mealla; Aleksander Väljamäe; Mathieu Bosi; Sergi Jordà | |||
| The use of physiological signals in Human Computer Interaction (HCI) is
becoming popular and widespread, mostly due to sensors miniaturization and
advances in real-time processing. However, most of the studies that use
physiology-based interaction focus on single-user paradigms; its usage in
collaborative scenarios is still in its infancy. In this paper we explore how
interactive sonification of brain and heart signals, and its representation
through physical objects (physiopucks) in a tabletop interface may enhance
motivational and controlling aspects of music collaboration. A multimodal
system is presented, based on an electrophysiology sensor system and the
Reactable, a musical tabletop interface. Performance and motivation variables
were assessed in an experiment involving a test "Physio" group (N=22) and a
control "Placebo" group (N=10). Pairs of participants used two methods for
sound creation: implicit interaction through physiological signals, and
explicit interaction by means of gestural manipulation. The results showed that
pairs in the Physio Group declared less difficulty, higher confidence and more
symmetric control than the Placebo Group, in which no real-time sonification
was provided: subjects were unknowingly using pre-recorded physiological
signals. These results support the feasibility of introducing physiology-based
interaction in multimodal interfaces for collaborative music generation. Keywords: Music, Tabletops, Physiopucks, Physiological Computing, BCI, HCI,
Collaboration, CSCW, Multimodal Interfaces | |||
| Examining How Musicians Create Augmented Musical Instruments | | BIBAK | PDF | 155-160 | |
| Dan Newton; Mark T. Marshall | |||
| This paper examines the creation of augmented musical instruments by a
number of musicians. Equipped with a system called the Augmentalist, 10
musicians created new augmented instruments based on their traditional acoustic
or electric instruments. This paper discusses the ways in which the musicians
augmented their instruments, examines the similarities and differences between
the resulting instruments and presents a number of interesting findings
resulting from this process. Keywords: Augmented Instruments, Instrument Design, Digital Musical Instruments,
Performance | |||
| Tahakum: A Multi-Purpose Audio Control Framework | | BIBAK | PDF | 161-166 | |
| Zachary Seldess; Toshiro Yamada | |||
| We present "Tahakum", an open source, extensible collection of software
tools designed to enhance workflow on multichannel audio systems within complex
multi-functional research and development environments. Tahakum aims to provide
critical functionality required across a broad spectrum of audio systems usage
scenarios, while at the same time remaining sufficiently open as to easily
support modifications and extensions via 3rd party hardware and software.
Features provided in the framework include software for custom mixing/routing
and audio system preset automation, software for network message
routing/redirection and protocol conversion, and software for dynamic audio
asset management and control. Keywords: Audio Control Systems, Audio for VR, Max/MSP, Spatial Audio | |||
| A Framework for Coordination and Synchronization of Media | | BIBAK | PDF | 167-172 | |
| Dawen Liang; Guangyu Xia; Roger B. Dannenberg | |||
| Computer music systems that coordinate or interact with human musicians
exist in many forms. Often, coordination is at the level of gestures and
phrases without synchronization at the beat level (or perhaps the notion of
"beat" does not even exist). In music with beats, fine-grain synchronization
can be achieved by having humans adapt to the computer (e.g. following a click
track), or by computer accompaniment in which the computer follows a
predetermined score. We consider an alternative scenario in which improvisation
prevents traditional score following, but where synchronization is achieved at
the level of beats, measures, and cues. To explore this new type of
human-computer interaction, we have created new software abstractions for
synchronization and coordination of music and interfaces in different
modalities. We describe these new software structures, present examples, and
introduce the idea of music notation as an interactive musical interface rather
than a static document. Keywords: Real-time, Interactive, Music Display, Popular Music, Automatic
Accompaniment, Synchronization | |||
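As a minimal illustration of beat-level synchronization under a steady tempo, the sketch below implements a linear tempo map that converts between beats and seconds and predicts the next beat time; the paper's abstractions (measures, cues, live tempo updates) go well beyond this toy.

```python
# Linear tempo map: beat <-> time conversion and next-beat prediction.
import math

class TempoMap:
    def __init__(self, start_time: float, bpm: float):
        self.start_time = start_time
        self.spb = 60.0 / bpm  # seconds per beat

    def beat_to_time(self, beat: float) -> float:
        return self.start_time + beat * self.spb

    def time_to_beat(self, t: float) -> float:
        return (t - self.start_time) / self.spb

    def next_beat_time(self, now: float) -> float:
        return self.beat_to_time(math.ceil(self.time_to_beat(now)))

tm = TempoMap(start_time=0.0, bpm=120)  # 0.5 s per beat
print(tm.next_beat_time(1.23))          # -> 1.5
```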
| Satellite CCRMA: A Musical Interaction and Sound Synthesis Platform | | BIBAK | PDF | 173-178 | |
| Edgar Berdahl; Wendy Ju | |||
| This paper describes a new Beagle Board-based platform for teaching and
practicing interaction design for musical applications. The migration from
desktop and laptop computer-based sound synthesis to a compact and integrated
control, computation and sound generation platform has enormous potential to
widen the range of computer music instruments and installations that can be
designed, and improves the portability, autonomy, extensibility and longevity
of designed systems. We describe the technical features of the Satellite CCRMA
platform and contrast it with personal computer-based systems used in the past
as well as emerging smart phone-based platforms. The advantages and tradeoffs
of the new platform are considered, and some project work is described. Keywords: NIME, Microcontrollers, Music Controllers, Pedagogy, Texas Instruments OMAP,
Beagle Board, Arduino, PD, Linux, open-source | |||
| Two Turntables and a Mobile Phone | | BIBAK | PDF | 179-184 | |
| Nicholas J. Bryan; Ge Wang | |||
| A novel method of digital scratching is presented as an alternative to
currently available digital hardware interfaces and time-coded vinyl (TCV).
Similar to TCV, the proposed method leverages existing analog turntables as a
physical interface to manipulate the playback of digital audio. To do so,
however, an accelerometer/gyroscope-equipped smart phone is firmly attached to
a modified record, placed on a turntable, and used to sense a performer's
movement, resulting in a wireless sensing-based scratching method. The
accelerometer and gyroscope data is wirelessly transmitted to a computer to
manipulate the digital audio playback in real-time. The method provides the
benefit of digital audio and storage, requires minimal additional hardware,
accommodates familiar proprioceptive feedback, and allows a single interface to
control both digital and analog audio. In addition, the proposed method
provides numerous additional benefits including real-time graphical display,
multi-touch interaction, and untethered performance (e.g., "air-scratching").
Such a method turns a vinyl record into an interactive surface and enhances
traditional scratching performance by affording new and creative musical
interactions. Informal testing shows this approach to be viable, responsive, and
robust. Keywords: Digital scratching, mobile music, digital DJ, smartphone, turntable,
turntablism, record player, accelerometer, gyroscope, vinyl emulation software | |||
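The core mapping can be sketched simply: the phone's gyroscope reports angular velocity about the record's axis, and the playback-rate multiplier is its ratio to the platter's nominal speed. The values below are assumptions; the actual system also fuses accelerometer data and handles wireless transport.

```python
# Gyroscope angular velocity -> signed playback-rate multiplier.
import math

NOMINAL_RPM = 100.0 / 3.0                       # 33 1/3 rpm platter
NOMINAL_RAD_S = NOMINAL_RPM * 2 * math.pi / 60  # about 3.49 rad/s

def playback_rate(gyro_z_rad_s: float) -> float:
    """Negative values correspond to a reverse (backspin) scratch."""
    return gyro_z_rad_s / NOMINAL_RAD_S

print(round(playback_rate(NOMINAL_RAD_S), 2))  # steady platter -> 1.0
print(round(playback_rate(-7.0), 2))           # backspin -> about -2.0
```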
| MadPad: A Crowdsourcing System for Audiovisual Sampling | | BIBAK | PDF | 185-190 | |
| Nick Kruge; Ge Wang | |||
| MadPad is a networked audiovisual sample station for mobile devices. Twelve
short video clips are loaded onto the screen in a grid and playback is
triggered by tapping anywhere on the clip. This is similar to tapping the pads
of an audio sample station, but extends that interaction to add visual
sampling. Clips can be shot on-the-fly with a camera-enabled mobile device and
loaded into the player instantly, giving the performer an ability to quickly
transform his or her surroundings into a sample-based, audiovisual instrument.
Samples can also be sourced from an online community in which users can post or
download content. The recent ubiquity of multitouch mobile devices and advances
in pervasive computing have made this system possible, providing for a vast
amount of content only limited by the imagination of the performer and the
community. This paper presents the core features of MadPad and the design
explorations that inspired them. Keywords: mobile music, networked music, social music, audiovisual, sampling,
user-generated content, crowdsourcing, sample station, iPad, iPhone | |||
| The Visual in Mobile Music Performance | | BIBAK | PDF | 191-196 | |
| Patrick O. Keefe; Georg Essl | |||
| Visual information integration in mobile music performance is an area that
has not been thoroughly explored and current applications are often
individually designed. From camera input to flexible output rendering, we
discuss visual performance support in the context of urMus, a meta-environment
for mobile interaction and performance development. The use of cameras, a set
of image primitives, interactive visual content, projectors, and camera flashes
can lead to visually intriguing performance possibilities. Keywords: Mobile performance, visual interaction, camera phone, mobile collaboration | |||
| Designing for the iPad: Magic Fiddle | | BIBAK | PDF | 197-202 | |
| Ge Wang; Jieun Oh; Tom Lieber | |||
| This paper describes the origin, design, and implementation of Smule's Magic
Fiddle, an expressive musical instrument for the iPad. Magic Fiddle takes
advantage of the physical aspects of the device to integrate game-like and
pedagogical elements. We describe the origin of Magic Fiddle, chronicle its
design process, discuss its integrated music education system, and evaluate the
overall experience. Keywords: Magic Fiddle, iPad, physical interaction design, experiential design, music
education | |||
| MobileMuse: Integral Music Control Goes Mobile | | BIBAK | PDF | 203-206 | |
| Benjamin Knapp; Brennon Bortz | |||
| This paper describes a new interface for mobile music creation, the
MobileMuse, that introduces the capability of using physiological indicators of
emotion as a new mode of interaction. Combining both kinematic and
physiological measurement in a mobile environment creates the possibility of
integral music control (the use of both gesture and emotion to control sound
creation) where it has never been possible before. This paper will review the
concept of integral music control and describe the motivation for creating the
MobileMuse, its design and future possibilities. Keywords: Affective computing, physiological signal measurement, mobile music
performance | |||
| Tangible Performance Management of Grid-based Laptop Orchestras | | BIBAK | PDF | 207-210 | |
| Stephen D. Beck; Chris Branton; Sharath Maddineni | |||
| Laptop Orchestras (LOs) have recently become a very popular mode of musical
expression. They engage groups of performers to use ordinary laptop computers
as instruments and sound sources in the performance of specially created music
software. Perhaps the biggest challenge for LOs is the distribution, management
and control of software across heterogeneous collections of networked
computers. Software must be stored and distributed from a central repository,
but launched on individual laptops immediately before performance. The GRENDL
project leverages proven grid computing frameworks and approaches the Laptop
Orchestra as a distributed computing platform for interactive computer music.
This allows us to readily distribute software to each laptop in the orchestra
depending on the laptop's internal configuration, its role in the composition,
and the player assigned to that computer. Using the SAGA framework, GRENDL is
able to distribute software and manage system and application environments for
each composition. Our latest version includes tangible control of the GRENDL
environment for a more natural and familiar user experience. Keywords: laptop orchestra, tangible interaction, grid computing | |||
| Audio Arduino -- an ALSA (Advanced Linux Sound Architecture) Audio Driver for FTDI-based Arduinos | | BIBAK | PDF | 211-216 | |
| Smilen Dimitrov; Stefania Serafin | |||
| A contemporary PC user typically expects a sound card to be a piece of
hardware that can be manipulated by 'audio' software (most typically
exemplified by 'media players') and that allows interfacing of the PC to audio
reproduction and/or recording equipment. As such, a 'sound card' can be
considered to be a system, that encompasses design decisions on both hardware
and software levels that also demand a certain understanding of the
architecture of the target PC operating system. This project outlines how an
Arduino Duemilanove board (containing a USB interface chip manufactured by
Future Technology Devices International Ltd [FTDI] company) can be demonstrated
to behave as a full-duplex, mono, 8-bit 44.1 kHz soundcard, through an
implementation of: a PC audio driver for ALSA (Advanced Linux Sound
Architecture); a matching program for the Arduino's ATmega microcontroller and
nothing more than headphones (and a couple of capacitors). The main
contribution of this paper is to bring a holistic perspective to the discussion
of soundcard implementation, by referring to open-source driver code,
microcontroller code and test methods, and to outline a complete
implementation of an open yet functional soundcard system. Keywords: Sound card, Arduino, audio, driver, ALSA, Linux | |||
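A quick back-of-the-envelope check shows why the FTDI serial link can carry this audio format. The 2 Mbaud figure is an assumed (and typical) FTDI setting, not necessarily the one used in the paper.

```python
# Serial-bandwidth feasibility check for 8-bit mono audio at 44.1 kHz.
SAMPLE_RATE = 44100   # samples per second
BYTES_PER_SAMPLE = 1  # 8-bit mono
BAUD = 2_000_000      # assumed FTDI baud rate
BITS_PER_FRAME = 10   # 1 start + 8 data + 1 stop bit (8N1 framing)

audio_rate = SAMPLE_RATE * BYTES_PER_SAMPLE  # 44100 bytes/s per direction
link_rate = BAUD // BITS_PER_FRAME           # 200000 bytes/s
print(f"link utilisation per direction: {audio_rate / link_rate:.1%}")  # ~22%
```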
| Musical Control of a Pipe Based on Acoustic Resonance | | BIBAK | PDF | 217-219 | |
| Seunghun Kim; Woon Seung Yeo | |||
| In this paper, we introduce a pipe interface that recognizes touch on tone
holes by the resonances in the pipe rather than with touch sensors. This work
draws on the acoustic principles of woodwind instruments to develop a simple
and durable interface without complex sensors and electronic circuits. The
measured signals were analyzed to show that different fingerings generate
various sounds. The audible resonance signal in the pipe interface can be used
as a sonic event for musical expression by itself and also as an input
parameter for mapping different sounds. Keywords: resonance, mapping, pipe | |||
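A hypothetical sketch of resonance-based fingering detection follows: find the dominant spectral peak of the pipe's response and match it to the nearest expected resonance per fingering. The frequency table is illustrative, not measured from the authors' instrument.

```python
# Classify a fingering from the pipe's dominant resonance peak.
import numpy as np

FS = 44100  # sample rate (Hz), assumed
FINGERING_RESONANCES = {"all_closed": 262.0, "one_open": 294.0,
                        "two_open": 330.0, "all_open": 349.0}  # assumed (Hz)

def detect_fingering(signal: np.ndarray) -> str:
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / FS)
    peak = freqs[np.argmax(spectrum)]
    return min(FINGERING_RESONANCES,
               key=lambda name: abs(FINGERING_RESONANCES[name] - peak))

t = np.arange(8192) / FS
print(detect_fingering(np.sin(2 * np.pi * 294.0 * t)))  # -> "one_open"
```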
| Play Fluency in Music Improvisation Games for Novices | | BIBAK | PDF | 220-223 | |
| Anne-Marie S. Hansen; Hans J. Anderson; Pirkko Raudaskoski | |||
| In this paper a collaborative music game for two pen tablets is studied in
order to see how two people with no professional music background negotiated
musical improvisation. In an initial study of what it is that constitutes play
fluency in improvisation, a music game has been designed and evaluated through
video analysis: A qualitative view of mutual action describes the social
context of music improvisation: how two people with speech, laughter, gestures,
postures and pauses negotiate individual and joint action. The objective behind
the design of the game application was to support players in some aspects of
their mutual play. Results show that even though players activated additional
sound feedback as a result of their mutual play, players also engaged in forms
of mutual play that the game engine did not account for. These ways of mutual
play are described further along with some suggestions for how to direct future
designs of collaborative music improvisation games towards ways of mutual play. Keywords: Collaborative interfaces, improvisation, interactive music games, social
interaction, play, novice | |||
| The Bass Sleeve: A Real-time Multimedia Gestural Controller for Augmented Electric Bass Performance | | BIBAK | PDF | 224-227 | |
| Izzi Ramkissoon | |||
| The Bass Sleeve uses an Arduino board with a combination of buttons,
switches, flex sensors, force sensing resistors, and an accelerometer to map
the ancillary movements of a performer to sampling, real-time audio and video
processing including pitch shifting, delay, low pass filtering, and onscreen
video movement. The device was created to augment the existing functions of the
electric bass and explore the use of ancillary gestures to control the laptop
in a live performance. In this research it was found that incorporating
ancillary gestures into a live performance could be useful when controlling the
parameters of audio processing, sound synthesis and video manipulation. These
ancillary motions can be a practical solution to gestural multitasking allowing
independent control of computer music parameters while performing with the
electric bass. The process of performing with the Bass Sleeve resulted in a
greater amount of laptop control, an increase in the amount of expressiveness
using the electric bass in combination with the laptop, and an improvement in
the interactivity on both the electric bass and laptop during a live
performance. The design uses various gesture-to-sound mapping strategies to
accomplish a compositional task during an electroacoustic multimedia musical
performance piece. Keywords: Interactive Music, Interactive Performance Systems, Gesture Controllers,
Augmented Instruments, Electric Bass, Video Tracking | |||
| The KarmetiK NotomotoN: A New Breed of Musical Robot for Teaching and Performance | | BIBAK | PDF | 228-231 | |
| Ajay Kapur; Michael Darling; Jim Murphy; Jordan Hochenbaum; Dimitri Diakopoulos; Trimpin Trimpin | |||
| This paper describes the KarmetiK NotomotoN, a new musical robotic system
for performance and education. A long-time goal of the authors has been to
provide users with a plug-and-play, highly expressive musical robot system with a
high degree of portability. This paper describes the technical details of the
NotomotoN, and discusses its use in performance and educational scenarios.
Detailed tests performed to optimize technical aspects of the NotomotoN are
described to highlight usability and performance specifications for electronic
musicians and educators. Keywords: Musical Robotics, Music Technology, Robotic Performance, NotomotoN, KarmetiK | |||
| The Manipuller: Strings Manipulation and Multi-Dimensional Force Sensing | | BIBAK | PDF | 232-235 | |
| Adrián Barenca; Giuseppe Torre | |||
| The Manipuller is a novel Gestural Controller based on strings manipulation
and multi-dimensional force sensing technology. This paper describes its
motivation, design and operational principles along with some of its musical
applications. Finally the results of a preliminary usability test are presented
and discussed. Keywords: Gestural Controller, Strings, Manipulation, Force Sensing. | |||
| Mapping Objects with the Surface Editor | | BIBAK | PDF | 236-239 | |
| Alain Crevoisier; Cécile Picard-Limpens | |||
| The Surface Editor is a software tool for creating control interfaces and
mapping input actions to OSC or MIDI actions very easily and intuitively.
Originally conceived to be used with a tactile interface, the Surface Editor
has been extended to support the creation of graspable interfaces as well. This
paper presents a new framework for the generic mapping of user actions with
graspable objects on a surface. We also present a system for detecting touch on
thin objects, allowing for extended interactive possibilities. The Surface
Editor is not limited to a particular tracking system though, and the generic
mapping approach for objects can have a broader use with various input
interfaces supporting touch and/or objects. Keywords: NIME, mapping, interaction, user-defined interfaces, tangibles, graspable
interfaces | |||
| Adding Z-Depth and Pressure Expressivity to Tangible Tabletop Surfaces | | BIBAK | PDF | 240-243 | |
| Jordan Hochenbaum; Ajay Kapur | |||
| This paper presents the SmartFiducial, a wireless tangible object that
facilitates additional modes of expressivity for vision-based tabletop
surfaces. Using infrared proximity sensing and resistive force sensors,
the SmartFiducial affords users unique and highly gestural inputs.
Furthermore, the SmartFiducial incorporates additional customizable pushbutton
switches. Using XBee radio frequency (RF) wireless transmission, the
SmartFiducial establishes bidirectional communication with a host computer. This
paper describes the design and implementation of the SmartFiducial, as well as
an exploratory use in a musical context. Keywords: Fiducial, Tangible Interface, Multi-touch, Sensors, Gesture, Haptics,
Bricktable, Proximity Sensing | |||
| Hex Player -- A Virtual Musical Controller | | BIBAK | PDF | 244-247 | |
| Andrew J. Milne; Anna Xambó; Robin Laney; David B. Sharp; Anthony Prechtl; Simon Holland | |||
| In this paper, we describe a playable musical interface for tablets and
multi-touch tables. The interface is a generalized keyboard, inspired by the
Thummer, and consists of an array of virtual buttons. On a generalized
keyboard, any given interval always has the same shape (and therefore
fingering); furthermore, the fingering is consistent over a broad range of
tunings. Compared to a physical generalized keyboard, a virtual version has
some advantages; notably, the spatial location of the buttons can be
transformed by shears and rotations, and their colouring can be changed to
reflect their musical function in different scales. We exploit these
flexibilities to facilitate the playing not just of conventional Western scales
but also a wide variety of microtonal generalized diatonic scales known as
moment of symmetry, or well-formed, scales. A user can choose such a scale, and
the buttons are automatically arranged so their spatial height corresponds to
their pitch, and buttons an octave apart are always vertically above each
other. Furthermore, the most numerous scale steps run along rows, while buttons
within the scale are light-coloured, and those outside are dark or removed.
These features can aid beginners; for example, the chosen scale might be the
diatonic, in which case the piano's familiar white and black colouring of the
seven diatonic and five chromatic notes is used, but only one scale fingering
need ever be learned (unlike a piano where every key needs a different
fingering). Alternatively, it can assist advanced composers and musicians
seeking to explore the universe of unfamiliar microtonal scales. Keywords: generalized keyboard, isomorphic layout, multi-touch surface, tablet,
musical interface design, iPad, microtonality | |||
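The core property of the generalized keyboard described above, that every interval has a fixed shape regardless of tuning, amounts to pitch being a linear function of grid position. A minimal sketch; the generator values and coordinates below are illustrative, not the Hex Player's actual layout:

```python
def button_pitch(col, row, g1=697.0, g2=503.0):
    """Pitch in cents of the button at grid position (col, row).

    g1 and g2 are the two generator intervals (here a tempered fifth
    and fourth). Retuning changes g1/g2, but every interval keeps the
    same shape on the grid, so fingerings carry over between tunings.
    """
    return col * g1 + row * g2

# Root, fifth, octave and whole tone always sit at the same offsets:
for name, (c, r) in [("root", (0, 0)), ("fifth", (1, 0)),
                     ("octave", (1, 1)), ("whole tone", (1, -1))]:
    print(name, button_pitch(c, r))
```

The shears and rotations the abstract mentions re-orient this lattice, for example so that octave-related buttons end up vertically above one another.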
| Rhythm Performance from a Spectral Point of View | | BIBAK | PDF | 248-251 | |
| Carl H. Waadeland | |||
| Basic to both performance and experience of rhythm in music is a connection
between musical rhythm and patterns of body movements. A main focus in this
study is to investigate possible relations between movement categories and
rhythmic expression. An analytical approach to this task is to regard a
musician's various ways of moving when playing an instrument as an expression
of timbral aspects of rhythm, and to apply FFT to empirical data of the
musician's movements in order to detect spectral components that are
characteristic of the performance. In the present paper we exemplify this
approach by reporting some findings from empirical investigations of jazz
drummers' movements in performances of swing groove. In particular we show that
performances of the groove in three different tempi (60, 120, 300 bpm) yield
quite different spectral characteristics of the movements. This spectral
approach to rhythm performance might suggest alternative ways of constructing
syntheses and models of rhythm production, and could also be of interest for
the construction of interfaces based on detecting spectral properties of body
movements. Keywords: Rhythm performance, movement, gesture, spectral analysis, swing | |||
| Nuvolet: 3D Gesture-driven Collaborative Audio Mosaicing | | BIBAK | PDF | 252-255 | |
| Josep M. Comajuncosas; Alex Barrachina; John O'Connell; Enric Guaus | |||
| This research presents a 3D gestural interface for collaborative
concatenative sound synthesis and audio mosaicing. Our goal is to improve the
communication between the audience and performers by means of an enhanced
correlation between gestures and musical outcome. Nuvolet consists of a 3D
motion controller coupled to a concatenative synthesis engine. The interface
detects and tracks the performers' hands in four dimensions (x,y,z,t) and allows
them to concurrently explore two- or three-dimensional sound cloud
representations of the units from the sound corpus, as well as to perform
collaborative target-based audio mosaicing. Nuvolet is included in the Esmuc
Laptop Orchestra catalog for forthcoming performances. Keywords: concatenative synthesis, audio mosaicing, open-air interface, gestural
controller, musical instrument, 3D | |||
| Effective and Expressive Movements in a French-Canadian fiddler's Performance | | BIBAK | PDF | 256-259 | |
| Erwin Schoonderwaldt; Alexander R. Jensenius | |||
| We report on a performance study of a French-Canadian fiddler. The fiddling
tradition forms an interesting contrast to classical violin performance in
several ways. Distinguishing features include special elements in the bowing
technique and the presence of an accompanying foot clogging pattern. These two
characteristics are described, visualized and analyzed using video and motion
capture recordings as source material. Keywords: fiddler, violin, French-Canadian, bowing, feet, clogging, motion capture,
video, motiongram, kinematics, sonification | |||
| Flowspace -- A Hybrid Ecosystem | | BIBAK | PDF | 260-263 | |
| Daniel Bisig; Jan C. Schacher; Martin Neukom | |||
| In this paper an audio-visual installation is discussed, which combines
interactive, immersive and generative elements. After introducing some of the
challenges in the field of Generative Art and placing the work within its
research context, conceptual reflections are made about the spatial,
behavioural, perceptual and social issues that are raised within the entire
installation. A discussion about the artistic content follows, focussing on the
scenography and on working with flocking algorithms in general, before
addressing three specific pieces realised for the exhibition. Next the
technical implementation of both hardware and software is detailed, before the
idea of a hybrid ecosystem is discussed and further developments are outlined. Keywords: Generative Art, Interactive Environment, Immersive Installation, Swarm
Simulation, Hybrid Ecosystem | |||
| Implementing a Finite Difference-Based Real-time Sound Synthesizer using GPUs | | BIBAK | PDF | 264-267 | |
| Marc Sosnick; William Hsu | |||
| In this paper, we describe an implementation of a real-time sound
synthesizer using Finite Difference-based simulation of a two-dimensional
membrane. Finite Difference (FD) methods can be the basis for physics-based
music instrument models that generate realistic audio output. However, such
methods are compute-intensive; large simulations cannot run in real time on
current CPUs. Many current systems now include powerful Graphics Processing
Units (GPUs), which are a good fit for FD methods. We demonstrate that it is
possible to use this method to create a usable real-time audio synthesizer. Keywords: Finite Difference, GPU, CUDA, Synthesis | |||
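For reference, a minimal CPU-side sketch of an explicit finite-difference membrane of the kind the paper accelerates on the GPU. Grid size, Courant number, damping and pickup position are illustrative assumptions, not the authors' parameters:

```python
import numpy as np

N = 64                   # grid points per side
c = 0.5                  # Courant number; must stay below 1/sqrt(2) for stability
damping = 0.999

u_prev = np.zeros((N, N))
u = np.zeros((N, N))
u[N // 2, N // 2] = 1.0  # strike the membrane centre

samples = []
for _ in range(44100):   # one second of audio-rate output
    lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
           np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4.0 * u)
    u_next = damping * (2.0 * u - u_prev + (c ** 2) * lap)
    u_next[0, :] = u_next[-1, :] = u_next[:, 0] = u_next[:, -1] = 0.0  # clamped edges
    u_prev, u = u, u_next
    samples.append(u[N // 4, N // 4])   # read one "pickup" point per step

audio = np.asarray(samples)
```

On a GPU the same stencil becomes a per-grid-point kernel, which is what makes real-time operation at usable grid sizes feasible.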
| An Artificial Intelligence Architecture for Musical Expressiveness that Learns by Imitation | | BIBAK | PDF | 268-271 | |
| Axel Tidemann | |||
| Interacting with musical avatars has become increasingly popular over the
years, with the introduction of games like Guitar Hero and Rock Band. These
games provide MIDI-equipped controllers that look like their real-world
counterparts (e.g. MIDI guitar, MIDI drumkit) that the users play to control
their designated avatar in the game. The performance of the user is measured
against a score that needs to be followed. However, the avatar does not move in
response to how the user plays; it follows a predefined movement pattern. If
the user plays badly, the game ends with the avatar ending the performance
(i.e. throwing the guitar on the floor). The gaming experience would increase
if the avatar moved in accordance with user input. This paper presents an
architecture that couples musical input with body movement. Using imitation
learning, a simulated humanoid robot learns to play the drums as human drummers
do, both visually and aurally. Learning data is recorded using MIDI and motion
tracking. The system uses an artificial intelligence approach to implement
imitation learning, employing artificial neural networks. Keywords: Modeling Human Behaviour, Drumming, Artificial Intelligence | |||
| TweetDreams: Making Music with the Audience and the World using Real-time Twitter Data | | BIBAK | PDF | 272-275 | |
| Luke Dahl; Jorge Herrera; Carr Wilkerson | |||
| TweetDreams is an instrument and musical composition which creates real-time
sonification and visualization of tweets. Tweet data containing specified
search terms is retrieved from Twitter and used to build networks of associated
tweets. These networks govern the creation of melodies associated with each
tweet and are displayed graphically. Audience members participate in the piece
by tweeting, and their tweets are given special musical and visual prominence. Keywords: Twitter, audience participation, sonification, data visualization, text
processing, interaction, multi-user instrument | |||
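As an illustration of the kind of tweet-to-melody association described above, one way to derive a short, repeatable melody from tweet text is to hash it; the actual melody generation and tweet-network logic in TweetDreams are more elaborate and are not reproduced here:

```python
import hashlib

PENTATONIC = [60, 62, 65, 67, 69, 72]   # MIDI note numbers, C pentatonic

def tweet_melody(text, length=8):
    """Map tweet text to a fixed melody: same tweet, same tune."""
    digest = hashlib.sha256(text.encode("utf-8")).digest()
    return [PENTATONIC[b % len(PENTATONIC)] for b in digest[:length]]

print(tweet_melody("Hello #NIME2011!"))
```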
| JunctionBox: A Toolkit for Creating Multi-touch Sound Control Interfaces | | BIBAK | PDF | 276-279 | |
| Lawrence Fyfe; Adam Tindale; Sheelagh Carpendale | |||
| JunctionBox is a new software toolkit for creating multitouch interfaces for
controlling sound and music. More specifically, the toolkit has special
features which make it easy to create TUIO-based touch interfaces for
controlling sound engines via Open Sound Control. Programmers using the toolkit
have a great deal of freedom to create highly customized interfaces that work
on a variety of hardware. Keywords: Multi-touch, Open Sound Control, Toolkit, TUIO | |||
| Beyond Evaluation: Linking Practice and Theory in New Musical Interface Design | | BIBAK | PDF | 280-283 | |
| Andrew Johnston | |||
| This paper presents an approach to practice-based research in new musical
instrument design. At a high level, the process involves drawing on relevant
theories and aesthetic approaches to design new instruments, attempting to
identify relevant applied design criteria, and then examining the experiences
of performers who use the instruments with particular reference to these
criteria. Outcomes of this process include new instruments, theories relating
to musician-instrument interaction and a set of design criteria informed by
practice and research. Keywords: practice-based research, evaluation, Human-Computer Interaction, research
methods, user studies | |||
| Intuitive Real-Time Control of Spectral Model Synthesis | | BIBAK | PDF | 284-287 | |
| Phillip Popp; Matthew Wright | |||
| Several methods exist for manipulating spectral models either by applying
transformations via higher level features or by providing in-depth offline
editing capabilities. In contrast, our system aims for direct, full, intuitive,
real-time control without exposing any spectral model features to the user. The
system builds upon previous machine-learning work in gesture-synthesis mapping
by applying it to spectral models; these are a unique and interesting use case
in that they are capable of reproducing real world recordings, due to their
relatively high data rate and complex, intertwined and synergetic structure. To
achieve a direct and intuitive control of a spectral model, a method to extract
an individualized mapping between Wacom Pen parameters and Spectral Model
Synthesis frames is described and implemented as a standalone application. The
method works by capturing tablet parameters as the user pantomimes to a
synthesized spectral model. A transformation from Wacom Pen parameters to
gestures is obtained by extracting features from the pen and then transforming
those features using Principal Component Analysis. Then a linear model maps
between gestures and higher level features of the spectral model frames while a
k-nearest neighbor algorithm maps between gestures and normalized spectral
model frames. Keywords: Spectral Model Synthesis, Gesture Recognition, Synthesis Control, Wacom
Tablet, Machine Learning | |||
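A rough sketch of the PCA-plus-nearest-neighbour leg of the pipeline described above, using scikit-learn; feature extraction and the parallel linear model are omitted, and the shapes, component count and neighbour count are assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import KNeighborsRegressor

# Stand-in training data: pen feature vectors captured while pantomiming,
# paired with the spectral-model frames heard at the same moments.
pen_features = np.random.rand(500, 6)       # x, y, pressure, tilt-x, tilt-y, speed
spectral_frames = np.random.rand(500, 40)   # normalized frame parameters

pca = PCA(n_components=3).fit(pen_features)
gestures = pca.transform(pen_features)      # low-dimensional "gesture" space

knn = KNeighborsRegressor(n_neighbors=5).fit(gestures, spectral_frames)

# Performance time: new pen data in, nearby stored frames (averaged) out.
live_pen = np.random.rand(1, 6)
frame = knn.predict(pca.transform(live_pen))
```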
| BeatJockey: A New Tool for Enhancing DJ Skills | | BIBAK | PDF | 288-291 | |
| Pablo Molina; Martín Haro; Sergi Jordà | |||
| We present BeatJockey, a prototype interface which makes use of Audio
Mosaicing (AM), beat-tracking and machine learning techniques, for supporting
disc jockeys (DJs) by proposing new ways of interacting with the songs on
the DJ's playlist. This prototype introduces a new paradigm to DJing in which
the user has the capability to mix songs interacting with beat-units that
accompany the DJ's mix. For this type of interaction, the system suggests song
slices taken from songs selected from a playlist, which could go well with the
beats of whatever master song is being played. In addition the system allows
the synchronization of multiple songs, thus permitting flexible, coherent and
rapid progressions in the DJ's mix. BeatJockey uses the Reactable, a musical
tangible user interface (TUI), and it has been designed to be used by all DJs
regardless of their level of expertise, as the system helps the novice while
bringing new creative opportunities to the expert. Keywords: DJ, music information retrieval, audio mosaicing, percussion, turntable,
beat-mash, interactive music interfaces, realtime, tabletop interaction,
reactable | |||
| Traces -- Body, Motion and Sound | | BIBAK | PDF | 292-295 | |
| Jan C. Schacher; Angela Stoecklin | |||
| In this paper the relationship between body, motion and sound is addressed.
The comparison with traditional instruments and dance is shown with regard to
basic types of motion. The difference between gesture and movement is outlined
and some of the models used in dance for structuring motion sequences are
described. In order to identify expressive aspects of motion sequences a test
scenario is devised. After the description of the methods and tools used in a
series of measurements, two types of data display are shown and then applied in
the interpretation. One salient feature is recognized and put into perspective
with regard to movement and gestalt perception. Finally, the merits of the
technical means that were applied are compared and a model-based approach to
motion-sound mapping is proposed. Keywords: Interactive Dance, Motion and Gesture, Sonification, Motion Perception,
Mapping | |||
| MoodMixer: EEG-based Collaborative Sonification | | BIBAK | PDF | 296-299 | |
| Grace Leslie; Tim Mullen | |||
| MoodMixer is an interactive installation in which participants
collaboratively navigate a two-dimensional music space by manipulating their
cognitive state and conveying this state via wearable Electroencephalography
(EEG) technology. The participants can choose to actively manipulate or
passively convey their cognitive state depending on their desired approach and
experience level. A four-channel electronic music mixture continuously conveys
the participants' expressed cognitive states while a colored visualization of
their locations on a two-dimensional projection of cognitive state attributes
aids their navigation through the space. MoodMixer is a collaborative
experience that incorporates aspects of both passive and active EEG
sonification and performance art. We discuss the technical design of the
installation and place its collaborative sonification aesthetic design within
the context of existing EEG-based music and art. Keywords: EEG, BCMI, collaboration, sonification, visualization | |||
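The two-dimensional navigation described above can be realized as simply as bilinear crossfading between four stems. A sketch assuming two normalized EEG-derived indices per participant; the installation's actual indices and mixing law are not specified here:

```python
import numpy as np

def mix_weights(x, y):
    """Bilinear weights for four channels at the corners of a unit square.

    x and y are two normalized cognitive-state indices in [0, 1]
    (e.g. relaxation and focus; the choice of indices is an assumption).
    """
    return np.array([(1 - x) * (1 - y),   # corner channel 1
                     x * (1 - y),         # corner channel 2
                     (1 - x) * y,         # corner channel 3
                     x * y])              # corner channel 4

# Two participants: average their positions, then crossfade four stems.
p1, p2 = (0.2, 0.7), (0.9, 0.4)
x, y = np.mean([p1, p2], axis=0)
weights = mix_weights(x, y)               # sums to 1 by construction
```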
| OSC Implementation and Evaluation of the Xsens MVN Suit | | BIBA | PDF | 300-303 | |
| Ståle A. Skogstad; Yago de Quay; Alexander R. Jensenius | |||
| The paper presents research about implementing a full body inertial motion capture system, the Xsens MVN suit, for musical interaction. Three different approaches for streaming real time and prerecorded motion capture data with Open Sound Control have been implemented. Furthermore, we present technical performance details and our experience with the motion capture system in realistic practice. | |||
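A minimal example of the general approach of streaming motion-capture frames as OSC messages, using the python-osc package; the address pattern and joint data below are invented for illustration and are not the authors' scheme:

```python
from pythonosc.udp_client import SimpleUDPClient

# One OSC message per joint per frame, sent to a sound engine on port 9000.
client = SimpleUDPClient("127.0.0.1", 9000)

frame = {"pelvis": (0.0, 1.0, 0.1), "right_hand": (0.3, 1.4, 0.2)}
for joint, (x, y, z) in frame.items():
    client.send_message(f"/mocap/{joint}/position", [x, y, z])
```

In practice this loop would run at the capture frame rate, which is where the latency and bandwidth trade-offs the paper evaluates come in.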
| The Effect of Visualizing Audio Targets in a Musical Listening and Performance Task | | BIBAK | PDF | 304-307 | |
| Lonce Wyse; Norikazu Mitani; Suranga Nanayakkara | |||
| The goal of our research is to find ways of supporting and encouraging
musical behavior by non-musicians in shared public performance environments.
Previous studies indicated that simultaneous music listening and performance is
difficult for non-musicians, and that visual support for the task might be
helpful. This paper presents results from a preliminary user study conducted to
evaluate the effect of visual feedback on a musical tracking task. Participants
generated a musical signal by manipulating a hand-held device with two
dimensions of control over two parameters, pitch and density of note events,
and were given the task of following a target pattern as closely as possible.
The target pattern was a machine-generated musical signal comprising
variation over the same two parameters. Visual feedback provided participants
with information about the control parameters of the musical signal generated
by the machine. We measured the task performance under different visual
feedback strategies. Results show that single parameter visualizations tend to
improve the tracking performance with respect to the visualized parameter, but
not the non-visualized parameter. Visualizing two independent parameters
simultaneously decreases performance in both dimensions. Keywords: Mobile phone, Interactive music performance, Listening, Group music play,
Visual support | |||
| Composability for Musical Gesture Signal Processing using new OSC-based Object and Functional Programming Extensions to Max/MSP | | BIBAK | PDF | 308-311 | |
| Adrian Freed; John MacCallum; Andrew Schmeder | |||
| An effective programming style for gesture signal processing is described
using a new library that brings efficient run-time polymorphism, functional and
instance-based object-oriented programming to Max/MSP. By introducing better
support for generic programming and composability Max/MSP becomes a more
productive environment for managing the growing scale and complexity of gesture
sensing systems for musical instruments and interactive installations. Keywords: Composability, object, Open Sound Control, Gesture Signal Processing,
Max/MSP, Functional Programming, Object-Oriented Programming, Delegation | |||
| SoundSaber -- A Motion Capture Instrument | | BIBA | PDF | 312-315 | |
| Kristian Nymoen; Ståle A. Skogstad; Alexander R. Jensenius | |||
| The paper presents the SoundSaber, a musical instrument based on motion capture technology. We present technical details of the instrument and discuss the design and development process. The SoundSaber may be used as an example of how high-fidelity motion capture equipment can be used for prototyping musical instruments, and we illustrate this with an example of a low-cost implementation of our motion capture instrument. | |||
| A Modulation Matrix for Complex Parameter Sets | | BIBAK | PDF | 316-319 | |
| Øyvind Brandtsegg; Sigurd Saue; Thom Johansen | |||
| The article describes a flexible mapping technique realized as a
many-to-many dynamic mapping matrix. Digital sound generation is typically
controlled by a large number of parameters and efficient and flexible mapping
is necessary to provide expressive control over the instrument. The proposed
modulation matrix technique may be seen as a generic and self-modifying mapping
mechanism integrated in a dynamic interpolation scheme. It is implemented
efficiently by taking advantage of its inherent sparse matrix structure. The
modulation matrix is used within the Hadron Particle Synthesizer, a complex
granular module with 200 synthesis parameters and a simplified performance
control structure with 4 expression parameters. Keywords: Mapping, granular synthesis, modulation, live performance | |||
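The many-to-many mapping described above can be pictured as one sparse matrix-vector product per control block. A sketch with scipy; the sizes echo the Hadron figures quoted in the abstract, while the routings and gains are invented:

```python
import numpy as np
from scipy.sparse import csr_matrix

# 4 expression parameters modulating 200 synthesis parameters.
n_expr, n_synth = 4, 200
rows = [0, 0, 1, 2, 3]               # source expression parameter
cols = [10, 57, 10, 3, 199]          # destination synthesis parameter
gains = [0.8, -0.2, 0.5, 1.0, 0.3]   # modulation depths
M = csr_matrix((gains, (rows, cols)), shape=(n_expr, n_synth))

base = np.zeros(n_synth)             # preset parameter values
expr = np.array([0.7, 0.1, 0.0, 0.9])
synth_params = base + M.T @ expr     # one sparse mat-vec per control block
```

Letting entries of the matrix themselves be modulation destinations gives the self-modifying behaviour the abstract mentions.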
| Sound Low Fun | | BIBAK | PDF | 320-321 | |
| Yu-Chung Tseng; Che-Wei Liu; Tzu-Heng Chi; Hui-Yu Wang | |||
| Sound Low Fun, a large sphere, is an interactive sound installation. The
installation produces low-frequency sound ("Low Sound") intended to make people
feel relaxed, creating a "Fun" effect ("Fun" is also pronounced close to the
Chinese word "放", which likewise means relax); this is the main concern and
fundamental idea of the project. The sphere presents a sense of technology: its
surface follows the structure of the C60 molecule, divided into 32 blocks. For
the internal circuit design, we employed force sensors and an ADXL335
three-axis accelerometer connected to an Arduino I/O board with a Mux
(multiplexer) shield, so the sphere produces different music and lighting
effects through Max/MSP programming. Musically, we use a type of meditative,
long-sustained low-frequency sound, accompanied by transparent high-frequency
sounds when the sphere is shaken. When users press, hug or push the sphere,
soft low sounds and lighting effects are triggered, eventually relieving their
stress. Keywords: Large-scale, interactive installation, low-frequency sounds, stress relief,
Max/MSP computer music programming, Arduino | |||
| Autonomous New Media Artefacts (AutoNMA) | | BIBAK | PDF | 322-323 | |
| Edgar Berdahl; Chris Chafe | |||
| The purpose of this brief paper is to revisit the question of longevity in
present experimental practice and coin the term autonomous new media artefacts
(AutoNMA), which are complete and independent of external computer systems, so
they can be operable for a longer period of time and can be demonstrated at a
moment's notice. We argue that platforms for prototyping should promote the
creation of AutoNMA to make extant the devices which will be a part of the
future history of new media. Keywords: autonomous, standalone, Satellite CCRMA, Arduino | |||
| Creating Musical Expression using Kinect | | BIBAK | PDF | 324-325 | |
| Min-Joon Yoo; Jin-Wook Beak; In-Kwon Lee | |||
| Recently, Microsoft introduced a game interface called Kinect for the Xbox
360 video game platform. This interface enables users to control and interact
with the game console without the need to touch a controller. It greatly
increases the users' freedom to express their emotions. In this paper,
we first describe the system we developed to use this interface for sound
generation and controlling musical expression. The skeleton data are extracted
from users' motions and the data are translated to pre-defined MIDI data. We
then use the MIDI data to control several applications. To allow the
translation between the data, we implemented a simple Kinect-to-MIDI data
convertor, which is introduced in this paper. We describe two applications to
make music with Kinect: we first generate sound with Max/MSP, and then control
ad-lib generation with our own ad-lib generating system driven by the body
movements of the users. Keywords: Kinect, gaming interface, sound generation, adlib generation | |||
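A hedged sketch of the skeleton-to-MIDI translation step described above, using the mido library on the host (mido needs a backend such as python-rtmidi). Here get_skeleton-style input is replaced by a callback, and the joint-to-CC assignments are hypothetical stand-ins, not the authors' convertor:

```python
import mido

out = mido.open_output()   # first available / default MIDI output port

def to_cc(value, lo, hi):
    """Scale a joint coordinate in [lo, hi] metres to a 0-127 CC value."""
    v = (value - lo) / (hi - lo)
    return max(0, min(127, int(v * 127)))

def on_frame(skeleton):
    """Called once per tracking frame with a dict of joint positions."""
    # Right-hand height -> CC 1 (mod wheel), left-hand height -> CC 7 (volume)
    out.send(mido.Message('control_change', control=1,
                          value=to_cc(skeleton['right_hand'][1], 0.0, 2.0)))
    out.send(mido.Message('control_change', control=7,
                          value=to_cc(skeleton['left_hand'][1], 0.0, 2.0)))
```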
| Making Grains Tangible: Microtouch for Microsound | | BIBAK | PDF | 326-328 | |
| Staas de Jong | |||
| This paper proposes a new research direction for the large family of
instrumental musical interfaces where sound is generated using digital granular
synthesis, and where interaction and control involve the (fine) operation of
stiff, flat contact surfaces. First, within a historical context, a general
absence of, and clear need for, tangible output that is dynamically
instantiated by the grain-generating process itself is identified. Second, to
fill this gap, a concrete general approach is proposed based on the careful
construction of non-vibratory and vibratory force pulses, in a one-to-one
relationship with sonic grains. An informal pilot psychophysics experiment
initiating the approach was conducted, which took into account the two main
cases for applying forces to the human skin: perpendicular, and lateral.
Initial results indicate that the force pulse approach can enable perceivably
multidimensional, tangible display of the ongoing grain-generating process.
Moreover, it was found that this can be made to happen meaningfully (in real
time) on the same timescale as basic sonic grain generation. This is not a
trivial property, and it provides an important and positive foundation for
further developing this type of enhanced display. It also leads to the exciting
prospect of making arbitrary sonic grains actual physical manipulanda. Keywords: instrumental control, tangible display, tangible manipulation, granular
sound synthesis | |||
| Sound Selection by Gestures | | BIBAK | PDF | 329-330 | |
| Baptiste Caramiaux; Frédéric Bevilacqua; Norbert Schnell | |||
| This paper presents a prototypical tool for sound selection driven by users'
gestures. Sound selection by gestures is a particular case of "query by
content" in multimedia databases. Gesture-to-Sound matching is based on
computing the similarity between the temporal evolution of both gesture and
sound parameters. The tool presents three algorithms for matching a gesture
query to a sound target. The system leads to several applications in sound design, virtual
instrument design and interactive installation. Keywords: Query by Gesture, Time Series Analysis, Sonic Interaction | |||
| An Open Source Interface based on Biological Neural Networks for Interactive Music Performance | | BIBAK | PDF | 331-336 | |
| Hernán Kerlleñevich; Manuel C. Eguía; Pablo E. Riera | |||
| We propose and discuss an open source real-time interface that focuses on
the vast potential for interactive sound art creation emerging from biological
neural networks, as paradigmatic complex systems for musical exploration. In
particular, we focus on networks that are responsible for the generation of
rhythmic patterns. The interface relies upon the idea of relating
metaphorically neural behaviors to electronic and acoustic instruments notes,
by means of flexible mapping strategies. The user can intuitively design
network configurations by dynamically creating neurons and configuring their
inter-connectivity. The core of the system is based on events emerging from the
user's network design, which functions in a similar way to real small
neural networks. Having multiple signal and data inputs and outputs, as well as
standard communication protocols such as MIDI, OSC and TCP/IP, it becomes a
unique tool for composers and performers, suitable for different performance
scenarios, like live electronics, sound installations and telematic concerts. Keywords: rhythm generation, biological neural networks, complex patterns, musical
interface, network performance | |||
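To illustrate rhythm generation from a small spiking network of the kind described above: four leaky integrate-and-fire neurons in a ring, with every spike read as a note onset. The neuron model, weights, time constants and topology are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

n = 4
W = np.zeros((n, n))
for i in range(n):
    W[i, (i + 1) % n] = 1.2              # excitatory ring: neuron i drives i+1

v = np.zeros(n)                          # membrane potentials
leak, threshold = 0.95, 1.0
drive = np.array([0.06, 0.0, 0.0, 0.0])  # external input to neuron 0 only

events = []                              # (time step, neuron) -> note onsets
for t in range(400):
    spikes = v >= threshold
    events.extend((t, i) for i in np.flatnonzero(spikes))
    v[spikes] = 0.0                      # reset fired neurons
    v = leak * v + drive + W.T @ spikes.astype(float)

print(events[:8])                        # a repeating rhythmic pattern emerges
```

Mapping each (time, neuron) event to a different instrument or pitch gives the metaphorical neuron-to-note relationship the abstract describes.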
| Recognition Of Multivariate Temporal Musical Gestures Using N-Dimensional Dynamic Time Warping | | BIBAK | PDF | 337-342 | |
| Nicholas Gillian; Benjamin Knapp; Sile O'Modhrain | |||
| This paper presents a novel algorithm that has been specifically designed
for the recognition of multivariate temporal musical gestures. The algorithm is
based on Dynamic Time Warping and has been extended to classify any
N-dimensional signal, automatically compute a classification threshold to reject
any data that is not a valid gesture and be quickly trained with a low number
of training examples. The algorithm is evaluated using a database of 10
temporal gestures performed by 10 participants achieving an average
cross-validation result of 99%. Keywords: Dynamic Time Warping, Gesture Recognition, Musician-Computer Interaction,
Multivariate Temporal Gestures | |||
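For reference, plain multivariate DTW between two sequences of N-dimensional frames; the paper's contributions (automatic rejection thresholds, fast training from few examples) are not reproduced in this textbook version:

```python
import numpy as np

def dtw_distance(a, b):
    """DTW cost between sequences a (len_a x N) and b (len_b x N),
    using Euclidean distance between frames."""
    la, lb = len(a), len(b)
    D = np.full((la + 1, lb + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, la + 1):
        for j in range(1, lb + 1):
            cost = np.linalg.norm(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j],      # insertion
                                 D[i, j - 1],      # deletion
                                 D[i - 1, j - 1])  # match
    return D[la, lb]

# Classification: assign a query to the template with the smallest
# distance; a rejection threshold can be learned from within-class
# distances, as the paper proposes.
```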
| A Machine Learning Toolbox For Musician Computer Interaction | | BIBAK | PDF | 343-348 | |
| Nicholas Gillian; Benjamin Knapp; Sile O'Modhrain | |||
| This paper presents the SARC EyesWeb Catalog, (SEC), a machine learning
toolbox that has been specifically developed for musician-computer interaction.
The SEC features a large number of machine learning algorithms that can be used
in real-time to recognise static postures, perform regression and classify
multivariate temporal gestures. The algorithms within the toolbox have been
designed to work with any N-dimensional signal and can be quickly trained with
a small number of training examples. We also provide the motivation for the
algorithms used for the recognition of musical gestures to achieve a low
intra-personal generalisation error, as opposed to the inter-personal
generalisation error that is more common in other areas of human-computer
interaction. Keywords: Machine learning, gesture recognition, musician-computer interaction, SEC | |||
| Music and Technology in Death and the Powers | | BIBAK | PDF | 349-354 | |
| Elena Jessop; Peter A. Torpey; Benjamin Bloomberg | |||
| In composer Tod Machover's new opera Death and the Powers, the main
character uploads his consciousness into an elaborate computer system to
preserve his essence and agency after his corporeal death. Consequently, for
much of the opera, the stage and the environment itself come alive as the main
character. This creative need brings with it a host of technical challenges and
opportunities. In order to satisfy the needs of this storyline, Machover's
Opera of the Future group at the MIT Media Lab has developed a suite of new
performance technologies, including robot characters, interactive performance
capture systems, mapping systems for authoring interactive multimedia
performances, new musical instruments, unique spatialized sound controls, and a
unified control system for all these technological components. While developed
for a particular theatrical production, many of the concepts and design
procedures remain relevant to broader contexts including performance, robotics,
and interaction design. Keywords: opera, Death and the Powers, Tod Machover, gestural interfaces, Disembodied
Performance, ambisonics | |||
| Design and Evaluation of a Hybrid Reality Performance | | BIBAK | PDF | 355-360 | |
| Victor Zappi; Dario Mazzanti; Andrea Brogni; Darwin Caldwell | |||
| In this paper we introduce a multimodal platform for Hybrid Reality live
performances: by means of non-invasive Virtual Reality technology, we developed
a system to present artists and interactive virtual objects in audio/visual
choreographies on the same real stage. These choreographies could include
spectators too, providing them with the possibility to directly modify the
scene and its audio/visual features. We also introduce the first interactive
performance staged with this technology, in which an electronic musician played
five tracks live while manipulating the 3D projected visuals. Questionnaires
were distributed after the show, and in the last part of this work we discuss the
analysis of collected data, underlining positive and negative aspects of the
proposed experience. This paper belongs together with a performance proposal
called Dissonance, in which two performers exploit the platform to create a
progressive soundtrack along with the exploration of an interactive virtual
environment. Keywords: Interactive Performance, Hybrid Choreographies, Virtual Reality, Music
Control | |||
| InkSplorer: Exploring Musical Ideas on Paper and Computer | | BIBAK | PDF | 361-366 | |
| Jérémie Garcia; Theophanis Tsandilas; Carlos Agon; Wendy E. Mackay | |||
| We conducted three studies with contemporary music composers at IRCAM. We
found that even highly computer-literate composers use an iterative process
that begins with expressing musical ideas on paper, followed by active parallel
exploration on paper and in software, prior to final execution of their ideas
as an original score. We conducted a participatory design study that focused on
the creative exploration phase, to design tools that help composers better
integrate their paper-based and electronic activities. We then developed
InkSplorer as a technology probe that connects users' hand-written gestures on
paper to Max/MSP and OpenMusic. Composers appropriated InkSplorer according to
their preferred composition styles, emphasizing its ability to help them
quickly explore musical ideas on paper as they interact with the computer. We
conclude with recommendations for designing interactive paper tools that
support the creative process, letting users explore musical ideas both on paper
and electronically. Keywords: Composer, Creativity, Design Exploration, InkSplorer, Interactive Paper,
OpenMusic, Technology Probes | |||
| Battle of the DJs: an HCI Perspective of Traditional, Virtual, Hybrid and Multitouch DJing | | BIBAK | PDF | 367-372 | |
| Pedro Lopez; Alfredo Ferreira; J. A. Madeiras Pereira | |||
| The DJ culture uses a gesture lexicon strongly rooted in the traditional
setup of turntables and a mixer. As novel tools are introduced in the DJ
community, this lexicon is adapted to the features they provide. In particular,
multitouch technologies can offer a new syntax while still supporting the old
lexicon, which is desired by DJs. We present a classification of DJ tools, from
an interaction point of view, that divides the previous work into Traditional,
Virtual and Hybrid setups. Moreover, we present a multitouch tabletop
application, developed with a group of DJ consultants to ensure an adequate
implementation of the traditional gesture lexicon. To conclude, we conduct an
expert evaluation, with ten DJ users in which we compare the three DJ setups
with our prototype. The study revealed that our proposal suits expectations of
Club/Radio-DJs, but fails against the mental model of Scratch-DJs, due to the
lack of haptic feedback to represent the record's physical rotation.
Furthermore, tests show that our multitouch DJ setup reduces task duration
when compared with Virtual setups. Keywords: DJing, Multitouch Interaction, Expert User evaluation, HCI | |||
| Designing Digital Musical Interactions in Experimental Contexts | | BIBAK | PDF | 373-376 | |
| Adnan Marquez-Borbon; Michael Gurevich; A. Cavan Fyans; Paul Stapleton | |||
| As NIME's focus has expanded beyond the design reports which were pervasive
in the early days to include studies and experiments involving music control
devices, we report on a particular area of activity that has been overlooked:
designs of music devices in experimental contexts. We demonstrate this is
distinct from designing for artistic performances, with a unique set of novel
challenges. A survey of methodological approaches to experiments in NIME
reveals a tendency to rely on existing instruments or evaluations of new
devices designed for broader creative application. We present two examples from
our own studies that reveal the merits of designing purpose-built devices for
experimental contexts. Keywords: Experiment, Methodology, Instrument Design, DMIs | |||
| Crackle: A Dynamic Mobile Multitouch Topology for Exploratory Sound Interaction | | BIBAK | PDF | 377-380 | |
| Jonathan Reus | |||
| This paper describes the design of Crackle, an interactive sound and touch
experience inspired by the CrackleBox. We begin by describing a ruleset for
Crackle's interaction derived from the salient interactive qualities of the
CrackleBox. An implementation strategy is then described for realizing the
ruleset as an application for the iPhone. The paper goes on to consider the
potential of using Crackle as an encapsulated interaction paradigm for
exploring arbitrary sound spaces, and concludes with lessons learned on
designing for multitouch surfaces as expressive input sensors. Keywords: touchscreen, interface topology, mobile music, interaction paradigm, dynamic
mapping, CrackleBox, iPhone | |||
| A Principled Approach to Developing New Languages for Live Coding | | BIBAK | PDF | 381-386 | |
| Samuel Aaron; Alan Blackwell; Richard Hoadley; Tim Regan | |||
| This paper introduces Improcess, a novel cross-disciplinary collaborative
project focussed on the design and development of tools to structure the
communication between performer and musical process. We describe a 3-tiered
architecture centering around the notion of a Common Music Runtime, a shared
platform on top of which inter-operating client interfaces may be combined to
form new musical instruments. This approach allows hardware devices such as the
monome to act as an extended hardware interface with the same power to initiate
and control musical processes as a bespoke programming language. Finally, we
reflect on the structure of the collaborative project itself, which offers an
opportunity to discuss general research strategy for conducting highly
sophisticated technical research within a performing arts environment, such as
the development of a personal regime of preparation for performance. Keywords: Improvisation, live coding, controllers, monome, collaboration, concurrency,
abstractions | |||
| Integra Live: a New Graphical User Interface for Live Electronic Music | | BIBAK | PDF | 387-392 | |
| Jamie Bullock; Daniel Beattie; Jerome Turner | |||
| In this paper we describe a new application, Integra Live, designed to
address the problems associated with software usability in live electronic
music. We begin by outlining the primary usability and user-experience issues
relating to the predominance of graphical dataflow languages for the
composition and performance of live electronics. We then discuss the specific
development methodologies chosen to address these issues, and illustrate how
adopting a user-centred approach has resulted in a more usable and humane
interface design. The main components and workflows of the user interface are
discussed, giving a rationale for key design decisions. User testing processes
and results are presented. Finally, a critical evaluation of application usability
is given based on user-testing processes, with key findings presented for
future consideration. Keywords: software, live electronics, usability, user experience | |||
| Robust and Reliable Fabric, Piezoresistive Multitouch Sensing Surfaces for Musical Controllers | | BIBAK | PDF | 393-398 | |
| Jung-Sim Roh; Yotam Mann; Adrian Freed; David Wessel | |||
| The design space of fabric multitouch surface interaction is explored with
emphasis on novel materials and construction techniques aimed towards reliable,
repairable pressure sensing surfaces for musical applications. Keywords: Multitouch, surface interaction, piezoresistive, fabric sensor, e-textiles,
tangible computing, drum controller | |||
| Examining the Effects of Embedded Vibrotactile Feedback on the Feel of a Digital Musical Instrument | | BIBAK | PDF | 399-404 | |
| Mark T. Marshall; Marcelo M. Wanderley | |||
| This paper deals with the effects of integrated vibrotactile feedback on the
"feel" of a digital musical instrument (DMI). Building on previous work
developing a DMI with integrated vibrotactile feedback actuators, we discuss
how to produce instrument-like vibrations, compare these simulated vibrations
with those produced by an acoustic instrument and examine how the integration
of this feedback affects performer ratings of the instrument. We found that
integrated vibrotactile feedback resulted in an increase in performer
engagement with the instrument, but resulted in a reduction in the perceived
control of the instrument. We discuss these results and their implications for
the design of new digital musical instruments. Keywords: Vibrotactile Feedback, Digital Musical Instruments, Feel, Loudspeakers | |||
| HIDUINO: A firmware for building driverless USB-MIDI devices using the Arduino microcontroller | | BIBAK | PDF | 405-408 | |
| Dimitri Diakopoulos; Ajay Kapur | |||
| This paper presents a series of open-source firmwares for the latest
iteration of the popular Arduino microcontroller platform. A portmanteau of
Human Interface Device and Arduino, the HIDUINO project tackles a major problem
in designing NIMEs: easily and reliably communicating with a host computer
using standard MIDI over USB. HIDUINO was developed in conjunction with a class
at the California Institute of the Arts intended to teach introductory-level
human-computer and human-robot interaction within the context of musical
controllers. We describe our frustration with existing microcontroller
platforms and our experiences using the new firmware to facilitate the
development and prototyping of new music controllers. Keywords: Arduino, USB, HID, MIDI, HCI, controllers, microcontrollers | |||
| Latency Improvement in Sensor Wireless Transmission Using IEEE 802.15.4 | | BIBAK | PDF | 409-412 | |
| Emmanuel Fléty; Côme Maestracci | |||
| We present a strategy for the improvement of wireless sensor data
transmission latency, implemented in two current projects involving
gesture/control sound interaction. Our platform was designed to be capable of
accepting accessories using a digital bus. The receiver features an IEEE
802.15.4 microcontroller associated with a TCP/IP stack integrated circuit that
transmits the received wireless data to a host computer using the Open Sound
Control protocol. This paper details how we improved the latency and sample
rate of the said technology while keeping the device small and scalable. Keywords: Embedded sensors, gesture recognition, wireless, sound and music computing,
interaction, 802.15.4, ZigBee | |||
| Snyderphonics Manta Controller, a Novel USB Touch-Controller | | BIBAK | PDF | 413-416 | |
| Jeff Snyder | |||
| The Snyderphonics Manta controller is a USB touch controller for music and
video. It features 48 capacitive touch sensors, arranged in a hexagonal grid,
with bi-color LEDs that are programmable from the computer. The sensors send
continuous data proportional to surface area touched, and a velocity-detection
algorithm has been implemented to estimate attack velocity based on this touch
data. In addition to these hexagonal sensors, the Manta has two high-dimension
touch sliders (giving 12-bit values), and four assignable function buttons. In
this paper, I outline the features of the controller, the available methods for
communicating between the device and a computer, and some current uses for the
controller. Keywords: Snyderphonics, Manta, controller, USB, capacitive, touch, sensor, decoupled
LED, hexagon, grid, touch slider, HID, portable, wood, live music, live video | |||
| On Movement, Structure and Abstraction in Generative Audiovisual Improvisation | | BIBAK | PDF | 417-420 | |
| William Hsu | |||
| This paper overviews audiovisual performance systems that form the basis for
my recent collaborations with improvising musicians. Simulations of natural
processes, such as fluid dynamics and flocking, provide the foundations for
"organic"-looking movement and evolution of abstract visual components. In
addition, visual components can morph between abstract non-referential
configurations and pre-defined images, symbols or shapes. High-level behavioral
characteristics of the visual components are influenced by realtime gestural or
audio input; each system constitutes a responsive environment that
participating musicians interact with during a performance. Keywords: Improvisation, interactive, generative, animation, audio-visual | |||
| Creating Interactive Multimedia Works with Bio-data | | BIBAK | PDF | 421-424 | |
| Claudia R. Angel | |||
| This paper deals with the usage of bio-data from performers to create
interactive multimedia performances or installations. It presents this type of
research in some art works produced in the last fifty years (such as Lucier's
Music for a Solo Performance, from 1965), including two interactive
performances of my authorship, which use two different types of bio-interfaces:
on the one hand, an EMG (Electromyography) and on the other hand, an EEG
(electroencephalography). The paper explores the interaction between the human
body and real-time media (audio and visual) by the usage of bio-interfaces.
This research is based on biofeedback investigations pursued by the
psychologist Neal E. Miller in the 1960s, mainly aimed at finding new methods
to reduce stress. However, this article explains and shows examples in which
biofeedback research is used for artistic purposes only. Keywords: Live electronics, Butoh, performance, biofeedback, interactive sound and
video | |||
| TresnaNet Musical Generation based on Network Protocols | | BIBAK | PDF | 425-428 | |
| Paula Ustarroz | |||
| TresnaNet explores the potential of Telematics as a generator of musical
expressions. I aim to sonify the silent flow of information in the
network. This is realized through the fabrication of a prototype intended
to give substance to the intangible parameters of our
communication. The result may have educational, commercial and artistic
applications because it is a physical and perceptible representation of the
transfer of information over the network. This paper describes the design,
implementation and conclusions about TresnaNet. Keywords: Interface, musical generation, telematics, network, musical instrument,
network sniffer | |||
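A sketch of the general idea of turning sniffed packets into note events, using scapy (sniffing requires appropriate privileges); the size-to-pitch mapping below is invented and is not TresnaNet's actual scheme:

```python
from scapy.all import sniff, IP

def packet_to_note(pkt):
    """Map one captured packet to a printed note event."""
    size = len(pkt)
    pitch = 36 + size % 48                 # packet length picks the pitch
    proto = pkt[IP].proto if IP in pkt else 0
    print(f"note: pitch={pitch} velocity={64 + proto % 64}")

sniff(prn=packet_to_note, count=20)        # sonify the next 20 packets
```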
| Designing a Music Performance Space for Persons with Intellectual Learning Disabilities | | BIBAK | PDF | 429-432 | |
| Matti Luhtala; Tiina Kymäläinen; Johan Plomp | |||
| This paper outlines the design and development process of the 'DIYSE Music
Creation Tool' concept by presenting key questions, the methodology used, the
music instrument prototype development process and user research activities.
The aim of this research is to study how music therapists (or instructors) can
utilize novel technologies and study new performing opportunities in the music
therapy context, with people who have intellectual learning disabilities. The
research applies an action research approach to develop new music technologies
by co-designing with the music therapists, in order to develop in situ and
improve the adoption of novel technologies. The proof-of-concept software
utilizes Guitar Hero guitar controllers, and the software allows the music
therapist to personalize interaction mappings between the physical and digital
instrument components. By means of the guitars, the users are able to
participate in various musical activities; they are able to play prepared
musical compositions without extensive training, play together and perform for
others. User research studies included the evaluation of the tool and research
for performance opportunities. Keywords: Music interfaces, music therapy, modifiable interfaces, design tools,
Human-Technology Interaction (HTI), User-Centred Design (UCD), design for all
(DfA), prototyping, performance | |||
| Raja -- A Multidisciplinary Artistic Performance | | BIBAK | PDF | 433-436 | |
| Tom Ahola; Koray Tahiroglu; Teemu Ahmaniemi; Fabio Belloni; Ville Ranki | |||
| Motion-based interactive systems have long been utilized in contemporary
dance performances. These performances bring new insight to sound-action
experiences in multidisciplinary art forms. This paper discusses the related
technology within the framework of the dance piece Raja. The performance setup
of Raja makes it possible to use two complementary tracking systems and
two alternative choices of motion sensor in real-time audio-visual synthesis. Keywords: raja, performance, dance, motion sensor, accelerometer, gyro, positioning,
sonification, pure data, visualization, Qt | |||
| Eobody3: a Ready-to-use Pre-mapped & Multi-protocol Sensor Interface | | BIBAK | PDF | 437-440 | |
| Emmanuelle Gallin; Marc Sirguy | |||
| Away from the DIY world of Arduino programmers, Eowave has been developing
Eobody interfaces, a range of ready-to-use sensor interfaces designed for
meta-instruments, music control, and interactive installations. With Eobody3,
we wanted to create this missing link between the analogue and digital worlds,
make it possible to control analogue devices with a digital device and vice
versa: for example, to control a modular synthesizer with an iPad, with no
computer in between. With its compatibility with USB, MIDI, OSC, CV and DMX
protocols, Eobody3 is a two-way bridge between the analogue and digital
worlds. This paper describes the challenge of designing a ready-to-use,
pre-mapped, multi-protocol interface for all types of applications. Keywords: Controller, Sensor, MIDI, USB, Computer Music, OSC, CV, DMX, A/D
Converter, Interface | |||
| Eye Tapping: How to Beat Out an Accurate Rhythm using Eye Movements | | BIBAK | PDF | 441-444 | |
| Rasmus Bååth; Thomas Strandberg; Christian Balkenius | |||
| The aim of this study was to investigate how well subjects beat out a rhythm
using eye movements and to establish the most accurate method of doing this.
Eighteen subjects participated in an experiment where five different methods
were evaluated. A fixation based method was found to be the most accurate. All
subjects were able to synchronize their eye movements with a given beat but the
accuracy was much lower than usually found in finger tapping studies. Many
parts of the body are used to make music but so far, with a few exceptions, the
eyes have been silent. The research presented here provides guidelines for
implementing eye controlled musical interfaces. Such interfaces would enable
performers and artists to use eye movement for musical expression and would
open up new, exciting possibilities. Keywords: Rhythm, Eye tracking, Sensorimotor synchronization, Eye tapping | |||
| MelodyMorph: A Reconfigurable Musical Instrument | | BIBAK | PDF | 445-447 | |
| Eric Rosenbaum | |||
| I present MelodyMorph, a reconfigurable musical instrument designed with a
focus on melodic improvisation. It is designed for a touch-screen interface,
and allows the user to create "bells" which can be tapped to play a note, and
dragged around on a pannable and zoomable canvas. Colors, textures and shapes
of the bells represent pitch and timbre properties. "Recorder bells" can store
and play back performances. Users can construct instruments that are modifiable
as they play, and build up complex melodies hierarchically from simple parts. Keywords: Melody, improvisation, representation, multi-touch, iPad | |||
| The Flo)(ps: Negotiating Between Habitual and Explorative Gestures | | BIBAK | PDF | 448-452 | |
| Karmen Franinovic | |||
| The perceived affordances of an everyday object guide its user toward
habitual movements and experiences. Physical actions that are not immediately
associated with established body techniques often remain neglected. Can sound
activate those potentials for action that remain latent in the physicality of
an object? How can the exploration of underused and unusual bodily movements be
fostered? This paper presents the Flo)(ps project, a series of interactive
sounding glasses, which aim to foster social interaction by means of habitual
and explorative sonic gestures within everyday contexts. We discuss the design
process and the qualitative evaluation of collaborative and individual user
experience. The results show that social interaction and personal use require
different ways of transitioning from habitual to explorative gestures, and
point toward possible solutions to be further explored. Keywords: sonic interaction design, gesture, habit, exploration | |||
| Wekinating 000000Swan: Using Machine Learning to Create and Control Complex Artistic Systems | | BIBAK | PDF | 453-456 | |
| Margaret Schedel; Phoenix Perry; Rebecca Fiebrink | |||
| In this paper we discuss how the band 000000Swan uses machine learning to
parse complex sensor data and create intricate artistic systems for live
performance. Using the Wekinator software for interactive machine learning, we
have created discrete and continuous models for controlling audio and visual
environments using human gestures sensed by a commercially-available sensor bow
and the Microsoft Kinect. In particular, we have employed machine learning to
quickly and easily prototype complex relationships between performer gesture
and performative outcome. Keywords: Wekinator, K-Bow, Machine Learning, Interactive, Multimedia, Kinect,
Motion-Tracking, Bow Articulation, Animation | |||
| MTCF: A Framework for Designing and Coding Musical Tabletop Applications Directly in Pure Data | | BIBAK | PDF | 457-460 | |
| Carles F. Julià; Daniel Gallardo; Sergi Jordà | |||
| In the past decade we have seen a growing presence of tabletop systems
applied to music, lately with even some products becoming commercially
available and being used by professional musicians in concerts. The development
of this type of application requires demanding technical expertise in areas
such as input processing, graphical design, real time sound generation or
interaction design, and because of this complexity they are usually developed
by a multidisciplinary group. In this paper we present the Musical Tabletop
Coding Framework (MTCF) a framework for designing and coding musical tabletop
applications by using the graphical programming language for digital sound
processing Pure Data (Pd). With this framework we aim to simplify the creation
of this type of interface by removing the need for any programming
skills other than those of Pd. Keywords: Pure Data, tabletop, tangible, framework | |||
| Physical Modelling Enabling Enaction: an Example | | BIBAK | PDF | 461-464 | |
| David Pirrò; Gerhard Eckel | |||
| In this paper we present research which can be placed in the context of
performance-oriented computer music. Our research aims at finding new
strategies for the realization of enactive interfaces for performers. We
present an approach developed in experimental processes and we clarify it by
introducing a concrete example. Our method involves physical modelling as an
intermediate layer between bodily movement and sound synthesis. The historical
and technological context in which this research takes place is outlined. We
describe our approach and the hypotheses on which our investigations ground.
The technological frame in which our research took place is briefly described.
The piece cornerghostaxis#1 is presented as an example of this approach. The
observations made during the rehearsals and the performance of this piece are
outlined. Drawing on our own and the performers' experiences, we indicate the
most valuable qualities of this approach, sketch the direction our future
experimentation and development will take, pointing out the issues we will
concentrate on. Keywords: Interaction, Physical Modelling, Motion Tracking, Embodiment, Enactive
interfaces | |||
| SoundGrasp: A Gestural Interface for the Performance of Live Music | | BIBAK | PDF | 465-468 | |
| Thomas Mitchell; Imogen Heap | |||
| This paper documents the first developmental phase of an interface that
enables the performance of live music using gestures and body movements. The
work presented here focuses on the first step of this project: the composition
and performance of live music using hand gestures captured with a single data
glove. The paper provides a background to the field, the aim of the project and
a technical description of the work completed so far. This includes the
development of a robust posture vocabulary, an artificial neural network-based
posture identification process and a state-based system to map identified
postures onto a set of performance processes. The paper closes with qualitative
usage observations and a projection of future plans. Keywords: Music Controller, Gestural Music, Data Glove, Neural Network, Live Music
Composition, Looping, Imogen Heap | |||
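The posture identification step named in the abstract (an artificial neural network mapping glove sensor readings to a posture vocabulary) can be sketched minimally as below. This is not the authors' implementation: the sensor count, the three posture classes and the single-layer softmax network are illustrative assumptions.

```python
# Minimal sketch (not the SoundGrasp code): a tiny softmax network that
# classifies data-glove finger-bend readings into hand postures.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training data: 5 finger-bend sensors per sample,
# labels 0='open hand', 1='fist', 2='point'.
X = rng.random((60, 5))
y = np.array([0]*20 + [1]*20 + [2]*20)
X[20:40] += 1.0   # crude class separation, just for the demo
X[40:60] += 2.0

W = np.zeros((5, 3)); b = np.zeros(3)
onehot = np.eye(3)[y]

for _ in range(500):                      # batch gradient descent
    logits = X @ W + b
    p = np.exp(logits - logits.max(1, keepdims=True))
    p /= p.sum(1, keepdims=True)          # softmax probabilities
    grad = (p - onehot) / len(X)          # cross-entropy gradient
    W -= 0.5 * (X.T @ grad); b -= 0.5 * grad.sum(0)

def classify(bends):
    """Return the posture index for one frame of glove data."""
    return int(np.argmax(bends @ W + b))

print(classify(X[0]), classify(X[25]), classify(X[45]))  # -> 0 1 2
```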
| Minding the (Transatlantic) Gap: An Internet-Enabled Acoustic Brain-Computer Music Interface | | BIBAK | PDF | 469-472 | |
| Tim Mullen; Richard Warp; Adam Jansch | |||
| The use of non-invasive electroencephalography (EEG) in the experimental
arts is not a novel concept. Since 1965, EEG has been used in a large number
of, sometimes highly sophisticated, systems for musical and artistic
expression. However, since the advent of the synthesizer, most such systems
have utilized digital and/or synthesized media in sonifying the EEG signals.
There have been relatively few attempts to create interfaces for musical
expression that allow one to mechanically manipulate acoustic instruments by
modulating one's mental state. Furthermore, few such systems afford a distributed
performance medium, with data transfer and audience participation occurring
over the Internet. The use of acoustic instruments and Internet-enabled
communication expands the realm of possibilities for musical expression in
Brain-Computer Music Interfaces (BCMI), while also introducing additional
challenges. In this paper we report and examine a first demonstration (Music
for Online Performer) of a novel system for Internet-enabled manipulation of
robotic acoustic instruments, with feedback, using a non-invasive EEG-based BCI
and low-cost, commercially available robotics hardware. Keywords: EEG, Brain-Computer Music Interface, Internet, Arduino | |||
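As a rough illustration of the generic BCMI signal path the abstract implies, the sketch below estimates an EEG band power and maps it to an actuator command. The sample rate, band choice, spectral method and mapping are assumptions, not details of Music for Online Performer.

```python
# Hedged sketch of a generic EEG-to-actuator path: estimate relative alpha
# (8-12 Hz) power from one EEG window and map it to a strike velocity.
import numpy as np

FS = 256  # EEG sample rate in Hz (assumed)

def alpha_power(eeg_window):
    """Relative 8-12 Hz power of one EEG window via the FFT."""
    spectrum = np.abs(np.fft.rfft(eeg_window * np.hanning(len(eeg_window))))**2
    freqs = np.fft.rfftfreq(len(eeg_window), 1.0 / FS)
    band = spectrum[(freqs >= 8) & (freqs <= 12)].sum()
    return band / spectrum.sum()

def to_command(power, lo=0.05, hi=0.4):
    """Map relative alpha power to a robotic strike velocity 0..127
    (the linear mapping and thresholds are assumptions)."""
    level = (power - lo) / (hi - lo)
    return int(127 * min(1.0, max(0.0, level)))

window = np.sin(2 * np.pi * 10 * np.arange(FS) / FS)   # fake 10 Hz alpha
print(to_command(alpha_power(window)))                  # -> near 127
```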
| Rhythm'n'Shoes: a Wearable Foot Tapping Interface with Audio-Tactile Feedback | | BIBAK | PDF | 473-476 | |
| Stefano Papetti; Marco Civolani; Federico Fontana | |||
| A shoe-based interface is presented, which enables users to play percussive
virtual instruments by tapping their feet. The wearable interface consists of a
pair of sandals equipped with four force sensors and four actuators affording
audio-tactile feedback. The sensors provide data via wireless transmission to a
host computer, where they are processed and mapped to a physics-based sound
synthesis engine. Since the system provides OSC and MIDI compatibility,
alternative electronic instruments can be used as well. The audio signals are
then sent back wirelessly to audio-tactile exciters embedded in the sandals'
sole, and optionally to headphones and external loudspeakers. The round-trip
wireless communication introduces only very small latency, thus guaranteeing
coherence and unity in the multimodal percept and allowing tight timing while
playing. Keywords: interface, audio, tactile, foot tapping, embodiment, footwear, wireless,
wearable, mobile | |||
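The sensing-to-synthesis mapping described (force-sensor frames arriving wirelessly and mapped to a percussive engine over OSC) might look like the following host-side sketch. The OSC address, threshold and velocity mapping are assumptions; only the overall pipeline comes from the abstract.

```python
# Hedged sketch of a shoe-tap-to-OSC mapping; "/shoe/tap", the thresholds
# and the target port are assumptions, not the Rhythm'n'Shoes protocol.
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 57120)  # e.g. a synthesis engine

THRESHOLD = 0.15          # normalised force (0..1) that counts as a tap
prev = [0.0] * 4          # last reading of each of the four force sensors

def on_sensor_frame(forces):
    """Called for every wireless frame of four force readings."""
    global prev
    for i, (f0, f1) in enumerate(zip(prev, forces)):
        if f0 < THRESHOLD <= f1:                 # rising edge = tap onset
            velocity = min(1.0, f1 / 0.8)        # crude force-to-velocity map
            client.send_message("/shoe/tap", [i, velocity])
    prev = list(forces)
```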
| A Structured Design and Evaluation Model with Application to Rhythmic Interaction Displays | | BIBAK | PDF | 477-480 | |
| Cumhur Erkut; Antti Jylhä; Reha Discioglu | |||
| We present a generic, structured model for design and evaluation of musical
interfaces. This model is development oriented, and it is based on the
fundamental function of musical interfaces, i.e., to coordinate human
action and perception for musical expression, subject to human capabilities and
skills. To illustrate the particulars of this model and present it in
operation, we consider the previous design and evaluation phase of iPalmas, our
testbed for exploring rhythmic interaction. Our findings inform the current
design phase of the iPalmas visual and auditory displays, where we build on what
has resonated with the test users, and explore further possibilities based on
the evaluation results. Keywords: Rhythmic interaction, multimodal displays, sonification, UML | |||
| A Hair Ribbon Deflection Model for Low-intrusiveness Measurement of Bow Force in Violin Performance | | BIBAK | PDF | 481-486 | |
| Marco Marchini; Panos Papiotis; Alfonso Pérez; Esteban Maestre | |||
| This paper introduces and evaluates a novel methodology for the estimation
of bow pressing force in violin performance, aiming at a reduced intrusiveness
while maintaining high accuracy. The technique is based on using a simplified
physical model of the hair ribbon deflection, and feeding this model solely
with position and orientation measurements of the bow and violin spatial
coordinates. The physical model is both calibrated and evaluated using real
force data acquired by means of a load cell. Keywords: bow pressing force, bow force, pressing force, force, violin playing, bow
simplified physical model, 6DOF, hair ribbon ends, string ends | |||
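The core idea (infer bow pressing force from hair-ribbon deflection, using only position and orientation measurements) can be sketched with a much simpler model than the paper's calibrated one. Below, the hair is treated as a string under tension, so the effective stiffness grows toward the frog and the tip; all constants and the linear force law are illustrative assumptions.

```python
# Simplified sketch, not the paper's calibrated model: estimate bow force
# from how far the hair ribbon is deflected at the string-contact point.
import numpy as np

def hair_deflection(frog, tip, contact):
    """Perpendicular distance (m) from the string-contact point to the
    straight frog-tip line, computed from motion-capture coordinates."""
    axis = tip - frog
    rel = contact - frog
    proj = rel - axis * (rel @ axis) / (axis @ axis)
    return np.linalg.norm(proj)

def bow_force(deflection, position, k=60.0):
    """Estimate pressing force (N). A ribbon under tension is stiffer near
    its ends, so scale an assumed stiffness k by the contact position
    along the bow: position in (0,1), 0 = frog, 1 = tip."""
    stiffness = k / (position * (1.0 - position) + 1e-6)
    return stiffness * deflection

frog = np.array([0.0, 0.0, 0.0])
tip = np.array([0.65, 0.0, 0.0])
contact = np.array([0.30, 0.002, 0.0])    # hair pushed 2 mm near mid-bow
d = hair_deflection(frog, tip, contact)
print(bow_force(d, 0.30 / 0.65))          # -> force estimate in newtons
```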
| Random Access Remixing on the iPad | | BIBAK | PDF | 487-490 | |
| Jon Forsyth; Aron Glennon; Juan P. Bello | |||
| Remixing audio samples is a common technique for the creation of electronic
music, and there are a wide variety of tools available to edit, process, and
recombine pre-recorded audio into new compositions. However, all of these tools
conceive of the timeline of the pre-recorded audio and the playback timeline as
identical. In this paper, we introduce a dual time axis representation in which
these two timelines are described explicitly. We also discuss the random access
remix application for the iPad, an audio sample editor based on this
representation. We describe an initial user study with 15 high school students
that indicates that the random access remix application has the potential to
develop into a useful and interesting tool for composers and performers of
electronic music. Keywords: interactive systems, sample editor, remix, iPad, multi-touch | |||
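The dual time axis representation can be made concrete with a small sketch: playback time and source-audio time are kept as separate axes linked by an explicit mapping, so the two timelines need not be identical. The piecewise-linear breakpoint form below is an assumption for illustration, not necessarily the application's internal representation.

```python
# Sketch of a "dual time axis": an explicit map from the playback timeline
# to the source-audio timeline. Breakpoints are (playback_t, source_t) in
# seconds; jumping source time backwards at t=4 replays earlier material.
breakpoints = [(0.0, 0.0), (4.0, 4.0), (4.0, 1.0), (8.0, 5.0)]

def source_time(t):
    """Map a playback-timeline position to a source-audio position."""
    for (t0, s0), (t1, s1) in zip(breakpoints, breakpoints[1:]):
        if t0 <= t <= t1 and t1 > t0:
            return s0 + (s1 - s0) * (t - t0) / (t1 - t0)
    return breakpoints[-1][1]

for t in (2.0, 5.0, 7.0):
    print(t, "->", source_time(t))   # identical until 4 s, then a jump back
```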
| Designing the EP Trio: Instrument Identities, Control and Performance Practice in an Electronic Chamber Music Ensemble | | BIBAK | PDF | 491-494 | |
| Erika Donald; Ben Duinker; Eliot Britton | |||
| This paper outlines the formation of the Expanded Performance (EP) trio, a
chamber ensemble composed of electric cello with sensor bow, augmented digital
percussion, and digital turntable with mixer. Decisions relating to physical
set-ups and control capabilities, sonic identities, and mappings of each
instrument, as well as their roles within the ensemble, are explored. The
contributions of these factors to the design of a coherent, expressive ensemble
and its emerging performance practice are considered. The trio proposes
solutions to creation, rehearsal and performance issues in ensemble live
electronics. Keywords: Live electronics, digital performance, mapping, chamber music, ensemble,
instrument identity | |||
| Perceptions of Skill in Performances with Acoustic and Electronic Instruments | | BIBAK | PDF | 495-498 | |
| A. Cavan Fyans; Michael Gurevich | |||
| We present observations from two separate studies of spectators' perceptions
of musical performances, one involving two acoustic instruments, the other two
electronic instruments. Both studies followed the same qualitative method,
using structured interviews to ascertain and compare spectators' experiences.
In this paper, we focus on outcomes pertaining to perceptions of the
performers' skill, relating to concepts of embodiment and communities of
practice. Keywords: skill, embodiment, perception, effort, control, spectator | |||
| Cognitive Issues in Computer Music Programming | | BIBAK | PDF | 499-502 | |
| Hiroki Nishino | |||
| Programming languages are the oldest 'new interface for music expression' in
computer music history, and both composers and researchers retain considerable
interest in computer music programming environments. However, while many
researchers focus on issues such as efficiency, new paradigms, or new features
in computer music programming, its cognitive aspects have rarely been
discussed. Such 'cognitive issues' are important whenever the design or
usability of computer music programming must be considered. By contextualizing
computer music programming within the psychology of programming, we can borrow
the technical terms and theoretical frameworks of previous research in that
field, which helps clarify problems related to cognitive ergonomics and also
benefits the design of new programming environments with better usability for
computer music. Keywords: Computer music, programming language, the psychology of programming,
usability | |||
| Seaboard: a New Piano Keyboard-related Interface Combining Discrete and Continuous Control | | BIBAK | PDF | 503-506 | |
| Roland Lamb; Andrew Robertson | |||
| This paper introduces the Seaboard, a new tangible musical instrument which
aims to provide musicians with significant capability to manipulate sound in
real-time in a musically intuitive way. It introduces the core design features
which make the Seaboard unique, and describes the motivation and rationale
behind the design. The fundamental approach to dealing with problems associated
with discrete and continuous inputs is summarized. Keywords: Piano keyboard-related interface, continuous and discrete control, haptic
feedback, Human-Computer Interaction (HCI) | |||
| Music Interfaces for Novice Users: Composing Music on a Public Display with Hand Gestures | | BIBAK | PDF | 507-510 | |
| Gilbert Beyer; Max Meier | |||
| In this paper we report on a public display where the audience is able to
interact not only with visuals, but also with music. Interacting with music in
a public setting involves particular challenges: passers-by, as 'novice users',
engage only momentarily with public displays and often have no musical
knowledge. We present a system that allows users to create harmonic melodies
without any prior training. Our software enables users to control melodies
through their interaction, utilizing a novel technique of algorithmic
composition based on soft constraints. The proposed algorithm does not generate
music randomly, but ensures that the interactive music is perceived as harmonic
at all times. To give the user a certain amount of control over the music, and
to ensure the music can be controlled in an intuitive way, the algorithm
further includes preferences derived from user interaction that can compete
with the goal of generating a harmonic melody. To test our concept of
controlling music, we developed a prototype of a large public display and
conducted a user study exploring how people would control melodies on such a
display with hand gestures. Keywords: Interactive music, public displays, user experience, out-of-home media,
algorithmic composition, soft constraints | |||
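The soft-constraint idea (harmonicity and user preferences expressed as competing penalties rather than hard rules) can be sketched as a small cost-minimizing note selector. The scale, chord, penalty weights and gesture mapping below are illustrative assumptions, not the authors' algorithm.

```python
# Illustrative sketch of melody generation with competing soft constraints:
# harmonicity versus the user's gesture. All constants are assumptions.

C_MAJOR = {0, 2, 4, 5, 7, 9, 11}           # pitch classes of the scale
CHORD = {0, 4, 7}                          # current chord: C-E-G

def next_note(prev_note, user_target, w_harm=3.0, w_user=1.0, w_smooth=0.5):
    """Pick the MIDI note minimising a weighted sum of soft-constraint
    penalties; none of the constraints is hard, so they can compete."""
    def cost(n):
        pc = n % 12
        harm = 0 if pc in CHORD else (1 if pc in C_MAJOR else 4)
        user = abs(n - user_target)        # follow the user's hand height
        smooth = abs(n - prev_note)        # avoid large melodic leaps
        return w_harm * harm + w_user * user + w_smooth * smooth
    return min(range(48, 84), key=cost)

# A hand moving upward pulls the melody up, but only through harmonic tones.
note = 60
for target in (60, 63, 66, 70):
    note = next_note(note, target)
    print(note, end=" ")
```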
| Expanding the Role of the Instrument | | BIBAK | PDF | 511-514 | |
| Birgitta Cappelen; Anders-Petter Andersson | |||
| The traditional role of the musical instrument is to be the working tool of
the professional musician. On the instrument the musician performs music for
the audience to listen to. In this paper we present an interactive
installation where we expand the role of the instrument to motivate musicking
and co-creation between diverse users. We have made an open installation in
which users can perform a variety of actions in several situations. By using
the abilities of the computer, we have made an installation that can be
interpreted as having many roles: it can be an instrument, a co-musician, a
communication partner, a toy, a meeting place or an ambient musical landscape.
The users can dynamically shift between roles, based on their abilities,
knowledge and motivation. Keywords: Role, music instrument, genre, narrative, open, interaction design,
musicking, interactive installation, sound art | |||
| Wireless Digital/Analog Sensors for Music and Dance Performances | | BIBAK | PDF | 515-518 | |
| Todor Todoroff | |||
| We developed very small and light sensors, each equipped with 3-axis
accelerometers, magnetometers and gyroscopes. These MARG (Magnetic, Angular
Rate, and Gravity) sensors allow for drift-free attitude computation, which in
turn makes it possible to recover the skeleton of the body parts of interest
for the performance, improving gesture recognition results and yielding the
relative positions of the extremities of the limbs with respect to the
performer's torso. This opens new possibilities in terms of mapping. We kept
our previous approach developed at ARTeM [2]: wireless from the body to the
host computer, but wired through a 4-wire digital bus on the body. By removing
the need for a transmitter on each sensing node, we could build very light and
flat sensor nodes that can be made invisible under the clothes. Smaller
sensors, coupled with flexible wires on the body, give dancers more freedom of
movement despite the need for cables on the body. And as each sensor node, box
included, weighs only 5 grams (Figure 1), the nodes can also be placed on the
upper arm, lower arm and hand of a violin or viola player, to retrieve the
skeleton from the torso to the hand without adding any weight that would
disturb the performer. We used these sensors in several performances with a
dancing viola player, including one where she simultaneously controlled gas
flames interactively. We are currently applying them to other types of musical
performances. Keywords: wireless, MARG, sensors | |||
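Drift-free attitude computation from a MARG node is typically achieved by fusing the gyroscope with gravity (and magnetic) references. The complementary filter below is a deliberately simplified sketch: real systems, presumably including this one, use quaternion-based filters, and the Euler angles and blend factor here are assumptions.

```python
# Minimal sketch of attitude estimation from a MARG node: a complementary
# filter integrates the gyroscope for responsiveness and slowly corrects
# roll/pitch toward the accelerometer's gravity reference to cancel drift.
import math

roll = pitch = 0.0   # estimated attitude, radians

def update(gyro, accel, dt, alpha=0.98):
    """gyro: (gx, gy) in rad/s; accel: (ax, ay, az) in g; dt in seconds."""
    global roll, pitch
    # Gyro path: fast but drifts as integration error accumulates.
    roll_g = roll + gyro[0] * dt
    pitch_g = pitch + gyro[1] * dt
    # Accelerometer path: noisy but anchored to gravity, so no drift.
    ax, ay, az = accel
    roll_a = math.atan2(ay, az)
    pitch_a = math.atan2(-ax, math.hypot(ay, az))
    # Blend: trust the gyro short-term, the gravity reference long-term.
    roll = alpha * roll_g + (1 - alpha) * roll_a
    pitch = alpha * pitch_g + (1 - alpha) * pitch_a
    return roll, pitch

print(update((0.01, 0.0), (0.0, 0.0, 1.0), 0.01))  # near-level, tiny roll
```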
| Real-time Control and Creative Convolution | | BIBAK | PDF | 519-522 | |
| Trond Engum | |||
| This paper describes an ongoing research project focusing on new artistic
possibilities opened by exchanging music-technological methods and techniques
between two distinct musical genres. Through my background as a guitarist and
composer in an experimental metal band I have experienced a vast development in
music technology during the last 20 years. This development has greatly changed
the procedures for composing and producing music within my genre, without
necessarily changing the strategies for how the technology is used. The
transition from analogue to digital sound technology not only opened up new
ways of manipulating and manoeuvring sound, it also raised challenges in how to
integrate and control digital sound technology as a seamless part of my musical
genre. By using techniques and methods known from electro-acoustic/computer
music, and adapting them for use within my tradition, this research aims to
find new strategies for composing and producing music within my genre. Keywords: Artistic research, strategies for composition and production, convolution,
environmental sounds, real time control | |||
| phrases from Paul Lansky's Six Fantasies | | BIBAK | PDF | 523-526 | |
| Andreas Bergsland | |||
| The Six Fantasies Machine (SFM) is a software instrument that simulates
sounds from Paul Lansky's classic computer music piece from 1979, Six Fantasies
on a Poem by Thomas Campion. The paper describes the design of the instrument
and its user interface, and how the instrument can be used in a methodological
approach proposed by Godøy, the epistemology of simulations. By imitating
phrases from Lansky's piece and enabling the creation of variants of these
phrases, the user can experience the essential traits of the phrases. Moreover,
the instrument gives the user hands-on experience with processing techniques
that the composer applied, albeit through a user-friendly interface. Keywords: LPC, software instrument, analysis, modeling, csound | |||
| Gliss: An Intuitive Sequencer for the iPhone and iPad | | BIBAK | PDF | 527-528 | |
| Jan T. von Falkenstein | |||
| Gliss is an application for iOS that lets the user sequence five separate
instruments and play them back in various ways. Sequences can be created by
drawing onto the screen while the sequencer is running. The playhead of the
sequencer can be set to randomly deviate from the drawings or can be controlled
via the accelerometer of the device. This makes Gliss a hybrid of a sequencer,
an instrument and a generative music system. Keywords: Gliss, iOS, iPhone, iPad, interface, UPIC, music, sequencer, accelerometer,
drawing | |||
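Gliss's playhead behaviour as described (follow the drawn sequence, deviate from it randomly, or be steered by the accelerometer) can be sketched in a few lines. The blending scheme and value ranges are assumptions for illustration, not the app's implementation.

```python
# Sketch of a playhead that follows a drawn curve, deviates randomly, or is
# overridden by device tilt; all ranges and modes here are assumptions.
import random

drawing = [0.2, 0.4, 0.8, 0.6, 0.3]   # user-drawn control curve, 0..1

def playhead_value(step, deviation=0.0, tilt=None):
    """Value driving one instrument at a given sequencer step."""
    base = drawing[step % len(drawing)]
    if tilt is not None:              # accelerometer mode: tilt takes over
        return max(0.0, min(1.0, tilt))
    jitter = random.uniform(-deviation, deviation)
    return max(0.0, min(1.0, base + jitter))

for step in range(5):
    print(round(playhead_value(step, deviation=0.1), 2))
```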
| Quadrofeelia -- A New Instrument for Sliding into Notes | | BIBAK | PDF | 529-530 | |
| Jiffer Harriman; Locky Casey; Linden Melvin | |||
| This paper describes a new musical instrument inspired by the pedal-steel
guitar, along with its motivations and other considerations. Creating a
multi-dimensional, expressive instrument was the primary driving force. For
these criteria the pedal steel guitar proved an apt model as it allows control
over several instrument parameters simultaneously and continuously. The
parameters we wanted control over were volume, timbre, release time and pitch.
The Quadrofeelia is played with two hands on a horizontal surface. Single notes
and melodies are easily played, as is chordal accompaniment, with a variety of
timbres and release times, enabling a range of legato and staccato notes in an
intuitive manner on a new yet familiar interface. Keywords: NIME, pedal-steel, electronic, slide, demonstration, membrane, continuous,
ribbon, instrument, polyphony, lead | |||
| SQUEEZY: Extending a Multi-touch Screen with Force Sensing Objects for Controlling Articulatory Synthesis | | BIBAK | PDF | 531-532 | |
| Johnty Wang; Nicolas d'Alessandro; Sidney S. Fels; Bob Pritchard | |||
| This paper describes Squeezy: a low-cost, tangible input device that adds
multi-dimensional input to capacitive multi-touch tablet devices. Force input
is implemented through force sensing resistors mounted on a rubber ball, which
also provides passive haptic feedback. A microcontroller samples and transmits
the measured pressure information. Conductive fabric attached to the finger
contact area translates the touch to the bottom of the ball which allows the
touchscreen to detect the position and orientation. The addition of a tangible,
pressure-sensitive input to a portable multimedia device opens up new
possibilities for expressive musical interfaces and Squeezy is used as a
controller for real-time gesture controlled voice synthesis research. Keywords: Musical controllers, tangible interfaces, force sensor, multitouch, voice
synthesis. | |||
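On the host side, fusing Squeezy's transmitted pressure with the tablet's touch data might look like the sketch below. The serial protocol ("P <value>"), port name, 10-bit ADC range and control-frame format are all assumptions; the abstract only states that a microcontroller samples the force-sensing resistors and transmits the readings.

```python
# Hedged host-side sketch: combine a squeeze-pressure stream with touch data.
import serial  # pyserial

port = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)  # port assumed

def read_pressure():
    """Return the latest squeeze pressure normalised to 0..1, or None."""
    line = port.readline().decode(errors="ignore").strip()
    if line.startswith("P "):
        return min(1.0, int(line.split()[1]) / 1023.0)  # 10-bit ADC assumed
    return None

def control_frame(touch_x, touch_y, angle):
    """Fuse the touchscreen's position/orientation with squeeze force into
    one control frame for a voice synthesiser."""
    p = read_pressure()
    if p is not None:
        return {"x": touch_x, "y": touch_y, "angle": angle, "force": p}
    return None
```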
| SWAF: Towards a Web Application Framework for Composition and Documentation of Soundscape | | BIBAK | PDF | 533-534 | |
| Souhwan Choe; Kyogu Lee | |||
| In this paper, we suggest a conceptual model of a Web application framework
for the composition and documentation of soundscape and introduce corresponding
prototype projects, SeoulSoundMap and SoundScape Composer. We also survey the
current Web-based sound projects in terms of soundscape documentation. Keywords: soundscape, web application framework, sound archive, sound map, soundscape
composition, soundscape documentation | |||
| Playing the "MO" -- Gestural Control and Re-Embodiment of Recorded Sound and Music | | BIBAK | PDF | 535-536 | |
| Norbert Schnell; Frédéric Bevilacqua; Nicolas Rasamimanana; Julien Blois; Fabrice Guédy; Emmanuel Fléty | |||
| We present a set of applications that have been realized with the MO
modular wireless motion capture device and a set of software components
integrated into Max/MSP. These applications, created in the context of artistic
projects, music pedagogy, and research, allow for the gestural reembodiment of
recorded sound and music. They demonstrate a large variety of different
"playing techniques" in musical performance using wireless motion sensor
modules in conjunction with gesture analysis and real-time audio processing
components. Keywords: Music, Gesture, Interface, Wireless Sensors, Gesture Recognition, Audio
Processing, Design, Interaction | |||
| (LAND)MOVES | | BIBAK | PDF | 537-538 | |
| Bruno Zamborlin; Giorgio Partesana; Marco Liuni | |||
| (land)moves is an interactive installation: the user's gestures control the
multimedia processing, with full synergy between the synthesis and treatment of
audio and video. Keywords: mapping gesture-audio-video, gesture recognition, landscape, soundscape | |||
| Can Haptics Make New Music? -- Fader and Plank Demos | | BIBAK | PDF | 539-540 | |
| Bill Verplank; Francesco Georg | |||
| Haptic interfaces using active force-feedback have mostly been used for
emulating existing instruments and making conventional music. With the right
speed, force, precision and software they can also be used to make new sounds
and perhaps new music. The requirements are local microprocessors (for
low-latency and high update rates), strategic sensors (for force as well as
position), and non-linear dynamics (that make for rich overtones and chaotic
music). Keywords: NIME, Haptics, Music Controllers, Microprocessors | |||