From Ecological Sounding Artifacts Towards Sonic Artifact Ecologies
alt.chi: See this, hear this, touch this, keep this
/
Erkut, Cumhur
/
Serafin, Stefania
Extended Abstracts of the ACM CHI'16 Conference on Human Factors in
Computing Systems
2016-05-07
v.2
p.560-570
© Copyright 2016 ACM
Summary: The discipline of sonic interaction design has focused on the
interaction between a single user and an artifact. This strongly limits one of
the fundamental aspects of music as a social and interactive experience. In
this paper we propose sonic artifact ecologies as a means to examine
interactions between one or more users and one or more artifacts. Case studies
from a recently run workshop on product sound design are examined.
Product Sound Design: Form, Function, and Experience
/
Erkut, Cumhur
/
Serafin, Stefania
/
Hoby, Michael
/
Sårde, Jonniy
Proceedings of Audio Mostly 2015: A Conference on Interaction with Sound
2015-10-07
p.10
© Copyright 2015 ACM
Summary: Current interactive products, services, and environments are appraised by
their sensory attributes, in addition to their form and function. Sound is an
important factor in these multisensory product appraisals. Integrating this
sound opportunity into the design and development of interactive products,
which are fit for the real world yet constitute a strong brand identity, remains
a challenge. We address this challenge by applying the research know-how of an
academic institution and the business practices of a sound-agency SME within the
core R&D and production process of the third industrial partner. Our
approach has clear application scenarios in, e.g., extended wireless headsets,
car audio appliances, and portable entertainment devices. We describe the
prototypes developed during the project life span, and the activities and
outcomes of a half-day workshop designed to disseminate the project results.
Spatial Sound and Multimodal Interaction in Immersive Environments
/
Grani, Francesco
/
Overholt, Dan
/
Erkut, Cumhur
/
Gelineck, Steven
/
Triantafyllidis, Georgios
/
Nordahl, Rolf
/
Serafin, Stefania
Proceedings of Audio Mostly 2015: A Conference on Interaction with Sound
2015-10-07
p.17
© Copyright 2015 ACM
Summary: Spatial sound and interactivity are key areas of investigation in the
Sound and Music Computing master's program at Aalborg University Copenhagen.
We present a collection of research directions and recent results from work
in these areas, with the focus on our multifaceted approaches to two primary
problem areas: 1) creation of interactive spatial audio experiences for
immersive virtual and augmented reality scenarios, and 2) production and mixing
of spatial audio for cinema, music, and other artistic contexts. Several
ongoing research projects are described, wherein the latest developments are
discussed.
These include work on sonic interaction in virtual environments,
interactivity with volumetric sound sources using Vector Base Amplitude
Panning (VBAP) and Wave Field Synthesis (WFS), and binaural sound for virtual
environments and spatial audio mixing. We show that the variety of approaches
presented here is necessary in order to optimize interactivity with spatial
audio for each particular type of task.
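Of the spatialization techniques named above, VBAP is well documented in the literature; as an illustrative sketch independent of the projects described here, the pairwise 2-D gain computation solves a small linear system for the gains of the loudspeaker pair enclosing the source direction, then normalizes for constant power:

```python
import math

def vbap_2d_gains(source_deg, left_deg, right_deg):
    """Pairwise 2-D VBAP: solve g1*l1 + g2*l2 = p for the gains,
    where l1, l2 are the unit vectors of the loudspeaker pair and
    p is the source direction, then normalize to constant power."""
    def unit(deg):
        rad = math.radians(deg)
        return (math.cos(rad), math.sin(rad))

    p = unit(source_deg)
    l1, l2 = unit(left_deg), unit(right_deg)
    # Cramer's rule on the 2x2 system whose columns are l1 and l2.
    det = l1[0] * l2[1] - l1[1] * l2[0]
    g1 = (p[0] * l2[1] - p[1] * l2[0]) / det
    g2 = (p[1] * l1[0] - p[0] * l1[1]) / det
    # Constant-power normalization: g1^2 + g2^2 = 1.
    norm = math.hypot(g1, g2)
    return g1 / norm, g2 / norm
```

A source centered between loudspeakers at +45° and -45° receives equal gains; a source on a loudspeaker receives gain 1 on that loudspeaker and 0 on the other.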
Beyond Command & Control: Sketching Embodied Interaction
WIP Theme: Novel Interfaces and Interaction Techniques
/
Erkut, Cumhur
/
Rajala-Erkut, Anu
Extended Abstracts of the ACM CHI'15 Conference on Human Factors in
Computing Systems
2015-04-18
v.2
p.1681-1686
© Copyright 2015 ACM
Summary: We present an approach for teaching and designing embodied interaction in
collaboration with a contemporary dance choreographer. Our approach is based on
the felt qualities of movement, and provides a shared experience, vocabulary
for self-expression, and appreciation for movement as a design material for
interaction design practitioners. We present a workshop in which, after movement
sessions, interactive sketches were generated and implemented with motion
tracking. Subsequently, we investigated whether these activities guided the
participants from the prevailing notion of command & control in embodied
interaction towards experiences related to the felt qualities of movement.
While our approach provided participants with a better foundation during idea
generation than approaches that focus only on technologies, this effect wore
off, and the final implementations focused on command & control. We are
currently experimenting with new tools and techniques, integrating material
interactions into the process.
Sketches in Embodied Interaction: Balancing Movement and Technological
Perspectives
Design Methods, Techniques and Knowledge
/
Erkut, Cumhur
/
Dahl, Sofia
/
Triantafyllidis, Georgios
HCI International 2014: 16th International Conference on HCI: Posters'
Extended Abstracts, Part I
2014-06-22
v.4
p.30-35
Keywords: Embodied Human-Computer Interaction; Design Pedagogy
© Copyright 2014 Springer International Publishing
Summary: We present an approach for teaching and designing embodied interaction based
on interactive sketches. We have combined the mover perspective and felt
experiences of movement with advanced technologies (multi-agents, physical
simulations) in a generative design session. We report our activities and
provide a simple example as a design outcome. The variety and the qualities of
the initial ideas indicate that this approach might provide a better foundation
for our participants, compared to the approaches that focus only on
technologies. The interactive sketches will be demonstrated at the conference.
Design and evaluation of interactive musical fruit
Wednesday short papers
/
Erkut, Cumhur
/
Serafin, Stefania
/
Fehr, Jonas
/
Figueira, Henrique M. R. Fernandes
/
Hansen, Theis B.
/
Kirwan, Nicholas J.
/
Zakarian, Mariam R.
Proceedings of ACM IDC'14: Interaction Design and Children
2014-06-17
p.197-200
© Copyright 2014 ACM
Summary: In this paper we describe the design and evaluation of a novel, tangible
user interface for interaction with sound, to be implemented in a museum
setting. Our work-in-progress is part of a larger concept for an installation
prioritizing a collaborative, explorative, multimodal experience. We have
focused on novice children, in order to accommodate all potential users
of the museum and to minimize the risk of excluding users based on skill or
previous musical know-how. We have developed four instances of a multimodal
device for interacting with sounds via a tangible interface, and called them
Interactive Musical Fruits (IMFs). Each IMF consists of an embedded processing
system that can detect its own orientation. Qualitative testing with children
has been performed to evaluate the current design state. Positive feedback
from the test subjects upholds the validity and the potential of the IMF as an
interface in a museum context. However, further research is required to improve
the interactive and collaborative aspects of the device, as well as the aural
and visual properties of the IMF.
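The abstract says only that each IMF detects its own orientation; as a hypothetical sketch (not the authors' implementation, and all function names are our own), tilt estimated from a resting accelerometer could be mapped to a sound parameter such as playback rate:

```python
import math

def tilt_angles(ax, ay, az):
    """Estimate pitch and roll (degrees) from accelerometer axes
    while the device is at rest (gravity is the only acceleration)."""
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    roll = math.degrees(math.atan2(ay, az))
    return pitch, roll

def tilt_to_playback_rate(pitch, min_rate=0.5, max_rate=2.0):
    """Map pitch in [-90, 90] degrees linearly onto a playback-rate range."""
    t = (pitch + 90.0) / 180.0
    return min_rate + t * (max_rate - min_rate)
```

Lying flat (gravity on the z-axis) gives zero pitch and roll, and hence the middle of the rate range; tilting the fruit sweeps the sound's playback rate.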
SiMPE: 8th workshop on speech and sound in mobile and pervasive environments
Workshops
/
Nanavati, Amit A.
/
Rajput, Nitendra
/
Srivastava, Saurabh
/
Erkut, Cumhur
/
Jylhä, Antti
/
Rudnicky, Alexander I.
/
Serafin, Stefania
/
Turunen, Markku
Proceedings of 2013 Conference on Human-computer interaction with mobile
devices and services
2013-08-27
p.626-629
© Copyright 2013 ACM
Summary: The SiMPE workshop series started in 2006 with the goal of enabling speech
processing on mobile and embedded devices. The SiMPE 2012 workshop extended the
notion of audio to non-speech "Sounds" and thus the expansion became "Speech
and Sound". SiMPE 2010 and 2011 brought together researchers from the speech
and the HCI communities. Speech User interaction in cars was a focus area in
2009. Multimodality got more attention in SiMPE 2008. In SiMPE 2007, the focus
was on developing regions.
With SiMPE 2013, the 8th in the series, we continue to explore the area of
speech along with sound. Akin to language processing and text-to-speech
synthesis in the voice-driven interaction loop, sensors can track continuous
human activities such as singing, walking, or shaking the mobile phone, and
non-speech audio can facilitate continuous interaction. The technologies
underlying speech processing and sound processing are quite different, and these
communities have been working mostly independently of each other. And yet, for
multimodal interactions on mobile devices, it is perhaps natural to ask whether
and how speech and sound can be mixed and used more effectively and naturally.
Rhythmic walking interactions with auditory feedback: an exploratory study
/
Jylhä, Antti
/
Serafin, Stefania
/
Erkut, Cumhur
Proceedings of the 2012 Audio Mostly Conference: A Conference on Interaction
with Sound
2012-09-26
p.68-75
© Copyright 2012 ACM
Summary: Walking is a natural rhythmic activity that has become of interest as a
means of interacting with software systems such as computer games. Therefore,
designing multimodal walking interactions calls for further examination. This
exploratory study presents a system capable of different kinds of interactions
based on varying the temporal characteristics of the output, using the sound of
human walking as the input. The system either provides a direct synthesis of a
walking sound based on the detected amplitude envelope of the user's footstep
sounds, or provides a continuous synthetic walking sound as a stimulus for the
walking human, either with a fixed tempo or a tempo adapting to the human gait.
In a pilot experiment, the different interaction modes are studied with respect
to their effect on the walking tempo and the experience of the subjects. The
results tentatively outline different user profiles in interacting with such a
system.
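The tempo-adapting mode described above can be pictured as first-order smoothing of the synthetic walking period toward the walker's detected inter-step interval; a minimal sketch of that idea (function and parameter names are our own, not from the paper):

```python
def adapt_period(current_period, step_times, alpha=0.3):
    """One adaptation step of a gait-following walking-sound generator.

    current_period: current inter-beat interval of the synthetic
        walking sound, in seconds.
    step_times: recent detected footstep onset times in seconds,
        oldest first; at least two are needed to estimate a gait tempo.
    alpha: adaptation rate in (0, 1]; alpha = 0 would be a fixed tempo.
    """
    if len(step_times) < 2:
        return current_period
    intervals = [b - a for a, b in zip(step_times, step_times[1:])]
    gait_period = sum(intervals) / len(intervals)
    # Move part of the way toward the walker's period (first-order smoothing).
    return current_period + alpha * (gait_period - current_period)
```

With alpha = 1 the system snaps to the walker's tempo immediately; smaller values yield the gentler, negotiating behavior that distinguishes the adaptive mode from the fixed-tempo stimulus.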
Auditory feedback in an interactive rhythmic tutoring system
/
Jylhä, Antti
/
Erkut, Cumhur
Proceedings of the 2011 Audio Mostly Conference: A Conference on Interaction
with Sound
2011-09-07
p.109-115
© Copyright 2011 ACM
Summary: We present the recent developments in the design of audio-visual feedback in
iPalmas, the interactive Flamenco rhythm tutor. Based on evaluation of the
original implementation, we have re-designed the interface to better support
the user in learning and performing rhythmic patterns. The system measures the
performance parameters of the user and provides auditory feedback on the
performance with different sounds corresponding to different performance
attributes. The design of these sounds is informed by several attributes
derived from the evaluation. We propose informative, non-intrusive, and
archetypal sounds to be used in the system.
A Structured Design and Evaluation Model with Application to Rhythmic
Interaction Displays
/
Erkut, Cumhur
/
Jylhä, Antti
/
Discioglu, Reha
NIME 2011: New Interfaces for Musical Expression
2011-05-30
p.477-480
Keywords: Rhythmic interaction, multimodal displays, sonification, UML
© Copyright 2011 Authors
Summary: We present a generic, structured model for the design and evaluation of
musical interfaces. This model is development oriented, and it is based on the
fundamental function of musical interfaces, i.e., to coordinate human
action and perception for musical expression, subject to human capabilities and
operation, we consider the previous design and evaluation phase of iPalmas, our
testbed for exploring rhythmic interaction. Our findings inform the current
design phase of iPalmas visual and auditory displays, where we build on what
has resonated with the test users, and explore further possibilities based on
the evaluation results.
Basic Exploration of Narration and Performativity for Sounding Interactive
Commodities
Tactile and Sonic Explorations
/
Monache, Stefano Delle
/
Hug, Daniel
/
Erkut, Cumhur
HAID 2010: International Workshop on Haptic and Audio Interaction Design
2010-09-16
p.65-74
Keywords: Sonic Interaction Design; Aesthetics; Physics-based Synthesis; Methodology;
Narrative Sound Design
© Copyright 2010 Springer-Verlag
Summary: We present an exploration in sonic interaction design, aimed at integrating
the power of narrative sound design with the sonic aesthetics of
physics-based sound synthesis. The emerging process is based on interpretation,
and can represent a novel tool in the education of the future generation of
interaction designers. In addition, an audio-tactile paradigm that exploits
the potential of the physics-based approach is introduced.
Simulation of rhythmic learning: a case study
/
Jylhä, Antti
/
Erkut, Cumhur
/
Pesonen, Matti
/
Ekman, Inger
Proceedings of the 2010 Audio Mostly Conference: A Conference on Interaction
with Sound
2010-09-15
p.20
© Copyright 2010 ACM
Summary: Simulation of human interaction with computational systems can inform their
design and provide means for designing new, intelligent systems that capture
some of the essence of human behavior. We describe a system simulating a
situation where a virtual tutor teaches rhythms to a human learner. In this
simulation, we virtualize the human behavior related to the learning of new
rhythms. We inform the design of the system based on an experiment in which a
virtual tutor taught Flamenco hand clapping patterns to human subjects. Based
on the findings on interaction with the system and learning of the patterns, we
are simulating this learning situation with a virtual learning clapper. We also
discuss the future work to be undertaken for more realistic, agent-based
simulation of rhythmic interaction.
A hand clap interface for sonic interaction with the computer
Interactivity: touch & feel
/
Jylhä, Antti
/
Erkut, Cumhur
Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems
2009-04-04
v.2
p.3175-3180
Keywords: audio interfaces, hand clapping, human-computer interaction, sonic
interaction design
© Copyright 2009 ACM
Summary: We present a hand clapping interface for sonic interaction with the
computer. The current implementation has been built on the Pure Data (PD)
software. The interface makes use of the cyclic nature of hand clapping and
recognition of the clap type, and enables interactive control over different
applications. Three prototype applications for the interface are presented: a
virtual crowd of clappers, controlling the tempo of music, and a simple
sampler. Preliminary tests indicate that rather than having total control via
the interface, the user negotiates with the computer to control the tempo.
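The cyclic nature of hand clapping that the interface exploits lends itself to tempo estimation from clap onset times; a minimal sketch of a moving-average inter-onset-interval tracker (our own illustration, not the paper's Pure Data implementation):

```python
def clap_tempo_bpm(onsets, window=4):
    """Estimate clapping tempo in BPM from the last `window` clap
    onset times (seconds, ascending). Returns None until at least
    two onsets are available."""
    recent = onsets[-window:]
    if len(recent) < 2:
        return None
    # Mean inter-onset interval over the window smooths out timing jitter.
    mean_ioi = (recent[-1] - recent[0]) / (len(recent) - 1)
    return 60.0 / mean_ioi
```

A short window keeps the estimate responsive to the clapper, while the averaging supplies the stability that lets the user and the computer "negotiate" a tempo rather than chase every clap.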