Mind the Gap: A SIG on Bridging the Gap in Research on Body Sensing, Body
Perception and Multisensory Feedback
SIG Meetings
/
Singh, Aneesha
/
Tajadura-Jiménez, Ana
/
Bianchi-Berthouze, Nadia
/
Marquardt, Nicolai
/
Tentori, Monica
/
Bresin, Roberto
/
Kulić, Dana
Extended Abstracts of the ACM CHI'16 Conference on Human Factors in
Computing Systems
2016-05-07
v.2
p.1092-1095
© Copyright 2016 ACM
Summary: People's perceptions of their own body's appearance, capabilities and
position are constantly updated through sensory cues [10,14] that are naturally
produced by their actions. Increasingly cheap and ubiquitous sensing technology
is being combined with multisensory feedback across HCI areas such as sports,
health, rehabilitation, psychology, neuroscience, the arts and games to alter
or enhance sensory cues, with goals such as enhanced body perception and body
awareness. However, the focus and aims differ between areas. Designing more
effective and efficient multisensory feedback requires bridging the gap between
these worlds. This interactive SIG, featuring minute-madness technology
presentations, expert sessions, and multidisciplinary discussions, will: (i)
bring together HCI researchers from different areas, (ii) discuss tools,
methods and frameworks, and (iii) form a multidisciplinary community to build
synergies for further collaboration.
Nebula: An Interactive Garment Designed for Functional Aesthetics
Interactivity
/
Elblaus, Ludvig
/
Tsaknaki, Vasiliki
/
Lewandowski, Vincent
/
Bresin, Roberto
Extended Abstracts of the ACM CHI'15 Conference on Human Factors in
Computing Systems
2015-04-18
v.2
p.275-278
© Copyright 2015 ACM
Summary: In this paper we present Nebula, a prototype that examines the properties of
textiles, fashion accessories, and digital technologies to arrive at a garment
design that brings these elements together in a cohesive manner. Bridging the
gap between everyday performativity and enactment, we discuss aspects of the
making process, interaction and functional aesthetics that emerged. Nebula is
part of the Sound Clothes project, which explores the expressive potential of
wearable technologies that create sound from motion.
MoodifierLive: Interactive and Collaborative Expressive Music Performance on
Mobile Devices
/
Fabiani, Marco
/
Dubus, Gaël
/
Bresin, Roberto
NIME 2011: New Interfaces for Musical Expression
2011-05-30
p.116-119
Keywords: Expressive performance, gesture, collaborative performance, mobile phone
© Copyright 2011 Authors
Summary: This paper presents MoodifierLive, a mobile phone application for
interactive control of rule-based automatic music performance. Five different
interaction modes are available, of which one allows for collaborative
performances with up to four participants, and two let the user control the
expressive performance using expressive hand gestures. Evaluations indicate
that the application is interesting, fun to use, and that the gesture modes,
especially the one based on data from free expressive gestures, allow for
performances whose emotional content matches that of the gesture that produced
them.
Sound design and perception in walking interactions
Sonic Interaction Design
/
Visell, Y.
/
Fontana, F.
/
Giordano, B. L.
/
Nordahl, R.
/
Serafin, S.
/
Bresin, R.
International Journal of Human-Computer Studies
2009
v.67
n.11
p.947-959
Keywords: Auditory display; Vibrotactile display; Interaction design; Walking
interfaces
© Copyright 2009 Elsevier B.V.
1. Introduction
1.1. Foot-ground interactions and their signatures
1.2. Overview
2. Human perception
2.1. Isolated impact sounds
2.2. Acoustic and multimodal walking events
3. Augmented ground surfaces as walking interfaces
3.1. Physical interaction design
3.2. Control design
3.3. Sound synthesis
3.3.1. Solid surfaces
3.3.2. Aggregate surfaces
3.4. Augmented ground surfaces developed to date
3.5. Example: Eco Tile
4. Affective footstep sounds
5. VR applications and presence studies
5.1. Auditory feedback and motion
6. Conclusions
Summary: This paper reviews the state of the art in the display and perception of
walking generated sounds and tactile vibrations, and their current and
potential future uses in interactive systems. As non-visual information sources
that are closely linked to human activities in diverse environments, such
signals are capable of communicating about the spaces we traverse and
activities we encounter in familiar and intuitive ways. However, in order for
them to be effectively employed in human-computer interfaces, significant
knowledge is required in areas including the perception of acoustic signatures
of walking, and the design, engineering, and evaluation of interfaces that
utilize them. Much of this expertise has accumulated in recent years, although
many questions remain to be explored. We highlight past work and current
research directions in this multidisciplinary area of investigation, and point
to potential future trends.
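The sound-synthesis sections of this review (3.3.1 solid surfaces, 3.3.2 aggregate surfaces) concern models for footstep sounds. A common building block for solid-impact sounds is modal synthesis: a sum of exponentially decaying sinusoids. The sketch below is purely illustrative; the mode frequencies, decay times, and amplitudes are invented for the example and are not taken from the paper.

```python
import math

def modal_impact(freqs_hz, decays_s, amps, duration_s=0.5, sr=44100):
    """Synthesize an impact sound as a sum of exponentially decaying
    sinusoids (modal synthesis). Returns raw float samples in [-N, N]."""
    n = int(duration_s * sr)
    samples = []
    for i in range(n):
        t = i / sr
        s = sum(a * math.exp(-t / d) * math.sin(2 * math.pi * f * t)
                for f, d, a in zip(freqs_hz, decays_s, amps))
        samples.append(s)
    return samples

# A hard-surface "tap" built from three illustrative modes.
tap = modal_impact(freqs_hz=[320.0, 810.0, 1650.0],
                   decays_s=[0.12, 0.06, 0.03],
                   amps=[1.0, 0.5, 0.25],
                   duration_s=0.3)
```

Aggregate surfaces (gravel, snow) are usually modeled differently, e.g. as stochastic sequences of many such micro-impacts, which is why the review treats the two cases separately.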
Sonic interaction design: sound, information and experience
Workshops
/
Rocchesso, Davide
/
Serafin, Stefania
/
Behrendt, Frauke
/
Bernardini, Nicola
/
Bresin, Roberto
/
Eckel, Gerhard
/
Franinovic, Karmen
/
Hermann, Thomas
/
Pauletto, Sandra
/
Susini, Patrick
/
Visell, Yon
Proceedings of ACM CHI 2008 Conference on Human Factors in Computing Systems
2008-04-05
v.2
p.3969-3972
© Copyright 2008 ACM
Summary: Sonic Interaction Design (SID) is an emerging field that is positioned at
the intersection of auditory display, ubiquitous computing, interaction design,
and interactive arts. SID can be used to describe practice and inquiry into any
of various roles that sound may play in the interaction loop between users and
artifacts, services, or environments, in applications that range from the
critical functionality of an alarm, to the artistic significance of a musical
creation. This field is devoted to the privileged role the auditory channel can
assume in exploiting the convergence of computing, communication, and
interactive technologies. An over-emphasis on visual displays has constrained
the development of interactive systems that are capable of making more
appropriate use of the auditory modality. Today the ubiquity of computing and
communication resources allows us to think about sounds in a proactive way.
This workshop puts a spotlight on such issues in the context of the emerging
domain of SID.
Expressive Control of Music and Visual Media by Full-Body Movement
/
Castellano, Ginevra
/
Bresin, Roberto
/
Camurri, Antonio
/
Volpe, Gualtiero
NIME 2007: New Interfaces for Musical Expression
2007-06-06
p.390-391
© Copyright 2007 Authors
Mapping strategies in DJ scratching
Poster Session 2: Gesture Controlled Audio Systems
/
Hansen, Kjetil Falkenberg
/
Bresin, Roberto
NIME 2006: New Interfaces for Musical Expression
2006-06-04
p.188-191
© Copyright 2006 Authors
Affective diary: designing for bodily expressiveness and self-reflection
Work-in-progress
/
Lindström, Madelene
/
Ståhl, Anna
/
Höök, Kristina
/
Sundström, Petra
/
Laaksolahti, Jarmo
/
Combetto, Marco
/
Taylor, Alex
/
Bresin, Roberto
Proceedings of ACM CHI 2006 Conference on Human Factors in Computing Systems
2006-04-22
v.2
p.1037-1042
© Copyright 2006 ACM
Summary: A diary provides a useful means to express inner thoughts and record
experiences of past events. In re-readings, it also provides a resource for
reflection, allowing us to re-experience, brood over or even shed the thoughts
and feelings we've associated with events or people. To expand on the ways in
which we creatively engage in diary-keeping, we have designed an affective
diary that captures some of the physical, bodily aspects of experiences and
emotions -- what we refer to as "affective body memorabilia". The affective
diary assembles sensor data, captured from the user and uploaded via their
mobile phone, to form an ambiguous, abstract colourful body shape. With a range
of other materials from the mobile phone, such as text and MMS messages,
photographs, etc., these shapes are made available to the user. Combining these
materials, the diary is designed to invite reflection and to allow the user to
piece together their own stories.
From Acoustic Cues to an Expressive Agent
Gesture and Music
/
Mancini, Maurizio
/
Bresin, Roberto
/
Pelachaud, Catherine
GW 2005: Gesture Workshop
2005-05-18
p.280-291
© Copyright 2005 Springer-Verlag
Summary: This work proposes a new way of providing feedback on expressivity in music
performance. Starting from studies of expressivity in music performance, we
developed a system in which visual feedback is given to the user through a
graphical representation of a human face. The first part of the system,
previously developed by researchers at KTH Stockholm and at Uppsala University,
allows the real-time extraction and analysis of acoustic cues from the music
performance. The cues extracted are: sound level, tempo, articulation, attack
time, and spectrum energy. From these cues the system provides a high-level
interpretation of the emotional intention of the performer, which is classified
into one basic emotion, such as happiness, sadness, or anger. We have
implemented an interface between that system and the embodied conversational
agent Greta, developed at the University of Rome "La Sapienza" and the
University of Paris 8. We model the expressivity of the agent's facial
animation with a set of six dimensions that characterize the manner of behavior
execution. In this paper we first describe a mapping between the acoustic cues
and the expressivity dimensions of the face. We then show how to determine the
facial expression corresponding to the emotional intention resulting from the
acoustic analysis, using the sound level and tempo of the music to control the
intensity and the temporal variation of muscular activation.
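The mapping from acoustic cues to a basic emotion described in this abstract can be illustrated with a toy rule-based classifier. The cue names follow the abstract, but every threshold and rule below is an invented assumption for illustration, not the actual mapping used by the KTH/Uppsala system.

```python
def classify_emotion(sound_level_db, tempo_bpm, articulation):
    """Toy rule-based mapping from acoustic cues to one basic emotion.

    articulation: ratio of sounding time to inter-onset interval
    (near 1.0 = legato, well below 1.0 = staccato).
    All thresholds are illustrative assumptions.
    """
    # Slow, soft, legato playing -> sadness.
    if tempo_bpm < 90 and sound_level_db < 60 and articulation > 0.8:
        return "sadness"
    # Fast and loud playing -> anger (checked before happiness,
    # since both share a fast tempo).
    if tempo_bpm >= 120 and sound_level_db >= 75:
        return "anger"
    # Fast, staccato playing at moderate level -> happiness.
    if tempo_bpm >= 110 and articulation < 0.7:
        return "happiness"
    return "neutral"

# A slow, soft, legato performance is classified as sad.
result = classify_emotion(sound_level_db=55, tempo_bpm=70, articulation=0.9)
```

A real system would of course estimate these cues continuously from audio and smooth the classification over time; the sketch only shows the shape of a cue-to-emotion rule layer.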
Rencon 2004: Turing Test for Musical Expression
RENCON Session
/
Hiraga, Rumi
/
Bresin, Roberto
/
Hirata, Keiji
/
Katayose, Haruhiro
NIME 2004: New Interfaces for Musical Expression
2004-06-03
p.120-123
© Copyright 2004 Authors
Analysis of a Genuine Scratch Performance
Gesture in Multimedia and Performing Arts
/
Hansen, Kjetil Falkenberg
/
Bresin, Roberto
GW 2003: Gesture Workshop
2003-04-15
p.519-528
© Copyright 2003 Springer-Verlag
Summary: The art form of manipulating vinyl records practiced by disc jockeys (DJs) is
called scratching, and it has become very popular since its beginnings in the
seventies. Since then, turntables have commonly been used as expressive musical
instruments in several musical genres. This phenomenon has had a serious impact
on the instrument-making industry, as sales of turntables and related equipment
have soared. Despite this, the acoustics of scratching had barely been studied
until now. In this paper, we illustrate the complexity of scratching by
measuring the gestures of one DJ during a performance. The analysis of these
measurements is important to consider in the design of a scratch model.
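At its core, scratching plays a recording back at a rate set by the DJ's hand: the record position over time determines which sample is heard. A minimal sketch of that idea, assuming linear interpolation between samples (a simplification for illustration, not the scratch model the paper works towards):

```python
def scratch_resample(source, position_curve):
    """Play back a recording by following a record-position curve.

    source: list of audio samples.
    position_curve: record position (in source samples) at each output
    tick, e.g. derived from a measured hand gesture. Moving the position
    backwards plays the sound in reverse, as in a scratch.
    """
    out = []
    for pos in position_curve:
        i = int(pos)
        i = max(0, min(i, len(source) - 2))  # clamp to valid range
        frac = pos - i
        # Linear interpolation between neighbouring samples.
        out.append(source[i] * (1.0 - frac) + source[i + 1] * frac)
    return out

# A back-and-forth position curve yields the forward sound, then its reverse.
snippet = [0.0, 0.2, 0.4, 0.2, 0.0]
audio = scratch_resample(snippet, [0.0, 1.0, 2.0, 3.0, 4.0, 3.0, 2.0, 1.0, 0.0])
```

Real scratch gestures combine this position control with crossfader movements, which is part of what makes the measured performances complex.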