
HAID 2010: International Workshop on Haptic and Audio Interaction Design

Fullname: HAID 2010: Haptic and Audio Interaction Design: 5th International Workshop
Editors: Rolf Nordahl; Stefania Serafin; Federico Fontana; Stephen Brewster
Location: Copenhagen, Denmark
Dates: 2010-Sep-16 to 2010-Sep-17
Publisher: Springer Berlin Heidelberg
Series: Lecture Notes in Computer Science 6306
Standard No: DOI: 10.1007/978-3-642-15841-4; ISBN: 978-3-642-15840-7 (print), 978-3-642-15841-4 (online); hcibib: HAID10
Papers: 21
Pages: 206
Links: Online Proceedings
  1. Multimodal Integration
  2. Tactile and Sonic Explorations
  3. Walking and Navigation Interfaces
  4. Prototype Design and Evaluation
  5. Gestures and Emotions

Multimodal Integration

Cross-Modality Matching of Loudness and Perceived Intensity of Whole-Body Vibrations, pp. 1-9
  Sebastian Merchel; M. Ercan Altinsoy
In this study, two experiments were conducted to determine the point of subjective intensity equality (PSE) of pure tones and sinusoidal whole-body vibrations (WBV) at various frequencies (50 Hz, 100 Hz and 200 Hz). In these experiments, sounds and vertical vibrations were presented simultaneously to subjects using circumaural headphones and a flat hard seat. In total, 10 participants were presented with tones at fixed loudness levels (40 phon, 60 phon, 80 phon and 100 phon). The participants were asked to match the intensity of the vibration to the loudness of the tone using the method of adjustment. In the first experiment, the vibration and the tone had the same frequency. In the second experiment, the frequency of the vibration was kept at 50 Hz, while that of the tone was varied.
   The results revealed that a 20 phon increase in loudness level resulted in a 5-6 dB increase in matched acceleration level at loudness levels greater than 40 phon. This result was reproducible with small intra-individual variations; however, large inter-individual differences were observed.
Keywords: Cross-Modality Matching; Whole-Body Vibration; Audiotactile Perception; Intensity
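The reported growth rate admits a compact summary; the linear relation below is our own restatement of the numbers quoted in the abstract, not a formula from the paper:

\[ \Delta L_a \approx k\,\Delta L_N, \qquad k \approx \frac{5\text{--}6\ \mathrm{dB}}{20\ \mathrm{phon}} \approx 0.25\text{--}0.3\ \mathrm{dB/phon} \quad (L_N > 40\ \mathrm{phon}), \]

where \(L_N\) is the loudness level of the tone and \(L_a\) the matched acceleration level of the vibration.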
Leaping across Modalities: Speed Regulation Messages in Audio and Tactile Domains, pp. 10-19
  Kai Tuuri; Tuomas Eerola; Antti Pirhonen
This study examines three design bases for speed regulation messages by testing their ability to function across modalities. Two of the design bases utilise a method originally intended for sound design and the third uses a method meant for tactile feedback. According to the experimental results, all designs communicate the intended meanings similarly in audio and tactile domains. It was also found that melodic (frequency changes) and rhythmic (segmentation) features of stimuli function differently for each type of message.
Keywords: audio; tactile; crossmodal interactions; crossmodal design
The Effect of Spatial Disparity on the Integration of Auditory and Tactile Information, pp. 20-25
  M. Ercan Altinsoy
Spatial origin is an important cue for humans to determine whether auditory and tactile signals originate from the same event or object. This paper addresses spatial factors involved in the integration of auditory and tactile information. Perceptual threshold values for auditory-tactile spatial origin disparity were measured using tactile stimuli and sounds such as those generated by touching (scraping) abrasive paper. The results of the study show that the minimum angle subjects need to notice that the locations of the auditory and tactile events do not coincide is 5.3°. Simultaneously presented tactile stimulation enlarges the auditory localization blur in the horizontal plane. The results show that the perceived location of auditory stimuli is influenced by tactile stimulation.
Keywords: Audiotactile interaction; multimodal integration; localization blur; spatial origin
Parametric Study of Virtual Curvature Recognition: Discrimination Thresholds for Haptic and Visual Sensory Information, pp. 26-36
  W. Jong Yoon; Joel C. Perry; Blake Hannaford
The senses of vision and touch are vital modalities used in the discrimination of objects. In this research effort, a haptic device is used to determine thresholds of curvature discrimination in visual-haptic experiments. Discrimination thresholds are found for each sense independently as well as for combinations of the two, with and without the presence of conflicting information. Results indicate that, on average, the visual sense is about three times more sensitive than the haptic sense in discriminating curvature in virtual environments. It was also observed that subjects seem to rely more heavily on the sense that contains the most informative cues rather than on any one particular sense, in agreement with the sensory integration model proposed by Ernst and Banks. The authors believe that the resulting thresholds may serve as relative comparisons between the perceptual performance of the visual and haptic modalities in virtual environments.
Keywords: Curvature Recognition; Discrimination Thresholds; Haptic Perception; Sensory Discrepancy
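The sensory integration model of Ernst and Banks cited above is the standard maximum-likelihood cue-combination rule; it is restated here for orientation (a textbook formulation, not reproduced from the paper):

\[ \hat{S} = w_v\,\hat{S}_v + w_h\,\hat{S}_h, \qquad w_v = \frac{1/\sigma_v^2}{1/\sigma_v^2 + 1/\sigma_h^2}, \qquad w_h = 1 - w_v, \]

where \(\hat{S}_v\) and \(\hat{S}_h\) are the visual and haptic estimates and \(\sigma_v^2\), \(\sigma_h^2\) their variances. The lower-variance (more reliable) cue receives the larger weight, consistent with subjects relying on whichever sense carries the most informative cues.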
Cross-Modal Frequency Matching: Sound and Whole-Body Vibration, pp. 37-45
  M. Ercan Altinsoy; Sebastian Merchel
Interest in human responses to whole-body vibration has grown, particularly due to the increasing use of vehicles such as cars, trucks, and helicopters. Another reason for growing interest in recent years is the importance, for multimedia reproduction systems, of the vibrations generated by the performance of music. There is a strong relationship between the frequency of the auditory stimulus and the frequency of the tactile stimulus, which results directly from the physical processes that generate the stimuli. Recordings made in different vehicles and in different concert situations show that the whole-body vibration signal resembles a low-pass-filtered version of the audio signal. The spectral contents, particularly at low frequencies, match each other. This correlation plays an important role in the integration of auditory and tactile information and in the perception of an immersive multimodal event.
   In this study, psychophysical experiments were conducted to investigate whether subjects are able to match the frequencies of two different sensory modalities. In this experiment, sinusoidal sound and vibration signals were used. The auditory stimuli were presented to the subjects via headphones and the tactile stimuli through a vibration seat. The task of the subject was to match the frequency of the whole-body vibration to the frequency of the auditory stimulus. The results show that subjects are able to match the frequencies of the two modalities within certain tolerances.
Keywords: Whole-body vibration; frequency; cross-modal-matching; audiotactile perception
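As an illustration of the low-pass relationship described above (our own sketch, not code from the paper), a recorded audio track can be filtered to approximate the spectral envelope of a co-occurring whole-body vibration signal; the 100 Hz cutoff is an assumption chosen for the example:

    # Sketch: approximate a WBV-like signal as a low-passed audio signal.
    # Assumes a mono audio array sampled at fs Hz; 100 Hz cutoff is illustrative.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def wbv_like(audio, fs, cutoff_hz=100.0, order=4):
        # Zero-phase Butterworth low-pass; cutoff normalized to Nyquist.
        b, a = butter(order, cutoff_hz / (fs / 2.0), btype="low")
        return filtfilt(b, a, audio)

    fs = 44100
    t = np.arange(fs) / fs  # one second of signal
    audio = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 1000 * t)
    vibration = wbv_like(audio, fs)  # keeps the 50 Hz component, attenuates 1 kHz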

Tactile and Sonic Explorations

Audioworld: A Spatial Audio Tool for Acoustic and Cognitive Learning, pp. 46-54
  André Melzer; Martin Christof Kindsmüller; Michael Herczeg
The present paper introduces Audioworld, a novel game-like application for goal-oriented computer-supported learning (CSL). In Audioworld, participants localize sound-emitting objects by their spatial position. Audioworld serves as a flexible, low-cost test bed for a broad range of human cognitive functions. These include the systematic training of spatial navigation and localization skills, but also of verbal skills and phonetic knowledge known to be essential in grammar literacy, for example. The general applicability of Audioworld was confirmed in a pilot study: users rated the overall application concept as novel, entertaining, and rewarding.
Keywords: Audio-based localization; computer-supported learning; human cognitive functions; spatial navigation
Exploring Interactive Systems Using Peripheral Sounds, pp. 55-64
  Saskia Bakker; Elise van den Hoven; Berry Eggen
Our everyday interaction in and with the physical world has facilitated the development of auditory perception skills that enable us to selectively place one auditory channel in the center of our attention and simultaneously monitor others in the periphery. We search for ways to leverage these auditory perception skills in interactive systems. In this paper, we present three working demonstrators that use sound to subtly convey information to users in an open office. To qualitatively evaluate these demonstrators, each of them was deployed in an office for three weeks. We have seen that over such a period of time, sounds can start shifting from the center to the periphery of attention. Furthermore, we found several issues to be addressed when designing such systems, which can inform future work in this area.
Keywords: Calm Technology; Periphery; Attention; Sound design; Interaction design
Basic Exploration of Narration and Performativity for Sounding Interactive Commodities, pp. 65-74
  Stefano Delle Monache; Daniel Hug; Cumhur Erkut
We present an exploration in sonic interaction design, aimed at integrating the power of narrative sound design with the sonic aesthetics of physics-based sound synthesis. The emerging process is based on interpretation, and can represent a novel tool in the education of the next generation of interaction designers. In addition, an audio-tactile paradigm that exploits the potential of the physics-based approach is introduced.
Keywords: Sonic Interaction Design; Aesthetics; Physics-based Synthesis; Methodology; Narrative Sound Design
Tactile Web Browsing for Blind Users, pp. 75-84
  Ravi Kuber; Wai Yu; M. Sile O'Modhrain
Recent developments in tactile technologies have made them an attractive choice to improve access to non-visual interfaces. This paper describes the design and evaluation of an extension to an existing browser, which enables blind individuals to explore web pages using tactile feedback. Pins are presented via a tactile mouse to communicate the presence of graphical interface objects. Findings from an evaluation revealed that fifteen participants were able to learn the tactile HTML mappings developed, and were able to perform a range of web-based tasks in a less constrained manner than when using a screen reader alone. The mappings presented in this paper can be used by web developers with limited experience of tactile design to widen access to their sites.
Keywords: Blind; human factors; tactile; web browsing
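The paper's actual mappings are not reproduced here; purely as a hypothetical illustration of mapping HTML interface objects to tactile pin patterns, a browser extension might maintain a lookup table like the following (all element names and pattern values are invented for this sketch):

    # Hypothetical sketch only: map HTML element types to pin patterns for a
    # tactile mouse. Patterns are invented 2x4 bitmaps, not the paper's mappings.
    TACTILE_MAPPINGS = {
        "a":     [[1, 0, 1, 0], [0, 1, 0, 1]],  # alternating pins for links
        "h1":    [[1, 1, 1, 1], [0, 0, 0, 0]],  # solid top row for headings
        "img":   [[1, 1, 1, 1], [1, 1, 1, 1]],  # all pins raised for images
        "input": [[0, 1, 1, 0], [0, 1, 1, 0]],  # centered block for form fields
    }

    def pattern_for(tag: str):
        # Fall back to a flat (no-pin) pattern for unmapped elements.
        return TACTILE_MAPPINGS.get(tag.lower(), [[0, 0, 0, 0], [0, 0, 0, 0]])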
Reducing Reversal Errors in Localizing the Source of Sound in Virtual Environment without Head Tracking, pp. 85-96
  Vladimir Ortega-González; Samir Garbaya; Frédéric Merienne
This paper presents a study of the effect of using additional audio cueing together with the Head-Related Transfer Function (HRTF) on human performance in a sound source localization task performed without head movement. The existing techniques of sound spatialization generate reversal errors. We intend to reduce these errors by introducing sensory cues based on sound effects. We conducted an experimental study to evaluate the impact of the additional cues in a sound source localization task. The results showed the benefit of combining the additional cues and the HRTF in terms of localization accuracy and the reduction of reversal errors. This technique allows a significant reduction of reversal errors compared to using the HRTF alone. For instance, it could be used to improve audio spatial alerting, spatial tracking and target detection in simulation applications when head movement is not tracked.

Walking and Navigation Interfaces

Conflicting Audio-haptic Feedback in Physically Based Simulation of Walking Sounds, pp. 97-106
  Luca Turchet; Stefania Serafin; Smilen Dimitrov; Rolf Nordahl
We describe an audio-haptic experiment conducted using a system which simulates in real time the auditory and haptic sensations of walking on different surfaces. The system is based on physical models that drive both the haptic and audio synthesizers, and on a pair of shoes enhanced with sensors and actuators. The experiment was run to examine the ability of subjects to recognize the different surfaces under both coherent and incoherent audio-haptic stimuli. Results show that in this kind of task the auditory modality is dominant over the haptic one.
The Influence of Angle Size in Navigation Applications Using Pointing Gestures, pp. 107-116
  Charlotte Magnusson; Kirsten Rassmus-Gröhn; Delphine Szymczak
One factor which can be expected to influence performance in applications where the user points a device in some direction to obtain information is the angle interval in which the user gets feedback. The present study was performed in order to get a better understanding of the influence of this angle interval on navigation performance, gestures and strategies in a more realistic outdoor setting. Results indicate that users are able to handle quite a wide range of angle intervals, although there are differences between narrow and wide intervals. We observe different gestures and strategies used by the users and provide some recommendations on suitable angle intervals. Finally, our observations support the notion that using this type of pointing gesture for navigation is intuitive and easy to use.
Keywords: Non-visual; pointing; gesture; audio; mobile; location based
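As a sketch of the feedback mechanism studied above (our own illustration; the paper's implementation details are not reproduced), feedback is typically given when the device's compass bearing falls within the chosen angle interval around the bearing to the target:

    # Sketch: give feedback when the device points within +/- half the angle
    # interval of the target bearing. Bearings in degrees; all names are ours.
    def bearing_error(device_bearing: float, target_bearing: float) -> float:
        # Signed smallest difference between two compass bearings, in (-180, 180].
        return (target_bearing - device_bearing + 180.0) % 360.0 - 180.0

    def in_feedback_zone(device_bearing, target_bearing, angle_interval_deg):
        return abs(bearing_error(device_bearing, target_bearing)) <= angle_interval_deg / 2.0

    # Example: a 60 degree interval triggers within 30 degrees of the target,
    # handling the 360/0 wrap-around; a 30 degree interval does not trigger here.
    assert in_feedback_zone(350.0, 10.0, 60.0)      # 20 degrees off target
    assert not in_feedback_zone(350.0, 10.0, 30.0)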
Audio-tactile Display of Ground Properties Using Interactive Shoes, pp. 117-128
  Stefano Papetti; Federico Fontana; Marco Civolani; Amir Berrezag; Vincent Hayward
We describe an audio-tactile stimulation system that can be worn and that is capable of providing the sensation of walking over grounds of different types. The system includes miniature loudspeakers and broadband vibrotactile transducers embedded in the soles. The system is particularly effective at suggesting grounds that have granular or crumpling properties. By offering a broad spectrum of floor augmentations with moderate technological requirements, the proposed prototype represents a solution that can be easily replicated in the research laboratory. This paper documents in detail the design and features of the diverse components that characterize the prototype, as well as its current limits.
Keywords: Interactive shoes; foot-based interfaces
Efficient Acquisition of Force Data in Interactive Shoe Designs, pp. 129-138
  Marco Civolani; Federico Fontana; Stefano Papetti
A four-channel sensing system is proposed for the capture of force data from the feet during walking tasks. Developed for an instrumented-shoe prototype, the system addresses general issues of response latency, data accuracy, and robustness of the transmission of digital signals to the host computer. Such issues are often left partially unaddressed by solutions that give primary consideration to compactness, accessibility and cost. By adopting widely used force-sensing (Interlink) and analog-to-digital conversion and pre-processing (Arduino) components, the proposed system is expected to raise interest among designers of interactive interfaces in which reliable and sufficiently broadband acquisition of force signals is desired.
Keywords: Force sensing; closed-loop interfaces
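As a hedged sketch of the host side of such a pipeline (our own illustration; the paper's firmware and protocol are not reproduced), an Arduino streaming four comma-separated ADC readings per line over USB serial could be read with pyserial as follows; the port name, baud rate and line format are assumptions:

    # Sketch: read four comma-separated force values per line from an Arduino.
    # Port, baud rate and the line protocol are assumptions for illustration.
    import serial  # pyserial

    def read_forces(port="/dev/ttyUSB0", baud=115200):
        with serial.Serial(port, baud, timeout=1.0) as ser:
            while True:
                line = ser.readline().decode("ascii", errors="ignore").strip()
                if not line:
                    continue
                try:
                    # e.g. "512,498,0,1023" -> one 10-bit ADC value per sensor
                    channels = [int(v) for v in line.split(",")]
                except ValueError:
                    continue  # skip malformed lines
                if len(channels) == 4:
                    yield channels

    # for sample in read_forces():
    #     print(sample)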
A Comparison of Two Wearable Tactile Interfaces with a Complementary Display in Two Orientations, pp. 139-148
  Mayuree Srikulwong; Eamonn O'Neill
Research has shown that two popular forms of wearable tactile displays, a back array and a waist belt, can aid pedestrian navigation by indicating direction. Each type has its proponents and each has been reported as successful in experimental trials; however, no direct experimental comparisons of the two approaches have been reported. We have therefore conducted a series of experiments directly comparing them on a range of measures. In this paper, we present results from a study in which we used a directional line drawing task to compare user performance with these two popular forms of wearable tactile display. We also investigated whether user performance was affected by a match between the plane of the tactile interface and the plane in which the users drew the perceived directions. Finally, we investigated the effect of adding a complementary visual display. The touch screen display on which participants drew the perceived directions presented either a blank display or a visual display of a map indicating eight directions from a central roundabout, corresponding to the eight directions indicated by the tactile stimuli. We found that participants performed significantly faster and more accurately with the belt than with the array, whether the screen was vertical or horizontal. We found no difference in performance with the map display compared to the blank display.
Keywords: Evaluation/methodology; haptic i/o; user interfaces; wearable computers; pedestrian navigation

Prototype Design and Evaluation

Virtual Sequencing with a Tactile Feedback Device, pp. 149-159
  Victor Zappi; Marco Gaudina; Andrea Brogni; Darwin Caldwell
Since the beginning of Virtual Reality, many artistic applications have been developed, showing how this technology can be exploited not only from a technical point of view, but also in the field of feelings and emotions. Nowadays music is one of the most interesting fields of application for Virtual Reality, and many environments provide the user with means of self-expression; our work follows this direction, aiming at developing a set of multimodal musical interfaces. In this paper we present a first simple virtual sequencer combined with a low-cost tactile feedback device: some preliminary experiments were conducted to analyze how skilled musicians approach this unusual way of making music.
Keywords: Virtual Instrument; OSC-MIDI Controller; Tactile Feedback
The LapSlapper -- Feel the Beat, pp. 160-168
  Mads Stenhoj Andresen; Morten Bach; Kristian Ross Kristensen
The LapSlapper is an inexpensive and low-technology percussive instrument with a digital interface. In a tactile and embodied manner it allows enhanced control and promotes expressive creativity when operating with percussive elements in digital environments. By using piezo microphones mounted on a pair of gloves and connected by a stereo signal to a runtime version of a Max/MSP patch, intuitive haptic properties are achieved with simple means. The LapSlapper improves the physical feeling of playing digital rhythm instruments, and the concept furthermore holds the potential to promote exploration and innovation of new, digitally founded rhythmical structures and aesthetics.
Keywords: Music; instrument; MIDI; trigger; percussion; drums; glove; embodied; intuitive; mobility; tactile expression; haptic interface
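The paper's Max/MSP patch is not reproduced here; as a loose sketch of the underlying idea (detecting percussive hits in a piezo signal and emitting triggers), a naive envelope-threshold onset detector could look like the following, with the threshold and refractory period chosen arbitrarily:

    # Sketch: naive onset detection on a piezo signal; the threshold and the
    # 50 ms refractory period are arbitrary illustrative choices.
    import numpy as np

    def detect_hits(signal, fs, threshold=0.2, refractory_s=0.05):
        env = np.abs(signal)                 # crude amplitude envelope
        refractory = int(refractory_s * fs)
        hits, last = [], -refractory
        for i, v in enumerate(env):
            if v > threshold and i - last >= refractory:
                hits.append(i / fs)          # hit time in seconds
                last = i
        return hits

    # Each detected hit would then be turned into a MIDI note-on / trigger.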
Product Design Review Application Based on a Vision-Sound-Haptic Interface, pp. 169-178
  Francesco Ferrise; Monica Bordegoni; Joseba Lizaranzu
Most of the activities concerning the design review of new products based on Virtual Reality are conducted from a visual point of view, thus limiting the realism of the reviewing activities. Adding the senses of touch and hearing to traditional virtual prototypes may help in making the interaction with the prototype more natural, realistic and similar to the interaction with real prototypes. Consequently, this would also contribute to making design review phases more effective, accurate and reliable. In this paper we describe an application for product design review where haptic, sound and vision channels have been used to simulate the interaction with a household appliance.
Keywords: Multimodal Interaction; Interaction Design; Virtual Prototyping; Product Design Review
The Phantom versus the Falcon: Force Feedback Magnitude Effects on User's Performance during Target Acquisition, pp. 179-188
  Lode Vanacken; Joan De Boeck; Karin Coninx
Applying force feedback in a therapy environment allows the patient to practice in a more independent manner, with less intervention by the therapist. Currently, however, high-end devices such as the Phantom or the HapticMaster are far too expensive to provide a device per patient. Recently Novint launched a low-cost haptic device for the gaming market: the Falcon. In this paper we report on an experiment that we conducted in order to compare the Falcon and the Phantom, based on a Fitts' law targeting task. We deduced physical parameters such as inertia and damping, which were found to differ between the devices. Although these differences can be clearly seen in a velocity analysis, it turns out that the influence of the different forces does not show significant differences when completion time and error rate are taken into account. From a subjective experiment, we learn that users allow the Falcon to produce slightly higher forces than the Phantom before forces are judged as too strong.
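For reference, the Fitts' law model underlying such a targeting task relates movement time to target distance and width (a standard formulation, not taken from the paper):

\[ MT = a + b \cdot ID, \qquad ID = \log_2\!\left(\frac{D}{W} + 1\right), \]

where \(MT\) is the movement time, \(D\) the distance to the target, \(W\) the target width, \(ID\) the index of difficulty in bits (Shannon formulation shown), and \(a\), \(b\) empirically fitted constants.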

Gestures and Emotions

Building a Framework for Communication of Emotional State through Interaction with Haptic Devices, pp. 189-196
  Eric W. Cooper; Victor V. Kryssanov; Hitoshi Ogawa
Brief, high-speed semantic communication, such as through texting and e-mail, leaves users without the ability to fully comprehend emotional content and vulnerable to emotional misunderstanding. The need to communicate emotional states, or to elicit a sympathetic response in the receiver, is evident in emotive icons and other relatively new applications of existing modes of communication. Haptic interfaces offer users a non-verbal way to communicate remotely, opening the door to a richer vocabulary and greater accessibility in emotive and affective communication. The studies described here investigate a possible framework for communication through haptic interface devices using existing models of emotional state. The semantic studies offer a look at users' naïve understanding of the emotive content of haptic sensations. Further experiments with haptic devices show that while communication through these modes can be implemented, the range of possible responses depends as much on the type of interaction used as on the users' understanding of emotive content.
Keywords: haptic communication; emotional communication; affective engineering
A Trajectory-Based Approach for Device Independent Gesture Recognition in Multimodal User Interfaces, pp. 197-206
  Mathias Wilhelm; Dirk Roscher; Marco Blumendorf; Sahin Albayrak
With the rise of technology in all areas of life, new interaction techniques are required. Since gestures and voice are among the most natural ways to interact, supporting them in human-computer interaction is an important goal. In this paper, we introduce our approach to multimodal interaction in smart home environments and illustrate how device-independent gesture recognition can be of great support in this area. We describe a trajectory-based approach that supports device-independent dynamic hand gesture recognition from vision systems, accelerometers or pen devices. The recorded data from the different devices is transformed to a common basis (2D space), and feature extraction and recognition are performed on this basis. In a comprehensive case study we show the feasibility of the recognition and of the integration with a multimodal and adaptive home operating system.
Keywords: gesture recognition; device independence; trajectory matching; generalized Procrustes analysis; multimodality
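The keywords mention generalized Procrustes analysis; as a hedged sketch of the trajectory-matching idea (our own illustration, not the authors' implementation), two 2D gesture trajectories can be resampled to a common length and compared with an ordinary Procrustes disparity:

    # Sketch: compare two 2D gesture trajectories via Procrustes disparity.
    # Resampling to a fixed number of points is our simplifying assumption.
    import numpy as np
    from scipy.spatial import procrustes

    def resample(traj, n=64):
        """Resample an (m, 2) trajectory to n points by arc-length position."""
        traj = np.asarray(traj, dtype=float)
        d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(traj, axis=0), axis=1))]
        s = np.linspace(0.0, d[-1], n)
        return np.column_stack([np.interp(s, d, traj[:, i]) for i in (0, 1)])

    def gesture_distance(a, b, n=64):
        # procrustes() removes translation, scale and rotation before comparing.
        _, _, disparity = procrustes(resample(a, n), resample(b, n))
        return disparity

    circle = [(np.cos(t), np.sin(t)) for t in np.linspace(0, 2 * np.pi, 50)]
    line = [(t, 0.0) for t in np.linspace(0, 1, 30)]
    assert gesture_distance(circle, circle) < gesture_distance(circle, line)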