
HAID 2006: International Workshop on Haptic and Audio Interaction Design

Fullname: HAID 2006: Haptic and Audio Interaction Design: First International Workshop
Editors: David McGookin; Stephen Brewster
Location: Glasgow, Scotland
Dates: 2006-Aug-31 to 2006-Sep-01
Publisher: Springer Berlin Heidelberg
Series: Lecture Notes in Computer Science 4129
Standard No: DOI: 10.1007/11821731; ISBN: 978-3-540-37595-1 (print), 978-3-540-37596-8 (online); hcibib: HAID06
Links: Online Proceedings
  1. Interaction
  2. Psychophysics
  3. Music and Gesture
  4. Visual Impairments I
  5. Visual Impairments II
  6. Design I
  7. Design II

Interaction


Perception of Audio-Generated and Custom Motion Programs in Multimedia Display of Action-Oriented DVD Films (pp. 1-11)
  Kent Walker; William L. Martens
This paper addresses a practical problem associated with multimedia display systems that utilize motion platforms or chairs. Given audio-visual content for which motion data is not available, motion may be generated automatically from multichannel audio, usually from a Low-Frequency Effects (LFE) channel such as that distributed on Digital Versatile Discs (DVDs). Alternatively, custom motion programs may be created to accompany multimedia content. This paper presents the results of a study designed to test the sense of realism, sense of presence, and global preference for multimedia playback in these two distinct cases of platform accompaniment: motion generated automatically from audio, and motion designed expressly to stimulate appropriate haptic and vestibular sensations.
Evaluating the Influence of Multimodal Feedback on Egocentric Selection Metaphors in Virtual Environments (pp. 12-23)
  Lode Vanacken; Chris Raymaekers; Karin Coninx
Whether a user interface is intuitive depends, among other factors, on (multimodal) feedback. The addition of multimodal feedback can certainly improve interaction in Virtual Environments, as it increases the bandwidth to the user. One of the most common tasks in Virtual Environments is object selection. This paper elaborates on the enhancement of some existing selection approaches with multimodal feedback. The proposed techniques have been evaluated in a user experiment; the results show that users prefer the addition of multimodal feedback and that, depending on the selection metaphor, it can also speed up the interaction.

Psychophysics


Haptic-Auditory Rendering and Perception of Contact Stiffness (pp. 24-35)
  Federico Avanzini; Paolo Crosato
This paper presents an experiment on the relative contributions of haptic and auditory information to bimodal judgments of contact stiffness using a rigid probe. Haptic feedback is rendered via a Phantom® Omni™ device, while auditory stimuli are obtained using a physically-based audio model of impact, in which the colliding objects are described as modal resonators that interact through a non-linear impact force. The impact force can be controlled through a stiffness parameter that influences the contact time of the impact. Previous studies have already indicated that this parameter has a major influence on the auditory perception of hardness/stiffness. In the experiment, subjects had to tap on virtual surfaces and were presented with audio-haptic feedback. In each condition the haptic stiffness had the same value while the acoustic stiffness was varied. Perceived stiffness was determined using an absolute magnitude-estimation procedure: subjects were asked to rate the surfaces on an ordered scale of verbal labels, based on their perceived stiffness. The results indicate that subjects consistently ranked the surfaces according to the auditory stimuli.
Designing Haptic Feedback for Touch Display: Experimental Study of Perceived Intensity and Integration of Haptic and Audio (pp. 36-44)
  Ville Tikka; Pauli Laitinen
We studied the subjectively perceived intensity of haptic feedback and the effects of integrating audio and haptic feedback. The purpose of the study was to specify design principles for haptic feedback on a mobile touch display device enhanced with a piezo actuator. The results of the study showed that the physical parameter corresponding best to perceived feedback intensity was the acceleration of the haptic stimulus pulse. It was also noticed that the audio stimuli biased the perception of haptic stimulus intensity. These results clarify the principles behind haptic feedback design and imply that multisensory integration should be stressed when designing haptic interaction.

Music and Gesture

Rhythmic Interaction for Song Filtering on a Mobile Device (pp. 45-55)
  Andrew Crossan; Roderick Murray-Smith
This paper describes a mobile implementation of song filtering using rhythmic interaction. A user taps the screen or shakes the device (sensed through an accelerometer) at the tempo of a particular song in order to listen to it. We use the variability in beat frequency to display ambiguity, allowing users to adjust their actions based on the feedback given. The results of a pilot study of a simple object selection task showed that although the tapping interface provided a larger range of comfortable tempos, participants could use both tapping and shaking to select a given song. Finally, the effects of variability in a rhythmic interaction style of interface are discussed.
Lemma 4: Haptic Input + Auditory Display = Musical Instrument? (pp. 56-67)
  Paul Vickers
In this paper we look at some of the design issues that affect the success of multimodal displays that combine acoustic and haptic modalities. First, issues affecting successful sonification design are explored and suggestions are made about how the language of electroacoustic music can assist. Next, haptic interaction is introduced in the light of this discussion, particularly focusing on the roles of gesture and mimesis. Finally, some observations are made regarding some of the issues that arise when the haptic and acoustic modalities are combined in the interface. This paper looks at examples of where auditory and haptic interaction have been successfully combined beyond the strict confines of the human-computer application interface (musical instruments in particular) and discusses lessons that may be drawn from these domains and applied to the world of multimodal human-computer interaction. The argument is made that combined haptic-auditory interaction schemes can be thought of as musical instruments and some of the possible ramifications of this are raised.

Visual Impairments I

Navigation and Control in Haptic Applications Shared by Blind and Sighted Users (pp. 68-80)
  Eva-Lotta Sallnäs; Kajsa Bjerstedt-Blom; Fredrik Winberg; Kerstin Severinson Eklundh
Haptic feedback in shared virtual environments can potentially make it easier for a visually impaired person to take part in and contribute to the process of group work. This paper presents a task-driven explorative evaluation of collaboration between visually impaired and sighted persons in three applications that provide haptic and visual feedback. The results show that all pairs could perform all the tasks in these applications, even though a number of difficulties were identified. The conclusions drawn can inform the design of applications for cooperation between visually impaired and sighted users.
User Evaluations of a Virtual Haptic-Audio Line Drawing Prototype (pp. 81-91)
  Kirsten Rassmus-Gröhn; Charlotte Magnusson; Håkan Eftring
A virtual haptic-audio drawing program prototype designed for visually impaired children has been gradually developed in a design-evaluation loop involving users in four stages. Three qualitative evaluations focused on recognizing drawn shapes and creating drawings were conducted together with a reference group of 5 visually impaired children. Additionally, one formal pilot test involving 11 sighted adult users investigated the use of a combination of haptic and sound field feedback. In the latter test the relief type (positive or negative) was also varied. Results indicate a subjective preference, as well as a shorter examination time, for negative relief over positive relief in the interpretation of simple shapes such as 2D geometrical figures. The presence of the position sound field, with a pitch and stereo panning analogy, was not shown to affect task completion times.

Visual Impairments II

Creating Accessible Bitmapped Graphs for the Internet (pp. 92-101)
  Graham McAllister; Jacopo Staiano; Wai Yu
Bitmapped graphs are the most common form of graph found on web pages. However, users who are blind or visually impaired currently find it difficult or impossible to access the data contained within such graphs, typically relying only on the ALT text description. This paper details an approach to creating bitmapped graphs that visually impaired users can access on the Internet. The process employs a combination of manual intervention by a web developer and novel automatic algorithms specific to graph-based images. The approach identifies the important regions of the graph and tags them with meta-data. The meta-data and bitmap graph are then exported to a web page for sonification and exploration by the visually impaired user.
Supporting Cross-Modal Collaboration: Adding a Social Dimension to Accessibility (pp. 102-110)
  Fredrik Winberg
This paper presents a study of cross-modal collaboration, in which blind and sighted persons collaboratively solve two different tasks using a prototype that has one auditory and one graphical interface. The results show the importance of context and of task design for the accessibility of cross-modal collaborative settings, as well as the importance of supporting participation in a working division of labour.
Non Visual Haptic Audio Tools for Virtual Environments (pp. 111-120)
  Charlotte Magnusson; Henrik Danielsson; Kirsten Rassmus-Gröhn
This paper reports the results of a test in which twelve users tried different haptic and audio navigational tools for non-visual virtual environments. Analysis of the test results confirms the usefulness of a constant attractive force, as well as of haptic fixtures, in helping users locate objects in a virtual environment. The 3D audio turned out to be less useful due to the design of the environment. However, user comments indicate that this type of sound feedback helps spatial understanding. Contrary to expectations, no significant tool effects were seen on spatial memory.

Design I

A Semiotic Approach to the Design of Non-speech Sounds (pp. 121-132)
  Emma Murphy; Antti Pirhonen; Graham McAllister; Wai Yu
In the field of auditory display there is currently a lack of theoretical support for the design of non-speech sounds as elements of a user interface. Sound design methods are often based on ad hoc choices or the personal preferences of the designer. A method based on a semiotic approach to the design of non-speech sounds is proposed in this paper. In this approach, the design process is conceptualised by reference to structural semiotics, acknowledging the unique qualities of non-speech sounds as a mode of conveying information. The method is based on a rich use scenario presented to a design panel. A case study in which the design method has been applied is presented and evaluated. Finally, recommendations for a practical design method, supported by this empirical investigation, are presented.
Listen to This -- Using Ethnography to Inform the Design of Auditory Interfaces (pp. 133-144)
  Graeme W. Coleman; Catriona Macaulay; Alan F. Newell
Within the wider Human-Computer Interaction community, many researchers have turned to ethnography to inform systems design. However, such approaches have yet to be fully utilized within auditory interface research, a field hitherto driven by technology-inspired design work and the addressing of specific cognitive issues. It is proposed that the time has come to investigate the role ethnographic methods can play within auditory interface design. We begin by discussing "traditional" ethnographic methods, presenting our experiences conducting a field study with a major UK-based computer games developer and highlighting issues pertinent to the design of auditory interfaces. We then suggest ways in which such techniques could be expanded to consider the role sound plays in people's lived experiences, and thus merit further research.
An Activity Classification for Vibrotactile Phenomena (pp. 145-156)
  Conor O'Sullivan; Angela Chang
We observe that the recent availability of audio-haptic actuators allows richer vibration content to be delivered in commercial devices. However, we note that consumers are unable to take advantage of these rich experiences, mainly due to the lack of a descriptive language for vibration. We analyze the current methods for classifying vibrations and propose a new framework for describing vibrotactile haptic phenomena, based on organizing the media by content activity. We describe this naming system, based on Russolo's families of noise, and address other issues pertinent to introducing vibration content into commercial devices.

Design II

Haptic-Audio Narrative: From Physical Simulation to Imaginative Stimulation (pp. 157-165)
  Stephen Barrass
This paper describes the design and development of an interactive narrative for the 'Experimenta Vanishing Point' media arts exhibition in 2005. The Cocktail Party Effect tells the story of the imminent extinction of Great Apes in the wild using touch and sound in the absence of visual elements. The narrative is driven by haptic-audio exploration of a virtual cocktail glass which functions as a heterodiegetic narrator, and the traversal of cut-up conversations that make up the story within. The interface was developed through a series of prototypes that explored the perception and mental imagery of a haptic-audio simulation of the invisible glass. These experiments also developed narrative functions of the haptic-audio interface beyond conventional iconic metonyms to include grammatical and dramatic special effects. Observations during the exhibition show promising narrative engagement with the piece but identify problems with the clarity of the sounds, and a conflict between the narrator and the story content.