
HAID 2012: International Conference on Haptic and Audio Interaction Design

Fullname: HAID 2012: Haptic and Audio Interaction Design: 7th International Conference
Editors: Charlotte Magnusson; Delphine Szymczak; Stephen Brewster
Location: Lund, Sweden
Dates: 2012-Aug-23 to 2012-Aug-24
Publisher: Springer Berlin Heidelberg
Series: Lecture Notes in Computer Science 7468
Standard No: DOI: 10.1007/978-3-642-32796-4; ISBN: 978-3-642-32795-7 (print), 978-3-642-32796-4 (online); hcibib: HAID12
Papers: 15
Pages: 151
Links: Online Proceedings
  1. Haptics and Audio in Navigation
  2. Supporting Experiences and Activities
  3. Object and Interface Interaction
  4. Test and Evaluation

Haptics and Audio in Navigation

Understanding Auditory Navigation to Physical Landmarks BIBAKFull-Text 1-10
  David McGookin; Stephen A. Brewster
We present two studies that seek to better understand the role spatialised (3D) audio can play in supporting effective pedestrian navigation. Twenty-four participants attempted to navigate to and locate physical landmarks in a local botanical garden using a gpsTunes [1] based auditory navigation system coupled with a map. Participants were significantly better at locating prominent than non-prominent physical landmarks. However, no significant quantitative difference was found between the use of a map only and map + audio. Qualitative analysis revealed significant issues when physical landmarks are used, as well as common strategies when combining audio and map navigation. We highlight the implications of these findings in relation to existing work, and provide guidelines for future designers to employ.
Keywords: 3D Audio; Pedestrian Navigation; Maps; Landmarks
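A guidance scheme of the gpsTunes kind maps the bearing from the user's position and heading towards a landmark onto the spatial placement and loudness of a sound. The sketch below is a minimal illustration of such a mapping, not the authors' implementation; the 90-degree pan range and 15-degree "on-target" cone are assumed values.
    import math

    def bearing_deg(lat1, lon1, lat2, lon2):
        """Initial bearing from point 1 to point 2, in degrees (0 = north)."""
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dlon = math.radians(lon2 - lon1)
        y = math.sin(dlon) * math.cos(phi2)
        x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
        return math.degrees(math.atan2(y, x)) % 360.0

    def audio_cue(user_lat, user_lon, heading, lm_lat, lm_lon):
        """Map the angle between heading and landmark bearing to pan [-1, 1] and gain [0, 1]."""
        rel = (bearing_deg(user_lat, user_lon, lm_lat, lm_lon) - heading + 540.0) % 360.0 - 180.0
        pan = max(-1.0, min(1.0, rel / 90.0))   # left/right placement of the landmark sound
        gain = 1.0 if abs(rel) < 15.0 else 0.4  # louder when the user is roughly facing it
        return pan, gain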
Supporting Sounds: Design and Evaluation of an Audio-Haptic Interface BIBAKFull-Text 11-20
  Emma Murphy; Camille Moussette; Charles Verron; Catherine Guastavino
The design and evaluation of a multimodal interface are presented in order to investigate how spatial audio and haptic feedback can be used to convey the navigational structure of a virtual environment. The non-visual 3D virtual environment is composed of a number of parallel planes with either horizontal or vertical orientations. The interface was evaluated using a target-finding task to explore how auditory feedback can be used in isolation or combined with haptic feedback for navigation. Twenty-three users were asked to locate targets using auditory feedback in the virtual structure across both horizontal and vertical orientations of the planes, with and without haptic feedback. Findings from the evaluation experiment reveal that users performed the task faster in the bi-modal conditions (with combined auditory and haptic feedback) with a horizontal orientation of the virtual planes.
Keywords: Auditory feedback; Haptic feedback; Target-finding; User Evaluation
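Target-finding feedback of this kind typically scales auditory parameters with the distance between the cursor and the target, optionally adding a haptic attraction once the cursor is close. The following sketch only illustrates that idea; the gain/pitch mappings, radius and stiffness values are assumptions, not the parameters used in the paper.
    def audio_feedback(cursor, target, max_dist=0.5):
        """Closer targets sound louder and higher: distance mapped to gain and pitch."""
        d = sum((c - t) ** 2 for c, t in zip(cursor, target)) ** 0.5
        proximity = max(0.0, 1.0 - d / max_dist)
        gain = proximity
        pitch_hz = 220.0 + 660.0 * proximity
        return gain, pitch_hz

    def haptic_force(cursor, target, radius=0.05, stiffness=80.0):
        """Spring-like attraction towards the target once the cursor is within `radius`."""
        diff = [t - c for c, t in zip(cursor, target)]
        d = sum(x * x for x in diff) ** 0.5
        if d == 0.0 or d > radius:
            return [0.0] * len(diff)
        return [stiffness * x for x in diff]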
A Haptic-Audio Interface for Acquiring Spatial Knowledge about Apartments BIBAKFull-Text 21-30
  Junlei Yu; Christopher Habel
In selecting an apartment for residence, floor plans are a common source of relevant information. For visually impaired people, however, adequate floor plans are largely unavailable. This paper introduces a haptic-audio assistance system designed and implemented to help visually impaired people acquire the layout of unfamiliar small-scale apartments. Virtual 2.5-D floor plan models are built from traditional visual floor plans. Haptic force feedback is rendered as users explore the virtual model with a PHANToM Omni device. During exploration, auditory assistance information about the floor plan, provided either as speech or as sonification, is invoked when the user enters prescribed areas placed along the inner contour of rooms. Two user studies are presented which demonstrate the usability of the haptic-audio interface. In particular, they confirm the reinforcing, positive influence of employing multiple modes within the auditory channel.
Keywords: Spatial Knowledge Acquisition; Virtual Haptics; Haptic-Audio Floor plan; Sonification
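The auditory assistance described above is triggered when the haptic proxy (the PHANToM cursor) enters a prescribed area on the floor plan. A minimal sketch of such zone-triggered playback follows; the zone representation and the print stand-in for speech/sonification output are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class AudioZone:
        name: str
        x: float          # zone centre on the 2.5-D floor plan
        y: float
        radius: float     # trigger radius
        message: str      # speech or sonification associated with the zone

    def check_zones(proxy_x, proxy_y, zones, active):
        """Fire a zone's message once when the proxy enters it; re-arm when it leaves."""
        for z in zones:
            inside = (proxy_x - z.x) ** 2 + (proxy_y - z.y) ** 2 <= z.radius ** 2
            if inside and z.name not in active:
                active.add(z.name)
                print(f"[audio] {z.message}")   # stand-in for speech/sonification playback
            elif not inside:
                active.discard(z.name)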

Supporting Experiences and Activities

Mobile Haptic Technology Development through Artistic Exploration BIBAKFull-Text 31-40
  David Cuartielles; Andreas Göransson; Tony Olsson; Ståle Stenslie
This paper investigates how artistic explorations can be useful for the development of mobile haptic technology. It presents an alternative design framework for wearable haptics that contributes to the building of haptic communities outside specialized research contexts. The paper also presents our various wearable haptic systems for mobile computing, which are capable of producing high-order tactile percepts. Our practice-based approach suggests a design framework that can be applied to create advanced haptic stimulations and situations for physically embodied interaction in real-world settings.
Keywords: Applied haptics; wearables; bodysuit; haptic and embodied interaction; haptic resolution; Arduino; Android; mobile haptic systems; online haptics editor
Improving Cyclists Training with Tactile Feedback on Feet BIBAKFull-Text 41-50
  Dominik Bial; Thorsten Appelmann; Enrico Rukzio; Albrecht Schmidt
This paper explores how tactile feedback can support cyclists in fulfilling user-defined training programs. To this end, actuators are integrated into cyclists' shoes. The rhythm at which the cyclist should pedal is communicated via tactile feedback, so that the heart rate is kept within an interval that is, for example, optimal for increasing stamina. After a preliminary study used to determine the optimal position for the actuators on the feet, a working prototype of such a system was developed. This prototype was then tested in a preliminary study with two participants in the wild. They were able to understand the communicated tactile feedback, enjoyed using our system, and stated that they could imagine using such a system regularly. This indicates that the user's feet are another application domain where vibration signals can be of high benefit, communicating information to the user when audio or visual information is not appropriate.
Keywords: human computer interaction; tactile feedback; actuators; traffic; mobile phone; cyclists; prototype
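The training support described above paces the pedaling rhythm through vibration pulses on the feet so that the heart rate stays inside a user-defined interval. The control rule and step size in the sketch below are illustrative assumptions, not the authors' logic.
    import time

    def cued_cadence(heart_rate, hr_low, hr_high, current_rpm, step=2.0):
        """Nudge the cued pedaling cadence to pull the heart rate back into the target interval."""
        if heart_rate < hr_low:
            return current_rpm + step                 # work harder: cue a faster rhythm
        if heart_rate > hr_high:
            return max(40.0, current_rpm - step)      # ease off
        return current_rpm

    def vibrate_metronome(rpm, pulses, trigger):
        """Emit `pulses` vibration ticks at the cued cadence; `trigger` fires the shoe actuator."""
        period = 60.0 / rpm
        for _ in range(pulses):
            trigger()
            time.sleep(period)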
HapticPulse -- Reveal Your Heart Rate in Physical Activities BIBAFull-Text 51-60
  Janko Timmermann; Benjamin Poppinga; Susanne Boll; Wilko Heuten
The heart rate is an objective parameter indicating the current level of physical activity. Displaying it to the user will help her or him gain awareness of the physical load during certain activities. Current systems do not use the sense of touch to display the actual heart rate, although using the sense of touch has been shown to be potentially less distracting than using other senses in certain situations. In this paper we describe a system which displays the user's heart rate through the sense of touch. We conducted a user study in the field with ten participants to collect qualitative and quantitative data, which serves as a guideline for the future improvement of such systems.
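One plausible tactile encoding for such a display (not necessarily the one used in HapticPulse) maps heart-rate zones to the number of vibration pulses in a recurring burst. The zone boundaries and the `vibrate` hook below are hypothetical.
    import time

    def pulses_for_heart_rate(bpm, zones=(100, 130, 160)):
        """Encode the heart-rate zone as a pulse count (1 = light ... 4 = very hard)."""
        return 1 + sum(bpm >= z for z in zones)

    def render_zone(bpm, vibrate, pause_s=0.3):
        """Present the current zone as a short burst of pulses; `vibrate` fires one pulse."""
        for _ in range(pulses_for_heart_rate(bpm)):
            vibrate()
            time.sleep(pause_s)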
Audio-Haptic Simulation of Walking on Virtual Ground Surfaces to Enhance Realism BIBAFull-Text 61-70
  Niels C. Nilsson; Rolf Nordahl; Luca Turchet; Stefania Serafin
In this paper we describe two experiments whose goal is to investigate the role of physics-based auditory and haptic feedback provided at the feet in enhancing realism in a virtual environment. To achieve this goal, we designed a multimodal virtual environment in which subjects could walk on a platform overlooking a canyon. Subjects were asked to visit the environment wearing a head-mounted display and a custom-made pair of sandals enhanced with sensors and actuators. A 12-channel surround sound system delivered a soundscape consistent with the visual environment. In the first experiment, passive haptics was provided by having a physical wooden platform present in the laboratory. In the second experiment, no passive haptics was present. In both experiments, subjects reported having a more realistic experience when auditory and haptic feedback were present. However, the measured physiological data and post-experimental presence questionnaires did not show significant differences when audio-haptic feedback was provided.
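In a setup like this, sensors in the sandals detect foot strikes and actuators plus loudspeakers render surface-dependent feedback. The sketch below only covers simple threshold-based step detection and dispatch; the pressure threshold and the playback/actuator hooks are assumptions, and the paper's physics-based synthesis is not reproduced.
    def detect_footsteps(pressure_samples, threshold=0.5):
        """Yield sample indices where foot pressure rises through the threshold (foot strikes)."""
        prev = 0.0
        for i, p in enumerate(pressure_samples):
            if prev < threshold <= p:
                yield i
            prev = p

    def render_step(surface, play_sound, drive_actuator):
        """Trigger the audio and vibrotactile feedback associated with the current virtual surface."""
        play_sound(f"{surface}_step")   # e.g. a gravel, wood or snow footstep sound
        drive_actuator(surface)         # matching vibration burst under the foot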

Object and Interface Interaction

Interacting with Deformable User Interfaces: Effect of Material Stiffness and Type of Deformation Gesture BIBAKFull-Text 71-80
  Johan Kildal
Deformable User Interfaces (DUIs) are increasingly being proposed for new tangible and organic interaction metaphors and techniques. To design DUIs, it is necessary to understand how manually deforming different materials with different gestures affects performance and user experience. In the study reported in this paper, three DUIs made of deformable materials with different levels of stiffness were used in navigation tasks that required bending and twisting the interfaces. Discrete and continuous deformation gestures were used in each case. Results showed that the stiffness of the material and the type of gesture affected performance and user experience in complex ways, but with a pervasive pattern: using discrete gestures for very short navigation distances and continuous gestures otherwise, together with lower-stiffness materials in every case, was beneficial in terms of performance and user experience.
Keywords: deformable; organic; tangible; user interface; force; bend; twist; zoom; scroll; stiffness; gesture; discrete; continuous; performance; UX
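The two gesture types correspond to two input-handling policies: a discrete gesture issues one navigation step each time the deformation passes a threshold, while a continuous gesture maps the momentary deformation onto navigation velocity. The thresholds and gain below are assumed values for illustration only.
    def discrete_scroll(bend_angle, armed, threshold=20.0):
        """One scroll step each time the bend passes the threshold; re-arm once released."""
        if armed and abs(bend_angle) >= threshold:
            return (1 if bend_angle > 0 else -1), False   # step direction, disarm
        if abs(bend_angle) < threshold * 0.5:
            return 0, True                                # released: re-arm
        return 0, armed

    def continuous_scroll(bend_angle, dt, gain=0.05):
        """Scroll offset for this frame, proportional to the momentary bend angle."""
        return gain * bend_angle * dt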
An Interactive and Multi-sensory Learning Environment for Nano Education BIBAKFull-Text 81-90
  Karljohan Lundin Palmerius; Gunnar Höst; Konrad Schönborn
Swift scientific advances in the area of nanoscience suggest that nanotechnology will play an increasingly important role in our everyday lives. Thus, knowledge of the principles underlying such technologies will inevitably be required to ensure a skilled industrial workforce. In this paper we describe the development of a virtual educational environment that allows for various direct interactive experiences and communication of nanophenomena to pupils and citizens, ranging from desktops to immersive and multi-sensory platforms. At the heart of the architecture is a nanoparticle simulator, which simulates effects such as short-range interaction, flexing of nanotubes and collisions with the solvent. The environment allows the user to interact with the particles to examine their behaviour related to fundamental science concepts.
Keywords: Learning Environment; Nanoscience concepts; Interaction; Multi-sensory; Haptics; Audio
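The nanoparticle simulator at the heart of the architecture computes short-range interactions between particles and random collisions with the solvent. The sketch below is a heavily simplified one-dimensional illustration using a Lennard-Jones-style pair force and Gaussian solvent kicks; it is not the simulator described in the paper.
    import random

    def lj_force(r, epsilon=1.0, sigma=1.0):
        """Magnitude of a Lennard-Jones-style short-range force at separation r > 0."""
        s6 = (sigma / r) ** 6
        return 24.0 * epsilon * (2.0 * s6 * s6 - s6) / r

    def step(positions, velocities, dt=1e-3, kick=0.01):
        """Advance 1-D particle positions one step: pair forces plus random solvent collisions."""
        n = len(positions)
        forces = [0.0] * n
        for i in range(n):
            for j in range(i + 1, n):
                r = positions[j] - positions[i]
                f = lj_force(abs(r)) * (1.0 if r > 0 else -1.0)
                forces[i] -= f
                forces[j] += f
        for i in range(n):
            velocities[i] += forces[i] * dt + random.gauss(0.0, kick)
            positions[i] += velocities[i] * dt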
Augmenting Media with Thermal Stimulation BIBAKFull-Text 91-100
  Martin Halvey; Michael Henderson; Stephen A. Brewster; Graham Wilson; Stephen A. Hughes
Thermal interfaces are a new area of research in HCI, with one of their main benefits being the potential to influence emotion. To date, studies investigating thermal feedback for affective interaction have either provided concepts and prototypes, or looked at the affective element of thermal stimuli in isolation. This research is the first to look in-depth at how thermal stimuli can be used to influence the perception of different media. We conducted two studies which looked at the effect of thermal stimuli on subjective emotional responses to media. In the first we presented visual information designed to evoke emotional responses in conjunction with different thermal stimuli. In the second we used different methods to present thermal stimuli in conjunction with music. Our results highlight the possibility of using thermal stimuli to create more affective interactions in a variety of media interaction scenarios.
Keywords: Thermal; stimulation; emotion; audio; visual; valence; arousal
Embodied Interactions with Audio-Tactile Virtual Objects in AHNE BIBAKFull-Text 101-110
  Koray Tahiroglu; Johan Kildal; Teemu Ahmaniemi; Simon Overstall; Valtteri Wikström
Interactive virtual environments are often focused on visual representation. This study introduces embodied and eyes-free interaction with an audio-haptic navigation environment (AHNE) in a 3-dimensional space. AHNE is based on an optical tracking algorithm that makes use of the Microsoft Kinect, and virtual objects are presented through dynamic audio-tactile cues. Users can grab and move the targets, enabled by a sensor located in a glove. To evaluate AHNE with users, an experiment was conducted. Users' comments indicated that the sound cues elicited physical and visual experiences. Our findings suggest that AHNE could be a novel and fun interface to everyday resources in the environment, such as a home audio system in the living room or a shopping list by the fridge.
Keywords: AHNE; Audio-haptic; non-visual; embodied interaction; 3D UI; Reality Based Interaction; augmented reality
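Interaction in an environment of this kind follows a simple loop: track the hand, scale the audio-tactile cue with the distance to the nearest virtual object, and attach an object to the hand while the glove sensor reports a grab. The distances and hook names in the sketch below are assumptions, not AHNE's actual parameters.
    import math

    def nearest_object(hand, objects):
        """Index and distance of the virtual object closest to the tracked hand position."""
        dists = [math.dist(hand, o) for o in objects]
        i = min(range(len(objects)), key=dists.__getitem__)
        return i, dists[i]

    def update(hand, grabbing, objects, held, max_dist=1.0, grab_dist=0.15):
        """One interaction step: move a held object with the hand, or cue the nearest one."""
        if held is not None:
            objects[held] = list(hand)                 # the held object follows the hand
            return held, 0.0
        i, d = nearest_object(hand, objects)
        intensity = max(0.0, 1.0 - d / max_dist)       # audio/tactile cue strength
        if grabbing and d < grab_dist:
            return i, intensity                        # start holding object i
        return None, intensity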

Test and Evaluation

Towards an Objective Comparison of Scanning-Based Interaction Techniques BIBAFull-Text 111-120
  Benjamin Poppinga; Martin Pielot; Wilko Heuten; Susanne Boll
The direction in which a user points a mobile phone can be measured with the phone's integrated compass. Pointing over time with varying direction is often referred to as "scanning", an emerging interaction technique increasingly applied in the field of mobile navigation and orientation. Because there is no need to look at the screen while scanning, haptic or audio feedback is often used. Several different scanning-based interaction concepts exist; however, until now it has been impossible to analyse and compare these techniques systematically to identify the best concept for a given scenario. In this paper we investigate how our own Tactile Compass scanning technique was used in a field study. Based on our observations we identify a set of measures, which we propose as a standard set for the analysis and comparison of scan-based interaction techniques. We further argue that our contribution may benefit the creation of guidelines and support designers in selecting an appropriate scan-based interaction technique.
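Measures of this kind can be computed from a log of timestamped compass bearings together with the bearing to the target, for example total scan duration, mean absolute angular error and time on target. The sketch below illustrates that computation; the 15-degree on-target tolerance is an assumed value and the measure set is not the one proposed in the paper.
    def angular_error(bearing, target):
        """Signed smallest angle (degrees) between a scanned bearing and the target bearing."""
        return (bearing - target + 540.0) % 360.0 - 180.0

    def scan_measures(samples, target, on_target_deg=15.0):
        """samples: list of (timestamp_s, bearing_deg) pairs. Returns simple scanning statistics."""
        errors = [abs(angular_error(b, target)) for _, b in samples]
        on_target = sum(
            t2 - t1
            for (t1, b1), (t2, _) in zip(samples, samples[1:])
            if abs(angular_error(b1, target)) <= on_target_deg
        )
        return {
            "duration_s": samples[-1][0] - samples[0][0],
            "mean_abs_error_deg": sum(errors) / len(errors),
            "time_on_target_s": on_target,
        }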
Knocking Sound as Quality Sign for Household Appliances and the Evaluation of the Audio-Haptic Interaction BIBAKFull-Text 121-130
  M. Ercan Altinsoy
It has been known for a long time in the automobile industry that the first contact between the customer and a car in the showroom consists of opening the door, sitting in the car and closing the door. Therefore, the sounds of the door opening and closing are carefully designed to invoke feelings of high quality and safety in the customer. Of course, the vehicle's operating noises are equally crucial to the perception of overall quality.
   The operating noises of household appliances have gained increasing importance because these noises can negatively or positively influence our daily life. When shopping, customers consider the sound power level of the household appliance as provided by the manufacturer. In most cases, it is not possible to listen to the machine in operation. However, a common practice of customers is to knock on the sidewalls or open and close the doors of the machine. The knocking sound carries information about the quality and solidity of the product and its material properties. The perception of the knocking sound is normally coupled to a tactile/kinesthetic impression of the knocking event. The aims of this study are to identify the perceptually important features of the knocking sound that affect the impression of quality, define guidelines for a target sound, make suggestions regarding structural modifications to realize the target sound, and investigate the interaction between auditory and haptic stimuli in the overall product-quality assessment. To achieve these aims, experiments with unimodal and multimodal stimulus presentations were conducted. The results showed that an optimal knocking sound is dull, moderately loud, atonal, and has no distinctive long-lasting frequency components, particularly at high frequencies. A quality index was proposed based on psychoacoustic metrics. Because of the physical coupling between the sound and the vibrations, both sensory cues have similar effects on perceived quality.
Keywords: Household appliances; vibration of plates; product sound quality; impulsive sounds; auditory-haptic interaction
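The paper proposes a quality index built from psychoacoustic metrics of the knocking sound (an optimal knock being dull, moderately loud, atonal, without long-lasting high-frequency components). That index is not reproduced here; the sketch below only illustrates the general shape of such a score, with entirely hypothetical weights and reference values.
    def knock_quality_index(loudness_sone, sharpness_acum, tonality, hf_decay_s):
        """Illustrative (not the paper's) score: higher for dull, moderately loud, atonal
        knocks whose high-frequency content decays quickly."""
        loudness_term = 1.0 - min(abs(loudness_sone - 20.0) / 20.0, 1.0)  # prefer moderate loudness
        dullness_term = 1.0 - min(sharpness_acum / 3.0, 1.0)              # prefer low sharpness
        atonal_term = 1.0 - min(max(tonality, 0.0), 1.0)                  # prefer atonal sounds
        decay_term = 1.0 - min(hf_decay_s / 0.5, 1.0)                     # prefer fast HF decay
        return 0.25 * (loudness_term + dullness_term + atonal_term + decay_term)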
Spectral Discrimination Thresholds Comparing Audio and Haptics for Complex Stimuli BIBAFull-Text 131-140
  Lorenzo Picinali; Christopher Feakes; Davide A. Mauro; Brian F. G. Katz
Individuals with normal hearing are generally able to discriminate auditory stimuli that have the same fundamental frequency but different spectral content. This study concerns the extent to which the same differentiation is possible with vibratory tactile stimuli. Three perceptual experiments were carried out to compare discrimination thresholds, in terms of spectral differences, between auditory and vibratory tactile stimulations. The first test assesses the subject's ability to discriminate between three signals with distinct spectral content. The second test measures the discrimination threshold between a pure tone and a signal composed of two pure tones, varying the amplitude and frequency of the second tone. Finally, in the third test the discrimination threshold is measured between a tone with even harmonic components and a tone with odd ones. The results show that it is indeed possible to discriminate between haptic signals having the same fundamental frequency but different spectral content, although the sensitivity for detection is markedly lower than for audio stimuli.
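The second test compares a pure tone with a two-component signal whose second partial varies in amplitude and frequency; the same waveform can drive either a loudspeaker or a vibrotactile actuator. The sampling rate, duration and normalisation in the sketch below are assumptions, not the stimulus parameters used in the study.
    import math

    def tone(freq_hz, dur_s=1.0, amp=1.0, fs=44100):
        """A sampled sine tone as a list of floats."""
        return [amp * math.sin(2.0 * math.pi * freq_hz * n / fs) for n in range(int(dur_s * fs))]

    def two_tone(f0_hz, f1_hz, a1, dur_s=1.0, fs=44100):
        """Fundamental plus a second partial of relative amplitude a1, normalised to +/- 1."""
        base = tone(f0_hz, dur_s, 1.0, fs)
        partial = tone(f1_hz, dur_s, a1, fs)
        peak = 1.0 + abs(a1)
        return [(b + p) / peak for b, p in zip(base, partial)]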
How Does Representation Modality Affect User-Experience of Data Artifacts? BIBAKFull-Text 141-151
  Trevor Hogan; Eva Hornecker
We present a study that explores people's affective responses when experiencing data represented through different modalities. In particular, we are interested in investigating how data representations that address haptic/tactile and sonic perception are experienced. We describe the creation of a number of data-driven artifacts that all represent the same dataset. Taking a phenomenological approach to our analysis, we used the Repertory Grid Technique (RGT) during a group session to elicit participants' personal constructs, which are used to describe and compare these artifacts. Our analysis examines these constructs, traces the emergence of one exemplary personal construct, and highlights other emergent themes. Our findings consist of a number of elicited constructs that illuminate how the affective qualities of data-driven artifacts relate to the type of modality in use.
Keywords: Data Representation; Modality; Phenomenology; Repertory Grid Technique; User-Experience