
HAID 2008: International Workshop on Haptic and Audio Interaction Design

Fullname: HAID 2008: Haptic and Audio Interaction Design: Third International Workshop
Editors: Antti Pirhonen; Stephen Brewster
Location: Jyväskylä, Finland
Dates: 2008-Sep-15 to 2008-Sep-16
Publisher: Springer Berlin Heidelberg
Series: Lecture Notes in Computer Science 5270
Standard No: DOI: 10.1007/978-3-540-87883-4; ISBN: 978-3-540-87882-7 (print), 978-3-540-87883-4 (online); hcibib: HAID08
Papers: 13
Pages: 129
Links: Online Proceedings
  1. Visual Impairment
  2. Applications of Multimodality
  3. Evaluation
  4. Conceptual Integration of Audio and Haptics
  5. Interaction Techniques
  6. Perception

Visual Impairment

Evaluation of Continuous Direction Encoding with Tactile Belts BIBAK Full-Text 1-10
  Martin Pielot; Niels Henze; Wilko Heuten; Susanne Boll
Tactile displays consisting of tactors located around the user's waist are a proven means for displaying directions in the horizontal plane. These displays use the body location of the tactors to express directions. In current implementations the number of directions that can be expressed is limited to the number of tactors. However, the required number of tactors might not be available, or configuring them may require too much effort. This paper describes the design and evaluation of a presentation method that displays directions between tactors by interpolating their intensity. We compare this method with the prevalent one by letting participants determine directions and having them navigate along tactile waypoints in a virtual environment. The interpolated direction presentation significantly improved the accuracy of perceived directions. Discrete direction presentation, however, proved better suited for waypoint navigation and was found easier to process.
Keywords: multimodal user interfaces; tactile displays; direction presentation; interpolation; orientation and navigation
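A minimal sketch of the interpolation idea the abstract describes: a direction falling between two tactors is rendered by splitting intensity linearly between its two nearest neighbours. The linear ramp, the tactor layout and all names are illustrative assumptions, not the authors' implementation.

```python
def tactor_intensities(direction_deg, num_tactors=8, max_intensity=1.0):
    """Split intensity between the two tactors adjacent to a target
    direction, so that directions between tactors can be displayed.

    Tactor i is assumed to sit at i * (360 / num_tactors) degrees
    around the waist, tactor 0 pointing straight ahead."""
    spacing = 360.0 / num_tactors
    lower = int(direction_deg % 360 // spacing)     # tactor before the target
    upper = (lower + 1) % num_tactors               # tactor after the target
    frac = (direction_deg % spacing) / spacing      # position between them, 0..1
    intensities = [0.0] * num_tactors
    intensities[lower] = max_intensity * (1.0 - frac)
    intensities[upper] = max_intensity * frac
    return intensities

# A direction halfway between tactors 0 and 1 on an 8-tactor belt:
print(tactor_intensities(22.5))   # tactors 0 and 1 each driven at 0.5
```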
Supporting Collaboration between Visually Impaired and Sighted Children in a Multimodal Learning Environment BIBAK Full-Text 11-20
  Erika Tanhua-Piiroinen; Virpi Pasto; Roope Raisamo; Eva-Lotta Sallnäs
Visually impaired pupils are a group that teachers need to pay particular attention to when planning group work. The need for supporting collaboration between visually impaired and sighted people has been pointed out, but few evaluations of such support exist. In this paper, two studies are described concerning collaboration support for visually impaired and sighted children in a multimodal learning environment. Based on the results of the first study, in which two children used a multimodal single-user Space application together, the application was improved to better support collaboration. This prototype was then evaluated. According to the results, it is worthwhile to provide individual input devices for all participants in the group. To help the pupils achieve common ground, it is also important to provide sufficient support for all senses in a multimodal environment and to give the sighted participants feedback about the haptic status of the environment as well.
Keywords: Collaboration; visually impaired children; multimodal interaction; haptics

Applications of Multimodality

Perceptually Informed Roles for Haptic Feedback in Expressive Music Controllers BIBAK Full-Text 21-29
  Ricardo Pedrosa; Karon MacLean
In this paper, we propose a methodology for systematically integrating haptic feedback with a co-developed gesture interface for a computer-based music instrument. The primary goal of this research is to achieve an increased understanding of how different sub-modalities of haptic feedback should be combined to support both controllability and comfort in expressive interfaces of this type. We theorize that when including haptic feedback in an instrument, force and vibrotactile feedback could be beneficially designed individually and then fine-tuned when mixed in the final design.
Keywords: Haptics; gesture interface; perception
Real-Time Gesture Recognition, Evaluation and Feed-Forward Correction of a Multimodal Tai-Chi Platform BIBAK Full-Text 30-39
  Otniel Portillo-Rodriguez; Oscar O. Sandoval-Gonzalez; Emanuele Ruffaldi; Rosario Leonardi; Carlo Alberto Avizzano; Massimo Bergamasco
This paper presents a multimodal system capable of understanding and correcting the movements of Tai-Chi students in real time through the integration of audio-visual-tactile technologies. The platform acts like a virtual teacher that transfers the knowledge of five Tai-Chi movements, using feedback stimuli to compensate for the errors a user commits while performing a gesture. The fundamental components of this multimodal interface are the gesture recognition system (using k-means clustering, Probabilistic Neural Networks (PNN) and Finite State Machines (FSM)) and the real-time motion descriptor, which computes and qualifies the movements performed by the student with respect to those performed by the master, producing several kinds of feedback and compensating for the movement in real time by varying the audio-visual-tactile parameters of different devices. Experiments with this multimodal platform confirmed that the quality of the movements performed by the students improves significantly.
Keywords: Multimodal Interfaces; real-time 3D time-independent gesture recognition; real-time descriptor; vibrotactile feedback; audio-position feedback; Virtual Reality and Skills transfer
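The recognition stage combines k-means clustering with a Probabilistic Neural Network. Below is a minimal sketch of PNN classification over k-means cluster centres; the feature space, kernel width and the FSM stage that sequences the recognized postures are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def pnn_classify(x, class_templates, sigma=0.5):
    """Probabilistic Neural Network: score each gesture class by the
    mean of Gaussian kernels centred on its templates (e.g. k-means
    cluster centres of recorded poses), then pick the best class."""
    scores = {}
    for label, templates in class_templates.items():
        d2 = np.sum((templates - x) ** 2, axis=1)    # squared distances
        scores[label] = np.mean(np.exp(-d2 / (2.0 * sigma ** 2)))
    return max(scores, key=scores.get)

# Toy templates: two hypothetical gesture classes in a 3-D feature space.
templates = {
    "ward_off":    np.array([[0.1, 0.9, 0.2], [0.2, 0.8, 0.1]]),
    "single_whip": np.array([[0.9, 0.1, 0.7], [0.8, 0.2, 0.8]]),
}
print(pnn_classify(np.array([0.15, 0.85, 0.15]), templates))  # ward_off
```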
A System for Multimodal Exploration of Social Spaces BIBAK Full-Text 40-49
  Victor V. Kryssanov; Shizuka Kumokawa; Igor Goncharenko; Hitoshi Ogawa
This paper describes a system developed to help people explore local communities by providing navigation services in social spaces created by members of those communities. Just as a community's social space is formed by communication and knowledge-sharing practices, the proposed system utilizes data from the corresponding social network to reconstruct the social space, which is otherwise not physically perceptible but imaginary, yet experiential and learnable. The social space is modeled with an agent network, where each agent stands for a member of the community and has knowledge about the expertise and personal characteristics of other members. An agent can gather information, using its social "connections," to find the community members most suitable to communicate with in a specific situation defined by the system's user. The system then deploys its multimodal interface, which operates with 3D graphics and haptic virtual environments and "maps" the social space onto a representation of the relevant physical space, to advise the user on an efficient communication strategy for the given community. A prototype of the system was built and used in a pilot study. The study results are briefly discussed, conclusions are drawn, and implications for future work are formulated.
Keywords: Social navigation; agent network; multimodal interface
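As a rough illustration of the agent-network idea (an assumed stand-in, since the abstract does not give the search algorithm), each agent can be queried through its social connections until a member with the requested expertise is found:

```python
from collections import deque

def find_contact(network, expertise, start, topic):
    """Breadth-first query through a member's social connections to
    find the nearest community member with the requested expertise."""
    seen, queue = {start}, deque([start])
    while queue:
        member = queue.popleft()
        if topic in expertise.get(member, ()):
            return member
        for friend in network.get(member, ()):
            if friend not in seen:
                seen.add(friend)
                queue.append(friend)
    return None

# Hypothetical community: who knows whom, and who knows what.
network = {"ann": ["bob"], "bob": ["ann", "kim"], "kim": ["bob"]}
expertise = {"bob": {"robotics"}, "kim": {"haptics"}}
print(find_contact(network, expertise, "ann", "haptics"))  # -> "kim"
```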

Evaluation

Towards Haptic Performance Analysis Using K-Metrics BIBAK Full-Text 50-59
  Richard Hall; Hemang Rathod; Mauro Maiorca; Ioanna Ioannou; Edmund Kazmierczak; Stephen O'Leary; Peter Harris
It is desirable to classify data samples automatically when assessing the quantitative performance of haptic device users, as the volume of haptic data may be far greater than is feasible to annotate manually. In this paper we compare three k-metrics for automated classification of human motion: cosine, extrinsic curvature and symmetric centroid deviation. Such classification algorithms make predictions about data attributes, whose quality we assess via three mathematical methods of comparison: root mean square deviation, sensitivity error and entropy correlation coefficient. Our assessment suggests that k-cosine may be more promising for analysing haptic motion than the other two metrics.
Keywords: Haptic performance analysis; motion classification
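The k-cosine measure is commonly defined as the cosine of the angle at a trajectory point between the vectors to the points k samples before and after it; the sketch below follows that common definition (the paper's exact formulation may differ).

```python
import numpy as np

def k_cosine(points, k):
    """k-cosine at each interior point of a motion trajectory: the
    cosine of the angle between the vectors to the points k samples
    before and k samples after. Values near -1 indicate straight
    motion; values near +1 indicate a sharp reversal."""
    points = np.asarray(points, dtype=float)
    cosines = []
    for i in range(k, len(points) - k):
        a = points[i - k] - points[i]
        b = points[i + k] - points[i]
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        cosines.append(np.dot(a, b) / denom if denom else 0.0)
    return np.array(cosines)

# Straight-line trajectory: every k-cosine is -1 (vectors point oppositely).
line = [(t, 2.0 * t) for t in range(10)]
print(k_cosine(line, k=2))
```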
Multimodal Interaction: Real Context Studies on Mobile Digital Artefacts BIBAK Full-Text 60-69
  Tiago Reis; Marco de Sá; Luís Carriço
The way users interact with mobile applications varies according to the context they are in. We conducted a study in which users had to manipulate a multimodal questionnaire in four different contexts (home, park, subway and driving), considering different variables (lighting, noise, position, movement, type of content, number of people surrounding the user and time constraints) that affect interaction. The study aimed at understanding the effect of these context variables on users' choices among the available interaction modalities (voice, gestures, etc.). We describe the results of our study, eliciting situations where users adopted specific modalities and the reasons for doing so. Accordingly, we draw conclusions on users' preferences regarding interaction modalities in real-life contexts.
Keywords: Multimodal Interaction; Mobile Devices; Studies in Real Contexts

Conceptual Integration of Audio and Haptics

An Audio-Haptic Aesthetic Framework Influenced by Visual Theory BIBAK Full-Text 70-80
  Angela Chang; Conor O'Sullivan
Sound is touch at a distance. The vibration of pressure waves in the air creates the sounds our ears hear; at close range, these pressure waves may also be felt as vibration. This audio-haptic relationship has potential for enriching interaction in human-computer interfaces. How can interface designers manipulate attention using audio-haptic media? We propose a theoretical perceptual framework for the design of audio-haptic media, influenced by aesthetic frameworks in visual theory and audio design. The aesthetic issues of the multimodal interplay between the audio and haptic modalities are presented, with discussion based on anecdotes from multimedia artists. We use the aesthetic theory to develop four design mechanisms for transitions between audio and haptic channels: synchronization, temporal linearization, masking and synchresis. An example composition using these mechanisms, and its multisensory design intent, is discussed by the designers.
Keywords: Audio-haptic; multimodal design; aesthetics; musical expressivity; mobile; interaction; synchronization; linearization; masking; synchresis
In Search for an Integrated Design Basis for Audio and Haptics BIBAK Full-Text 81-90
  Antti Pirhonen; Kai Tuuri
As interaction modalities, audio and haptics share properties that make them highly appropriate to handle within a single conceptual framework. This paper outlines such a framework, drawing its ingredients from the literature on cross-modal integration and embodied cognition. The resulting framework is bound up with the concept of physical embodiment, which has been introduced within several scientific disciplines to reveal the role of bodily experience, and the corresponding mental imagery, as the core of meaning-creation. In addition to the theoretical discussion, the contribution of the proposed approach to design is outlined.
Keywords: haptics; audio; integration; multimodal; embodiment

Interaction Techniques

tacTiles for Ambient Intelligence and Interactive Sonification BIBA Full-Text 91-101
  Thomas Hermann; Risto Kõiva
In this paper we introduce tacTiles, a novel wireless, modular, tactile-sensitive surface element attached to a deformable textile, designed as a lay-on for surfaces such as chairs, sofas, floors or other furniture. tacTiles can be used as an interface for human-computer interaction or for ambient information systems. We give a full account of the hardware and show applications that demonstrate real-time sonification for process monitoring and biofeedback. Finally, we sketch ideas for using tacTiles paired with sonification in interaction games.
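As a hint of what the sonification side of such a system can look like, here is a minimal parameter-mapping sketch (an assumed mapping with hypothetical names, not the tacTiles software): each normalised pressure frame controls the pitch of a short sine segment, making activity on the surface audible.

```python
import math

def sonify(pressure_frames, sample_rate=8000, f_lo=220.0, f_hi=880.0):
    """Map each normalised pressure reading in [0, 1] to the pitch of
    a 50 ms sine segment."""
    samples, phase = [], 0.0
    seg = sample_rate // 20                  # samples per 50 ms frame
    for p in pressure_frames:
        freq = f_lo + p * (f_hi - f_lo)      # harder press -> higher pitch
        for _ in range(seg):
            phase += 2.0 * math.pi * freq / sample_rate
            samples.append(0.3 * math.sin(phase))
    return samples

tones = sonify([0.0, 0.2, 0.8, 0.4])         # four hypothetical readings
```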
An Audio-Haptic Interface Concept Based on Depth Information BIBAK Full-Text 102-110
  Delphine Devallez; Davide Rocchesso; Federico Fontana
We present an interaction tool based on rendering distance cues for ordering sound sources in depth. The user interface consists of a linear position tactile sensor made of conductive material. The touch position is mapped onto the listening position on a rectangular virtual membrane, modeled by a bidimensional digital waveguide mesh and providing distance cues. Spatialization of sound sources in depth allows a hierarchical display of multiple audio streams, as in auditory menus. Moreover, the similar geometries of the haptic interface and the virtual auditory environment allow a direct mapping between the touch position and the listening position, providing an intuitive and continuous interaction tool for auditory navigation.
Keywords: Audio-haptic interface; auditory navigation; distance perception; spatialization; digital waveguide mesh; virtual environment
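For reference, a rectilinear 2D digital waveguide mesh updates each scattering junction from its four incoming wave components. The sketch below follows the standard equal-admittance textbook form and is not the authors' exact implementation; boundaries are simply left absorbing.

```python
import numpy as np

def dwm_step(v_in):
    """One scattering pass of a rectilinear 2-D digital waveguide mesh.
    v_in[d] holds the incoming wave component from direction d
    (0=N, 1=S, 2=E, 3=W) at every junction."""
    v_j = 0.5 * v_in.sum(axis=0)        # equal-admittance junction velocity
    v_out = v_j[None] - v_in            # scattered outgoing components
    return v_j, v_out

def propagate(v_out):
    """Outgoing waves become the neighbours' incoming waves after a
    one-sample delay; waves crossing the edge are simply lost here."""
    v_in = np.zeros_like(v_out)
    v_in[0, 1:, :] = v_out[1, :-1, :]   # sent south, arrives from the north
    v_in[1, :-1, :] = v_out[0, 1:, :]   # sent north, arrives from the south
    v_in[2, :, :-1] = v_out[3, :, 1:]   # sent west, arrives from the east
    v_in[3, :, 1:] = v_out[2, :, :-1]   # sent east, arrives from the west
    return v_in

# Excite the centre of a 9x9 mesh and advance a few time steps.
v_in = np.zeros((4, 9, 9))
v_in[:, 4, 4] = 0.25
for _ in range(5):
    v_j, v_out = dwm_step(v_in)
    v_in = propagate(v_out)
```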

Perception

Crossmodal Rhythm Perception BIBAK Full-Text 111-119
  Maria Jokiniemi; Roope Raisamo; Jani Lylykangas; Veikko Surakka
Research on rhythm perception has mostly focused on the auditory and visual modalities. Previous studies have shown that the auditory modality dominates rhythm perception. Rhythms can also be perceived through the tactile senses, for example as vibrations, but only a few studies exist. We investigated unimodal and crossmodal rhythm perception with the auditory, tactile, and visual modalities. Pairs of rhythm patterns were presented to the subjects, who made a same-different judgment. We used all possible combinations of the three modalities. The results showed that the unimodal auditory condition had the highest rate of correct responses (79.2%). The unimodal tactile condition (75.0%) and the auditory-tactile condition (74.2%) were close behind. The average rate remained under 61.7% whenever the visual modality was involved. The results confirmed that the auditory and tactile modalities are suitable for presenting rhythmic information, and they were also preferred by the users.
Keywords: Crossmodal interaction; auditory interaction; tactile interaction; visual interaction; rhythm perception
The Effect of Auditory Cues on the Audiotactile Roughness Perception: Modulation Frequency and Sound Pressure Level BIBAK Full-Text 120-129
  M. Ercan Altinsoy
Scraping a surface with the fingertip is a multimodal event. We obtain information about the texture, i.e. the roughness of the surface, through at least three different sensory channels: auditory, tactile and visual. People are highly skilled in using touch-produced sounds to identify texture properties. The sound pressure level, modulation frequency and pitch of touch-induced scraping sounds are important psychoacoustical determinants of texture roughness perception. In this study, psychophysical experiments were conducted to investigate the relative contributions of the auditory and tactile sensory modalities to the multimodal (audiotactile) roughness percept, the effects of perceptual discrepancy between the modalities on the multimodal roughness judgment, and how different modulation frequency and loudness conditions affect subjects' roughness perception.
Keywords: Multimodal interaction; roughness; texture perception; auditory; haptic
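A hedged sketch of the kind of stimulus such experiments manipulate: amplitude-modulated noise whose modulation frequency stands in for the texture's spatial period and whose dB gain stands in for sound pressure level. The mapping and parameter values are assumptions for illustration, not the paper's stimuli.

```python
import math
import random

def scraping_stimulus(mod_freq_hz, level_db, dur_s=1.0, sr=44100):
    """Amplitude-modulated noise: mod_freq_hz mimics the periodicity of
    the scraped texture, level_db sets the playback gain."""
    gain = 10.0 ** (level_db / 20.0)
    out = []
    for n in range(int(dur_s * sr)):
        mod = 0.5 * (1.0 + math.sin(2.0 * math.pi * mod_freq_hz * n / sr))
        out.append(gain * mod * random.uniform(-1.0, 1.0))
    return out

stim = scraping_stimulus(mod_freq_hz=40.0, level_db=-12.0)
```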