| Self-produced Sound: Tightly Binding Haptics and Audio | | BIBAK | Full-Text | 1-8 | |
| James A. Ballas | |||
| This paper discusses the concept of self-produced sound and its importance
as a stimulus for understanding audio-haptic interaction, owing to the tight
binding between the two modalities. It provides background on this type of
sound, a brief review of the asynchrony and neurophysiology research that has
addressed the cross-modality interaction, and examples of research into
self-produced sound, including a unique but common instance: the sound produced
when consuming food. Keywords: Haptics; self-produced sound; hearing; psychoacoustics | |||
| Will Haptics Technology Be Used in Mobile Devices?: A Historical Review of Haptics Technology and Its Potential Applications in Multi-modal Interfaces | | BIBA | Full-Text | 9-10 | |
| Dong-Soo Kwon | |||
| In recent years, the haptics research area has become an interdisciplinary field covering perception, psychophysics, neuroscience, mechanism design, control, virtual reality, and human computer interaction. If we try to identify the origins of haptics research, it can be said to have emerged from the teleoperator systems of the late 1940s. In these initial explorations, increasing the transparency level of the mechanical master/slave manipulator system was the main issue as such improvements promised higher levels of task efficiency. For example, in order to handle nuclear materials effectively inside a radiation shielded room, minimizing friction and the effects of inertia in a mechanical master/slave system was the critical factor. Furthermore, when teleoperator systems were designed for hazardous environments and long distance space applications, establishing stability in the face of lengthy (and often uncertain) time delays was the key issue. Ergonomic design of the remote control console and the master haptic device also exerted a strong influence on how effectively remote information could be displayed to enhance telepresence. | |||
| Tactile Visualization with Mobile AR on a Handheld Device | | BIBA | Full-Text | 11-21 | |
| Beom-Chan Lee; Hyeshin Park; Junhun Lee; Jeha Ryu | |||
| This paper presents a tactile visualization system that incorporates touch feedback into a mobile AR system realized on a handheld device. This system enables, for the first time, interactive haptic feedback through mobile and wearable interfaces. To demonstrate the proposed concept, an interactive scenario that helps a visually impaired user recognize specific pictograms has been constructed. The system allows users to tactually recognize flat pictograms situated in the real world. Furthermore, it also opens the door to a wide range of applications based on wearable tactile interaction. | |||
| Mobile Multi-actuator Tactile Displays | | BIBAK | Full-Text | 22-33 | |
| Eve Hoggan; Sohail Anwar; Stephen A. Brewster | |||
| The potential of using the sense of touch to communicate information on
mobile devices is receiving more attention because of the limitations of
graphical displays on such devices. However, most applications use only a
single actuator to present vibrotactile information. In an effort to create
richer tactile feedback and mobile applications that make use of the entire
hand and multiple fingers as opposed to a single fingertip, this paper presents
the results of two experiments investigating the perception and application of
multi-actuator tactile displays situated on a mobile device. The results of
these experiments show that an identification rate of over 87% can be achieved
when two dimensions of information are encoded in Tactons using rhythm and
location. They also show that location produces 100% recognition rates when
using actuators situated on the mobile device at the lower thumb, upper thumb,
index finger and ring finger. This work demonstrates that it is possible to
communicate information through four locations using multiple actuators
situated on a mobile device when non-visual information is required. Keywords: Multimodal Interaction; Haptic I/O; Tactile Icons (Tactons); Mobile
Displays; Multi-Actuator Displays | |||
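The Tacton encoding described above combines two parameters, rhythm and actuator location. The sketch below illustrates that style of encoding in Python; the rhythm patterns, pulse timings and actuator names are hypothetical placeholders rather than the parameters used in the study.

```python
# Minimal sketch of a two-dimensional Tacton encoding (rhythm x location).
# All rhythm patterns, timings and actuator names are illustrative assumptions.

RHYTHMS = {
    # Each rhythm is a list of (pulse_ms, gap_ms) pairs.
    "short-short": [(100, 100), (100, 100)],
    "long":        [(400, 0)],
    "short-long":  [(100, 100), (400, 0)],
}

LOCATIONS = ("lower_thumb", "upper_thumb", "index_finger", "ring_finger")

def make_tacton(rhythm_name, location):
    """Combine one rhythm and one actuator location into a Tacton description."""
    if rhythm_name not in RHYTHMS or location not in LOCATIONS:
        raise ValueError("unknown rhythm or location")
    return {"location": location, "pulses": RHYTHMS[rhythm_name]}

def play(tacton, drive_actuator):
    """Send the pulse train to the named actuator. `drive_actuator(location,
    on_ms, off_ms)` is assumed to be supplied by the device layer."""
    for on_ms, off_ms in tacton["pulses"]:
        drive_actuator(tacton["location"], on_ms, off_ms)

# Example: a "short-short" rhythm delivered at the index finger.
message = make_tacton("short-short", "index_finger")
```

Because the two parameters vary independently, a recipient only has to identify the rhythm and the location separately to decode a message, which is what makes this kind of encoding attractive for non-visual displays.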
| Comparison of Force, Tactile and Vibrotactile Feedback for Texture Representation Using a Combined Haptic Feedback Interface | | BIBAK | Full-Text | 34-43 | |
| Ki-Uk Kyung; Jun-Young Lee; Jun-Seok Park | |||
| In this paper, we compared force feedback, tactile feedback and vibration
feedback for texture display. For this investigation, a pen-like haptic
interface with a built-in compact tactile display and a vibrating module was
developed. The handle of a pen-held haptic interface was replaced by the
pen-like interface to add tactile feedback capability to the device. Since the
system provides a combination of force and tactile feedback, three haptic
representation methods were compared on surfaces with three texture groups
which differ in direction, groove width and shape. Over all the tests, the
haptic device combined with the built-in compact tactile display showed
satisfactory results. Vibration feedback was also reasonably effective in
texture display. From this series of experiments, the applicability of the
compact tactile display and the usability of the pen-like interface in a
pen-held haptic interface have been verified. Keywords: texture; combination; force; tactile; vibration | |||
| Shake2Talk: Multimodal Messaging for Interpersonal Communication | | BIBAK | Full-Text | 44-55 | |
| Lorna M. Brown; John Williamson | |||
| This paper explores the possibilities of using audio and haptics for
interpersonal communication via mobile devices. Drawing on the literature on
current messaging practices, a new concept for multimodal messaging has been
designed and developed. The Shake2Talk system allows users to construct
audio-tactile messages through simple gesture interactions, and send these
messages to other people. Such messages could be used to communicate a range of
meanings, from the practical (e.g. "home safely", represented by the sound and
sensation of a key turning in a lock) to the emotional (e.g. "thinking of you",
represented by a heartbeat). This paper presents the background to this work,
the system design and implementation and a plan for evaluation. Keywords: haptics; audio; vibrotactile; multimodal interaction; mobile phones;
messaging; remote communication; gesture recognition | |||
| Communication-Wear: User Feedback as Part of a Co-Design Process | | BIBAK | Full-Text | 56-68 | |
| Sharon Baurley; Philippa Brock; Erik Geelhoed; Andrew Moore | |||
| Communication-Wear is a clothing concept that augments the mobile phone by
enabling expressive messages, conveying a sense of touch and presence, to be
exchanged remotely. It proposes to synthesise conventions and cultures of
fashion with those of mobile communications, where there are shared attributes
in terms of communication and expression. Using garment prototypes as research
probes as part of an on-going iterative co-design process, we endeavoured to
mobilise participants' tacit knowledge in order to gauge user perceptions on
touch communication in a lab-based trial. The aim of this study was to
determine whether established sensory associations people have with the tactile
qualities of textiles could be used as signs and metaphors for experiences,
moods, social interactions and gestures, related to interpersonal touch. The
findings are used to inspire new design ideas for textile actuators for use in
touch communication in successive iterations. Keywords: Smart textiles; wearable technology; touch communication; clothing and
emotion; user research; prototype as probe | |||
| Interactive Racing Game with Graphic and Haptic Feedback | | BIBAK | Full-Text | 69-77 | |
| Sang-Youn Kim; Kyu-Young Kim | |||
| This paper proposes a mobile racing game prototype system where a player
haptically senses the state of a car and the road condition with a vibrotactile
signal generation method. The vibrotactile signal generation method provides
variable vibrotactile effects according to a user's interaction with the
graphic environment. The generated vibrotactile effects are used for the input
of an eccentric vibration motor and a solenoid actuator in order to convey
vibrotactile information with a large bandwidth to the players. To evaluate the
proposed racing game, six participants experienced two versions of the game:
one with vibrotactile feedback and one without. The experiment
shows that the proposed game with vibrotactile feedback provides players with
increased levels of realism and immersion. Keywords: Vibrotactile; Haptic; Racing game | |||
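The abstract does not detail the signal generation method, but one plausible reading is that slow, sustained effects drive the eccentric motor while brief transients drive the solenoid, giving the combined display a wider bandwidth. The sketch below illustrates that split; the event names and drive values are assumptions made for illustration, not values from the paper.

```python
# Hypothetical routing of racing-game events to two actuators with
# complementary bandwidths. All numbers are illustrative, not from the paper.

def vibrotactile_command(event, speed_kmh):
    """Map a game event to actuator drive parameters.

    The eccentric motor renders sustained, low-frequency effects such as
    road rumble; the solenoid renders short, sharp transients."""
    if event == "rough_road":
        return {"actuator": "eccentric_motor",
                "amplitude": min(1.0, speed_kmh / 200.0),
                "duration_ms": 250}
    if event in ("collision", "gear_shift"):
        return {"actuator": "solenoid",
                "amplitude": 1.0 if event == "collision" else 0.5,
                "duration_ms": 30}
    return {"actuator": None, "amplitude": 0.0, "duration_ms": 0}

# Example: rumble intensity scales with the car's speed.
print(vibrotactile_command("rough_road", 120.0))
```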
| Obstacle Detection and Avoidance System for Visually Impaired People | | BIBA | Full-Text | 78-85 | |
| Byeong-Seok Shin; Cheol-Su Lim | |||
| In this paper, we implemented a wearable system for visually impaired users that allows them to detect and avoid obstacles. It is based on ultrasound sensors which can acquire range data from objects in the environment by estimating the time-of-flight of the ultrasound signal. Using a hemispherical sensor array, we can detect obstacles and determine which directions should be avoided. However, the ultrasound sensors are only used to detect whether obstacles are present in front of users. We determine unimpeded directions by analyzing patterns of the range values from consecutive frames. Feedback is presented to users in the form of voice commands and vibration patterns. Our system is composed of an ARM9-based embedded system, an ultrasonic sensor array, an orientation tracker and a set of vibration motors with a controller. | |||
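As a rough sketch of the ranging step mentioned above, ultrasonic time-of-flight converts to distance via the speed of sound, and an unimpeded direction can be chosen from the hemispherical array's readings. The sensor count, clearance threshold and example timings below are assumptions for illustration only.

```python
# Minimal sketch: time-of-flight ranging and choosing an unimpeded direction.
# Sensor layout, clearance threshold and example timings are assumptions.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C

def range_from_tof(tof_seconds):
    """The echo travels out and back, so range is half the round-trip path."""
    return SPEED_OF_SOUND_M_S * tof_seconds / 2.0

def clearest_direction(ranges_m, min_clear_m=1.5):
    """Index of the sensor with the largest free range, or None if every
    direction is blocked within `min_clear_m` metres."""
    best = max(range(len(ranges_m)), key=lambda i: ranges_m[i])
    return best if ranges_m[best] >= min_clear_m else None

# Example: five sensors across the front hemisphere, left to right.
readings = [range_from_tof(t) for t in (0.004, 0.012, 0.002, 0.009, 0.003)]
print([round(r, 2) for r in readings], clearest_direction(readings))
```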
| Tangible User Interface for the Exploration of Auditory City Maps | | BIBAK | Full-Text | 86-97 | |
| Martin Pielot; Niels Henze; Wilko Heuten; Susanne Boll | |||
| Before venturing out into unfamiliar areas, most people scope out a map. But
for blind or visually impaired people, traditional maps are not accessible. In
our previous work, we developed the "Auditory Map", which conveys the location
of geographic objects through spatial sonification. Users perceive these
objects through the ears of a virtual listener walking through the presented
area. Evaluating our system, we observed that the participants had difficulties
perceiving the directions of geographic objects accurately. To improve the
localization we introduce rotation to the Auditory Map. Rotation is difficult
to achieve with traditional input devices such as a mouse or a digitizer
tablet. This paper describes a tangible user interface which allows rotating
the virtual listener using physical representations of the map and the virtual
listener. First evaluation results show that our interaction technique is a
promising approach to improve the construction of cognitive maps for visually
impaired people. Keywords: sonification; auditory display; tangible user interface; spatial audio;
exploration; interaction techniques; visually impaired users | |||
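A minimal sketch of the rotation idea, assuming the tangible listener token simply changes the virtual listener's heading: each geographic object's bearing relative to that heading is what a spatial audio renderer would then use. The coordinate convention and function names are hypothetical, not taken from the paper.

```python
import math

# Sketch: bearing of a map object relative to a rotatable virtual listener.
# The coordinate convention (y = map north) is an assumption for illustration.

def relative_bearing(listener_xy, listener_heading_deg, object_xy):
    """Angle of the object relative to the listener's facing direction, in
    degrees: 0 means straight ahead, positive means to the listener's right."""
    dx = object_xy[0] - listener_xy[0]
    dy = object_xy[1] - listener_xy[1]
    absolute = math.degrees(math.atan2(dx, dy))          # 0 deg = map north
    return (absolute - listener_heading_deg + 180.0) % 360.0 - 180.0

# Turning the physical listener token by 90 degrees moves an object that was
# straight ahead over to the listener's left.
print(relative_bearing((0, 0), 0.0, (0, 10)))    # 0.0
print(relative_bearing((0, 0), 90.0, (0, 10)))   # -90.0
```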
| Haptic and Sound Grid for Enhanced Positioning in a 3-D Virtual Environment | | BIBAK | Full-Text | 98-109 | |
| Seung-Chan Kim; Dong-Soo Kwon | |||
| As images are projected onto the flat retina when identifying objects
scattered in space, there may be considerable ambiguity in depth (i.e.
z-direction) perception. Therefore, position information can be distorted,
especially along the z-axis. In this paper, virtual grids using haptic and
auditory feedback are proposed to complement ambiguous visual depth cues. This
study experimentally investigates the influence of virtual grids on position
identification in a 3-D workspace. A haptic grid is generated using the
PHANTOM® Omni™ and a sound grid is generated by changing the
frequency characteristics of the sound source based on the hand movement of the
operator. Both grids take the form of virtual planes placed at regular
intervals of 10 mm along all three axes (i.e. x, y, and z). The haptic and sound
grids are conveyed to subjects separately or simultaneously according to test
conditions. In cases of bimodal presentation, the grids are displayed with
cross-modal synchrony. The results indicate that the presence of the grid in
space significantly increased positioning precision. In particular, errors
along the z-axis decreased by more than 50% (F=19.82,
p<0.01). Keywords: grid plane; depth ambiguity; haptic; auditory; multimodality | |||
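As a rough illustration of the grid described above, the sketch below checks whether the operator's position lies within a small capture band around one of the virtual planes spaced 10 mm apart and, if so, returns a spring-like pull toward that plane together with a flag that could trigger the sound grid. The stiffness, band width and example coordinates are assumptions, not values from the paper.

```python
# Sketch of a virtual grid: planes every 10 mm along each axis.
# Stiffness and capture band are illustrative assumptions only.

GRID_SPACING_MM = 10.0
CAPTURE_BAND_MM = 1.0      # distance within which a plane "captures" the stylus
STIFFNESS_N_PER_MM = 0.2   # spring constant for the pull toward the plane

def grid_feedback(position_mm):
    """Per-axis haptic force toward the nearest grid plane, plus a flag per
    axis indicating whether the position is on a plane (for the sound cue)."""
    forces, on_plane = [], []
    for p in position_mm:                       # x, y, z in millimetres
        nearest = round(p / GRID_SPACING_MM) * GRID_SPACING_MM
        offset = nearest - p
        inside = abs(offset) <= CAPTURE_BAND_MM
        forces.append(STIFFNESS_N_PER_MM * offset if inside else 0.0)
        on_plane.append(inside)
    return forces, on_plane

# Example: only the y and z coordinates are within the capture band.
print(grid_feedback((3.2, 10.4, 59.6)))
```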
| User-Centered Design Proposals for Prototyping Haptic User Interfaces | | BIBAK | Full-Text | 110-120 | |
| Hans V. Bjelland; Kristian Tangeland | |||
| The range of applications for haptic user interfaces is wide, but although
haptics offer unique qualities to user interfaces, the rate of adoption and
implementation of haptics in commercialized products is relatively low. The
challenges of building low-cost flexible prototypes with haptics in the early
stages of product development are believed to be a contributing factor to this.
This paper addresses these specific challenges in relation to the user-centered
design process. A case where prototypes were used in the early project stage is
presented as an example of possibilities of prototyping haptic feedback.
Finally, general recommendations for how to prototype haptic user interfaces
that support both technological development and usability are listed. They
are: 1) Build on the tradition of user-centered design, 2) Prototype
from day one, 3) Substitute technology, 4) Build several different prototypes,
5) Develop a vocabulary, 6) Stick with the heuristics. These recommendations
can contribute to a better understanding of how haptics can be handled in the
design process as well as guide future haptic research. Keywords: prototyping; haptic user interfaces; user-centered design | |||
| Designing Eyes-Free Interaction | | BIBAK | Full-Text | 121-132 | |
| Ian Oakley; Jun-Seok Park | |||
| As the form factors of computational devices diversify, the concept of
eyes-free interaction is becoming increasingly relevant: it is no longer hard
to imagine use scenarios in which screens are inappropriate. However, there is
currently little consensus about this term. It is regularly employed in
different contexts and with different intents. One key consequence of this
multiplicity of meanings is a lack of easily accessible insights into how to
best build an eyes-free system. This paper seeks to address this issue by
thoroughly reviewing the literature, proposing a concise definition and
presenting a set of design principles. The application of these principles is
then elaborated through a case study of the design of an eyes-free motion input
system for a wearable device. Keywords: Eyes-free interaction; design principles; motion input | |||
| Beyond Clicks and Beeps: In Pursuit of an Effective Sound Design Methodology | | BIBA | Full-Text | 133-144 | |
| Antti Pirhonen; Kai Tuuri; Manne-Sakari Mustonen; Emma Murphy | |||
| Designing effective non-speech audio elements for a user-interface is a challenging task due to the complex nature of sounds and the changing contexts of non-visual interfaces. In this paper we present a design method, which is intended to take into account the complexity of audio design as well as the existing audio environment and the functional context of use. Central to this method is a rich use scenario, presented in the form of a radio play, which is used as a basis for the work of design panels. A previous version of the design method is analysed and specific practical issues are identified. Solutions to these issues are presented in the form of a modified version of the method. In the current version of the method, special attention has been paid to the development of a rich use scenario and the underlying persona. A case study is presented to illustrate the practical implementation of the modified design method and to support the proposed guidelines for its use. | |||