Do That, There: An Interaction Technique for Addressing In-Air Gesture
Systems
In-Air Gesture
/
Freeman, Euan
/
Brewster, Stephen
/
Lantz, Vuokko
Proceedings of the ACM CHI'16 Conference on Human Factors in Computing
Systems
2016-05-07
v.1
p.2319-2331
© Copyright 2016 ACM
Summary: When users want to interact with an in-air gesture system, they must first
address it. This involves finding where to gesture so that their actions can be
sensed, and how to direct their input towards that system so that they do not
also affect others or cause unwanted effects. This is an important problem
which lacks a practical solution. We present an interaction technique which
uses multimodal feedback to help users address in-air gesture systems. The
feedback tells them how ("do that") and where ("there") to gesture, using
light, audio and tactile displays. By doing that there, users can direct their
input to the system they wish to interact with, in a place where their gestures
can be sensed. We discuss the design of our technique and three experiments
investigating its use, finding that users can "do that" well (93.2%-99.9%)
while accurately (51mm-80mm) and quickly (3.7s) finding "there".
Interactive Light Feedback: Illuminating Above-Device Gesture Interfaces
Demonstrations
/
Freeman, Euan
/
Brewster, Stephen
/
Lantz, Vuokko
Proceedings of IFIP INTERACT'15: Human-Computer Interaction, Part IV
2015-09-14
v.4
p.478-481
Keywords: Above-device interaction; Gesture feedback; Gesture interaction; Interactive
light feedback; Mobile devices
© Copyright 2015 Springer International Publishing Switzerland
Summary: In-air hand gestures allow users to interact with mobile phones without
reaching out and touching them. Users need helpful and meaningful feedback
while they gesture, although mobile phones have limited feedback capabilities
because of their small screen sizes. Interactive light feedback illuminates the
surface surrounding a mobile phone, giving users visual feedback over a larger
area and without affecting on-screen content. We explore the design space for
interactive light and our demonstration shows how we can use this output
modality for gesture feedback.
Towards In-Air Gesture Control of Household Appliances with Limited Displays
Interactive Posters
/
Freeman, Euan
/
Brewster, Stephen
/
Lantz, Vuokko
Proceedings of IFIP INTERACT'15: Human-Computer Interaction, Part IV
2015-09-14
v.4
p.611-615
Keywords: In-air gestures; Household devices; Multimodal feedback
© Copyright 2015 Springer International Publishing Switzerland
Summary: Recent technologies allow us to interact with our homes in novel ways, such
as using in-air gestures for control. However, gestures require good feedback
and small appliances, like lighting controls and thermostats, have limited, or
no, display capabilities. Our research explores how other output types can be
used instead to give users feedback about their gestures, allowing small
devices to give useful feedback. We describe the Gesture Thermostat, a
gesture-controlled thermostat dial which gives multimodal gesture feedback.
Tactile Feedback for Above-Device Gesture Interfaces: Adding Touch to
Touchless Interactions
Oral Session 5: Mobile and Urban Interaction
/
Freeman, Euan
/
Brewster, Stephen
/
Lantz, Vuokko
Proceedings of the 2014 International Conference on Multimodal Interaction
2014-11-12
p.419-426
© Copyright 2014 ACM
Summary: Above-device gesture interfaces let people interact in the space above
mobile devices using hand and finger movements. For example, users could
gesture over a mobile phone or wearable without having to use the touchscreen.
We look at how above-device interfaces can also give feedback in the space over
the device. Recent haptic and wearable technologies give new ways to provide
tactile feedback while gesturing, letting touchless gesture interfaces give
touch feedback. In this paper we take a first detailed look at how tactile
feedback can be given during above-device interaction. We compare approaches
for giving feedback (ultrasound haptics, wearables and direct feedback) and
also look at feedback design. Our findings show that tactile feedback can
enhance above-device gesture interfaces.
Squeezy bracelet: designing a wearable communication device for tactile
interaction
/
Pakanen, Minna
/
Colley, Ashley
/
Häkkilä, Jonna
/
Kildal, Johan
/
Lantz, Vuokko
Proceedings of the 8th Nordic Conference on Human-Computer Interaction
2014-10-26
p.305-314
© Copyright 2014 ACM
Summary: While smartphones are increasing in size and feature complexity, new form
factors for simple communication devices are emerging. In this paper, we
present the design process for a wrist worn communication device, which enables
the user to send text messages over a paired mobile phone. The process includes
concept design, user evaluation, design iteration, prototype implementation,
and evaluation of alternative interaction techniques. Our particular focus is
on the use of naturally tactile interfaces in a wearable wristband form
factor. We present how users perceive deformable communication device concepts
and two alternative squeeze based interaction techniques.
Towards usable and acceptable above-device interactions
Poster Presentations
/
Freeman, Euan
/
Brewster, Stephen
/
Lantz, Vuokko
Proceedings of 2014 Conference on Human-Computer Interaction with Mobile
Devices and Services
2014-09-23
p.459-464
© Copyright 2014 ACM
Summary: Gestures above a mobile phone would let users interact with their devices
quickly and easily from a distance. While both researchers and smartphone
manufacturers develop new gesture sensing technologies, little is known about
how best to design these gestures and interaction techniques. Our research
looks at creating usable and socially acceptable above-device interaction
techniques. We present an initial gesture collection, a preliminary evaluation
of these gestures and some design recommendations. Our findings identify
interesting areas for future research and will help designers create better
gesture interfaces.
User experiences of mobile audio conferencing with spatial audio, haptics
and gestures
Oral session 2: communication
/
Rantala, Jussi
/
Müller, Sebastian
/
Raisamo, Roope
/
Suhonen, Katja
/
Väänänen-Vainio-Mattila, Kaisa
/
Lantz, Vuokko
Proceedings of the 2013 International Conference on Multimodal Interaction
2013-12-09
p.59-66
© Copyright 2013 ACM
Summary: Devices such as mobile phones have made it possible to take part in remote
audio conferences regardless of one's physical location. Mobile phones also
allow for new ways to interact with other conference participants. We present a
study evaluating the user experiences of a mobile audio conferencing system
that was augmented with spatial audio, haptics, and gestures. In a user study,
groups of participants compared the augmented audio conference, based on a
mobile phone and headset, to a traditional audio conference. The participants'
task was to use the two alternative systems in given discussion tasks. The
results of the subjective questionnaires showed that the augmented audio
conference was perceived as more stimulating (e.g. creative), while the
traditional audio conference was perceived as more practical (e.g.
straightforward). The results of the group interviews indicated that spatial
audio was the most desired feature, and that it had a positive effect on
participants' perception of the conversation. Based on our findings, guidelines
for the future development of similar systems are presented.
Fishing or a Z?: investigating the effects of error on mimetic and alphabet
device-based gesture interaction
Poster session
/
El Ali, Abdallah
/
Kildal, Johan
/
Lantz, Vuokko
Proceedings of the 2012 International Conference on Multimodal Interfaces
2012-10-22
p.93-100
© Copyright 2012 ACM
Summary: While gesture taxonomies provide a classification of device-based gestures
in terms of communicative intent, little work has addressed the usability
differences in manually performing these gestures. In this primarily
qualitative study, we investigate how two sets of iconic gestures that vary in
familiarity, mimetic and alphabetic, are affected under varying failed
recognition error rates (0-20%, 20-40%, 40-60%). Drawing on experiment logs,
video observations, subjects' feedback, and a subjective workload assessment
questionnaire, results revealed two main findings: a) mimetic gestures tend to
evolve into diverse variations (within the activities they mimic) under high
error rates, while alphabet gestures tend to become more rigid and structured
and b) mimetic gestures were tolerated under recognition error rates of up to
40%, while alphabet gestures incurred significant overall workload at error
rates as low as 20%. Thus, while alphabet gestures are more robust to recognition
errors in keeping their signature, mimetic gestures are more robust to
recognition errors from a usability and user experience standpoint, and thus
better suited for inclusion into mainstream device-based gesture interaction
with mobile phones.
Haptically augmented remote speech communication: a study of user practices
and experiences
Haptics and touch
/
Suhonen, Katja
/
Müller, Sebastian
/
Rantala, Jussi
/
Väänänen-Vainio-Mattila, Kaisa
/
Raisamo, Roope
/
Lantz, Vuokko
Proceedings of the 7th Nordic Conference on Human-Computer Interaction
2012-10-14
p.361-369
© Copyright 2012 ACM
Summary: Haptic technology provides a channel for interpersonal communication through
the sense of touch. In the development of novel haptic communication devices,
it is essential to explore people's use behaviors and perceptions of such a
communication channel. To this end, we conducted a laboratory study on
haptically augmented remote interpersonal communication. Participant pairs
tested a communication system that allowed them to send squeezing and thermal
feedback to each other's forearm during speech discussion. We explored the use
practices and user experience of this setup and compared it to traditional
speech-only communication. The findings indicate that squeezing was experienced
as a more versatile and immediate type of feedback than thermal feedback.
Warmth and cold, on the other hand, were useful for communicating positive and
negative meanings. Compared to speech-only communication, the added haptic modality
allowed conveying emphases, emotions, and touches related to the discussion,
and increased the feeling of closeness between the pairs.
Pressages: augmenting phone calls with non-verbal messages
Tactile & grip
/
Hoggan, Eve
/
Stewart, Craig
/
Haverinen, Laura
/
Jacucci, Giulio
/
Lantz, Vuokko
Proceedings of the 2012 ACM Symposium on User Interface Software and
Technology
2012-10-07
v.1
p.555-562
© Copyright 2012 ACM
Summary: ForcePhone is a mobile synchronous haptic communication system. During phone
calls, users can squeeze the side of the device and the pressure level is
mapped to vibrations on the recipient's device. The pressure/vibrotactile
messages supported by ForcePhone are called pressages. Using a lab-based study
and a small field study, this paper addresses the following questions: how can
haptic interpersonal communication be integrated into a standard mobile device?
What is the most appropriate feedback design for pressages? What types of
non-verbal cues can be represented by pressages? Do users make use of pressages
during their conversations? The results of this research indicate that such a
system has value as a communication channel in real-world settings with users
expressing greetings, presence and emotions through pressages.
Augmented reality target finding based on tactile cues
Multimodal applications and techniques (poster)
/
Ahmaniemi, Teemu Tuomas
/
Lantz, Vuokko Tuulikki
Proceedings of the 2009 International Conference on Multimodal Interfaces
2009-11-02
p.335-342
Keywords: Fitts' Law, augmented reality, haptics, pointing
© Copyright 2009 ACM
Summary: This study is based on a user scenario where augmented reality targets could
be found by scanning the environment with a mobile device and getting a tactile
feedback exactly in the direction of the target. In order to understand how
accurately and quickly the targets can be found, we prepared an experiment
setup where a sensor-actuator device consisting of orientation tracking
hardware and a tactile actuator were used. The targets with widths 5ð,
10ð, 15ð, 20ð, and 25ð and various distances between each other
were rendered in a 90ð -wide space successively, and the task of the test
participants was to find them as quickly as possible. The experiment consisted
of two conditions: the first one provided tactile feedback only when pointing
was on the target and the second one included also another cue indicating the
proximity of the target. The average target finding time was 1.8 seconds. The
closest targets appeared not to be the easiest to find, which was attributed to
the adapted scanning velocity causing users to miss the closest targets. We also
found that our data did not correlate well with Fitts' model, which may have
been caused by the non-normal data distribution. After filtering out 30% of the
least representative data items, the correlation reached up to 0.71. Overall,
performance did not differ significantly between conditions. The only
significant improvement offered by the close-to-target cue occurred in the
tasks where the targets were furthest from each other.
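The Fitts'-model correlation mentioned in the abstract above can be sketched in a few lines: fit movement time against the index of difficulty, MT = a + b·log2(D/W + 1), and compute the Pearson correlation. The trial data, variable names, and the Shannon formulation of the index of difficulty here are illustrative assumptions, not the study's data or code:

```python
import math

# Hypothetical (distance, width, movement-time) trials -- not the study's data.
trials = [(30, 5, 2.4), (60, 10, 2.1), (45, 15, 1.6), (80, 20, 1.7), (50, 25, 1.2)]

# Index of difficulty, Shannon formulation: ID = log2(D/W + 1).
ids = [math.log2(d / w + 1) for d, w, _ in trials]
mts = [mt for _, _, mt in trials]

# Ordinary least-squares fit of MT against ID.
n = len(trials)
mean_x, mean_y = sum(ids) / n, sum(mts) / n
sxx = sum((x - mean_x) ** 2 for x in ids)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(ids, mts))
syy = sum((y - mean_y) ** 2 for y in mts)

b = sxy / sxx                    # slope (seconds per bit)
a = mean_y - b * mean_x          # intercept (seconds)
r = sxy / math.sqrt(sxx * syy)   # Pearson correlation of MT with ID

print(f"MT = {a:.2f} + {b:.2f} * ID, r = {r:.2f}")
```

A poor r on raw data, as the study reports, would motivate exactly the kind of outlier filtering the abstract describes before the correlation rises.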
Hand gesture recognition and virtual game control based on 3D accelerometer
and EMG sensors
Short papers
/
Zhang, Xu
/
Chen, Xiang
/
Wang, Wen-hui
/
Yang, Ji-hai
/
Lantz, Vuokko
/
Wang, Kong-qiao
Proceedings of the 2009 International Conference on Intelligent User
Interfaces
2009-02-08
p.401-406
Keywords: accelerometer, electromyogram, gesture recognition, human computer
interaction
© Copyright 2009 ACM
Summary: This paper describes a novel hand gesture recognition system that utilizes
both multi-channel surface electromyogram (EMG) sensors and a 3D accelerometer
(ACC) to realize user-friendly interaction between humans and computers. Signal
segments of meaningful gestures are determined from the continuous EMG signal
inputs. Multi-stream Hidden Markov Models consisting of EMG and ACC streams are
utilized as a decision fusion method to recognize hand gestures. This paper also
presents a virtual Rubik's Cube game that is controlled by the hand gestures
and is used for evaluating the performance of our hand gesture recognition
system. For a set of 18 kinds of gestures, each trained with 10 repetitions,
the average recognition accuracy was about 91.7% in real application. The
proposed method facilitates intelligent and natural control based on gesture
interaction.
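The decision-fusion step described above (combining EMG and ACC streams) can be illustrated at the score level: each stream's HMMs produce a log-likelihood per gesture class, and the fused decision is the class with the best weighted sum. The gesture names, log-likelihood values, and equal stream weights below are hypothetical, not the paper's implementation:

```python
# Hypothetical per-class log-likelihoods from two independently scored
# streams (EMG HMMs and ACC HMMs); classes and numbers are illustrative.
emg_scores = {"rotate_cw": -42.0, "rotate_ccw": -48.5, "flip": -45.1}
acc_scores = {"rotate_cw": -30.2, "rotate_ccw": -29.8, "flip": -36.4}

def fuse_and_classify(stream_a, stream_b, w_a=0.5, w_b=0.5):
    """Late (decision-level) fusion: weighted sum of per-stream
    log-likelihoods, then pick the highest-scoring gesture class."""
    fused = {g: w_a * stream_a[g] + w_b * stream_b[g] for g in stream_a}
    return max(fused, key=fused.get), fused

gesture, fused = fuse_and_classify(emg_scores, acc_scores)
print(gesture)
```

Weighting the streams lets one modality compensate for the other, e.g. ACC disambiguating gestures with similar muscle activity.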
Perception of dynamic audiotactile feedback to gesture input
Multimodal systems I (poster session)
/
Ahmaniemi, Teemu Tuomas
/
Lantz, Vuokko
/
Marila, Juha
Proceedings of the 2008 International Conference on Multimodal Interfaces
2008-10-20
p.85-92
Keywords: audio, gesture, haptics
© Copyright 2008 ACM
Summary: In this paper we present results of a study where perception of dynamic
audiotactile feedback to gesture input was examined. Our main motivation was to
investigate how users' active input and different modality conditions affect
the perception of the feedback. The experimental prototype in the study was a
handheld sensor-actuator device that responds dynamically to the user's hand
movements, creating an impression of a virtual texture. The feedback was
designed so that the amplitude and frequency of texture were proportional to
the overall angular velocity of the device. We used four different textures
with different velocity responses. The feedback was presented to the user by
the tactile actuator in the device, by audio through headphones, or by both.
During the experiments, textures were switched in random intervals and the task
of the user was to detect the changes while moving the device freely. The
performances of the users with audio or audiotactile feedback were quite equal
while tactile feedback alone yielded poorer performance. The texture design
did not influence movement velocity or periodicity, but tactile feedback
induced the most energetic motion and audio feedback the least. In addition,
significantly better performance was achieved with slower motion. We also found
that significant learning happened over time; detection accuracy increased
significantly during and between the experiments. The masking noise used in
tactile modality condition did not significantly influence the detection
accuracy when compared to acoustic blocking but it increased the average
detection time.
Dynamic audiotactile feedback in gesture interaction
Short papers
/
Ahmaniemi, Teemu
/
Lantz, Vuokko
/
Marila, Juha
Proceedings of 10th Conference on Human-computer interaction with mobile
devices and services
2008-09-02
p.339-342
Keywords: audio, gesture interaction, haptics
© Copyright 2008 ACM
Summary: Proper feedback is one of the challenges in gesture interaction. Providing
continuous feedback during the execution of a gesture increases the feeling
of control and can help the user perform the task more efficiently. In this
paper we introduce an experimental handheld sensor-actuator device that
responds dynamically to the user's motion. With the device we explored the
potential of continuous audiotactile feedback in closed-loop gesture
interaction, designed simple synthesis methods for feedback, and tested the
user perception. We designed four simple textures that respond to overall
angular velocity of the device, all with different velocity responses. The
system enabled us to examine how well subjects can distinguish the textures on
the fly. Our preliminary findings show that the audio modality dominates
perception. Tactile feedback worked quite well alone, but the modalities
together did not lead to better performance than audio alone.
Stroke break analysis: a practical method to study timeout value for
handwriting recognition input
Fact finding
/
Cui, Yanqing
/
Lantz, Vuokko
Proceedings of 7th Conference on Human-computer interaction with mobile
devices and services
2005-09-19
p.263-266
© Copyright 2005 ACM
Summary: The handwriting recognition (HWR) input method is considered one of
the most usable text entry methods for handheld devices, especially for
languages with large and complicated character sets such as Chinese. The paper
studies stroke break times within handwritten characters and presents a new
method for setting HWR timeout by examining the break time distributions. For
multi-stroke character HWR input, a timeout is widely used as a segmentation
technique to initiate the recognition process. In this paper, we examine the
largest stroke break time in each character and explore the relationship
between break time distribution and optimal HWR timeout. The study used Chinese
as test material and the test independent variables were writing condition
(input box, full screen) and user's posture while they were writing (hold
device in hand, keep device on table). The main findings are: (1) the stroke
break times are similar in full screen and input box conditions, though the
users tend to write larger characters in the full screen condition. (2) The stroke
break times fit into a tight distribution, so it is feasible to estimate an
optimal HWR timeout by studying the stroke break time distribution. A
nonparametric histogram method was used to model the stroke break distributions;
it showed that typical Chinese HWR default timeouts lie around the 99th
percentile of the distribution.
significant between individual users. The stroke break time analysis can also
be applied to design an HWR timeout customization scale.
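The percentile-based timeout estimation described above can be sketched as follows: collect within-character stroke break times, then set the recognition timeout near a high percentile so that almost no character is segmented mid-writing. The synthetic break-time sample and the nearest-rank percentile variant are illustrative assumptions, not the paper's data or method details:

```python
import random

# Synthetic within-character stroke-break sample in milliseconds
# (illustrative only; the study measured real handwriting).
random.seed(7)
breaks_ms = sorted(random.gauss(180, 40) for _ in range(1000))

def percentile(sorted_values, p):
    """Nearest-rank percentile of a pre-sorted sample (0 < p <= 100)."""
    k = max(0, min(len(sorted_values) - 1,
                   round(p / 100 * len(sorted_values)) - 1))
    return sorted_values[k]

# A timeout at the 99th percentile cuts off ~1% of characters prematurely.
timeout_ms = percentile(breaks_ms, 99.0)
print(f"suggested HWR timeout: {timeout_ms:.0f} ms")
```

Raising or lowering the percentile trades premature segmentation against recognition latency, which is what a timeout customization scale would expose to the user.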
Rhythmic interaction with a mobile device
Interaction techniques and devices
/
Lantz, Vuokko
/
Murray-Smith, Roderick
Proceedings of the Third Nordic Conference on Human-Computer Interaction
2004-10-23
p.97-100
© Copyright 2004 ACM
Summary: We describe a rhythmic interaction mechanism for mobile devices. A PocketPC
with a three-degree-of-freedom linear accelerometer is used as the
experimental platform for data acquisition. Dynamic Movement Primitives are
used to learn the limit cycle behavior associated with the rhythmic gestures.
We outline the open technical and user experience challenges in the development
of usable rhythmic interfaces.