Eye Movement Biometrics on Wearable Devices: What Are the Limits?
Late-Breaking Works: Engineering of Interactive Systems
Abdulin, Evgeniy / Rigas, Ioannis / Komogortsev, Oleg
Extended Abstracts of the ACM CHI'16 Conference on Human Factors in
Computing Systems
2016-05-07
v.2
p.1503-1509
© Copyright 2016 ACM
Summary: This paper presents a study of the prospects of eye tracking on wearable
devices and its use for eye movement biometrics. In such devices, reducing
power consumption is very important, and this can be partially achieved by
reducing the size of the eye-tracking imaging sensor. In this preliminary
work, we conduct two experiments: first, we investigate the minimum captured
eye-image resolution needed to achieve acceptable eye-tracking precision;
then, we explore the effects of degraded precision, simulated by adding
dithering noise, on the applied scenario of eye movement biometrics. Our
results provide detailed insights into the expected behavior of eye movement
biometrics in resource-constrained systems.
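The precision degradation described in this abstract can be illustrated with a
minimal sketch: add zero-mean positional noise to recorded gaze samples and
measure the resulting drop in RMS precision. The Gaussian noise model, function
names, and parameter values below are illustrative assumptions, not the paper's
exact procedure.

```python
import numpy as np

def add_dithering_noise(gaze_deg, noise_sd_deg, seed=0):
    """Add zero-mean Gaussian positional noise (degrees of visual angle)
    to an (N, 2) array of gaze samples. Noise model is an assumption
    for illustration, not the paper's exact method."""
    rng = np.random.default_rng(seed)
    return gaze_deg + rng.normal(0.0, noise_sd_deg, size=gaze_deg.shape)

def rms_precision(gaze_deg):
    """Precision as the RMS of sample-to-sample angular distances,
    a common way to report eye-tracker precision within a fixation."""
    diffs = np.diff(gaze_deg, axis=0)  # sample-to-sample steps
    return float(np.sqrt(np.mean(np.sum(diffs**2, axis=1))))

clean = np.zeros((1000, 2))  # a perfectly still, noise-free fixation
noisy = add_dithering_noise(clean, noise_sd_deg=0.5)
print(rms_precision(clean), rms_precision(noisy))  # noise raises the RMS value
```

Increasing `noise_sd_deg` models progressively coarser sensors, which is the
kind of controlled degradation the biometric pipeline would then be evaluated
against.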
Confirmation Responses: In-context, Visible, & Predictable Design versus
Popup Windows
Late-Breaking Works: Usable, Useful, and Desirable
Abdulin, Evgeniy / Billman, Dorrit
Extended Abstracts of the ACM CHI'16 Conference on Human Factors in
Computing Systems
2016-05-07
v.2
p.2969-2975
© Copyright 2016 ACM
Summary: Protecting users and the systems they use from slips is an open problem
that popup windows have not solved. We propose a new approach to action
confirmation or cancellation based on three key design principles: in-context
presentation, predictability, and visibility. In an experiment comparing an
initial design based on this approach to traditional popup windows, the new
design reduced execution time without a significant increase in slips. Future
research should explore providing information in accord with the proposed
principles and applying them to slip prevention.
Detecting the onset of eye fatigue in a live framework
Video & demo abstracts
Lohr, Dillon J. / Abdulin, Evgeny / Komogortsev, Oleg V.
Proceedings of the 2016 Symposium on Eye Tracking Research &
Applications
2016-03-14
p.315-316
© Copyright 2016 ACM
Summary: This document describes a method for detecting the onset of eye fatigue and
how it could be implemented in an existing live framework. The proposed method,
which uses fixation data, relies less heavily on the eye tracker's sampling
rate than methods based on saccade data do, making it more suitable for
lower-cost eye trackers such as those in mobile and wearable devices. Being
able to detect eye fatigue with such trackers makes it possible to react to
developing fatigue in virtually any environment, for example by alerting
drivers who appear fatigued that they may want to pull over. It could also aid
in developing more user-friendly interfaces by noting at which point a user
becomes fatigued while navigating the interface.
User Eye Fatigue Detection via Eye Movement Behavior
WIP Theme: Gesture and Multimodal
Abdulin, Evgeniy / Komogortsev, Oleg
Extended Abstracts of the ACM CHI'15 Conference on Human Factors in
Computing Systems
2015-04-18
v.2
p.1265-1270
© Copyright 2015 ACM
Summary: In this study we propose and evaluate a novel approach for detecting
physical eye fatigue. The approach is based on analyzing recorded eye
movements via so-called behavioral scores. These easy-to-compute scores can
be obtained immediately after a calibration procedure by processing basic eye
movements, such as fixations and saccades, extracted from the raw eye
positional data recorded by an eye tracker. The results, based on data from
36 volunteers, indicate that one of the behavioral scores, the Fixational
Qualitative Score, is more sensitive to the onset of eye fatigue than
established methods based on saccadic characteristics alone.
Using the keystroke-level model for designing user interface on middle-sized
touch screens
Text entry & typing
Abdulin, Evgeniy
Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems
2011-05-07
v.2
p.673-686
© Copyright 2011 ACM
Summary: The Keystroke-Level Model (KLM) was developed to accurately predict task
execution time for mouse-and-keyboard systems. Middle-sized touch screens are
becoming much more popular, so it is important to determine whether the KLM
can provide useful predictions for these interfaces as well. KLMs were
created with the CogTool software for three touch-screen interfaces for
integrated control systems and were compared against experimental data. The
results showed that the KLM prediction error for middle-sized touch screens
was under 5%. The conclusion is that the KLM provides an acceptable level of
accuracy for predicting task execution times in this environment.
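The core of KLM prediction is simple: decompose a task into primitive
operators and sum their standard durations. The sketch below uses the classic
Card, Moran, and Newell desktop operator estimates; the example operator
sequence is a hypothetical illustration, not a task from the paper, and
touch-screen-specific operator values would differ.

```python
# Classic KLM operator time estimates (seconds), from
# Card, Moran & Newell; touch-specific values would be recalibrated.
KLM_OPERATORS = {
    "K": 0.28,  # keystroke / button press (average typist)
    "P": 1.10,  # point to a target with a pointing device
    "H": 0.40,  # home hands between keyboard and pointing device
    "M": 1.35,  # mental preparation
}

def klm_predict(sequence):
    """Predict task execution time (seconds) for an operator
    sequence string, e.g. "MPK" = think, point, press."""
    return sum(KLM_OPERATORS[op] for op in sequence)

# Hypothetical two-step task: select a menu item, then confirm.
print(klm_predict("MPKMPK"))  # roughly 5.46 s
```

Tools such as CogTool automate this decomposition from a storyboarded
interface, which is how the models in the study were built.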