HCI Bibliography : Search Results
Database updated: 2016-05-10 | Searches since 2006-12-01: 32,242,745
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server on 2015-05-12 and again on 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: Akkil_D* Results: 8 Sorted by: Date
Gaze Augmentation in Egocentric Video Improves Awareness of Intention (VR for Collaboration) / Akkil, Deepak / Isokoski, Poika / Proceedings of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.1 p.1573-1584
ACM Digital Library Link
Summary: Video communication using head-mounted cameras could be useful to mediate shared activities and support collaboration. The growing popularity of wearable gaze trackers presents an opportunity to overlay gaze information on the egocentric video. We hypothesized three potential benefits of gaze-augmented egocentric video in collaborative scenarios: supporting deictic referencing, enabling grounding in communication, and enabling better awareness of the collaborator's intentions. Previous research on using egocentric videos for real-world collaborative tasks has failed to show clear benefits of gaze-point visualization. We designed a study, deconstructing a collaborative car-navigation scenario, to specifically target the value of gaze-augmented video for intention prediction. Our results show that viewers of gaze-augmented video could predict the direction taken by a driver at a four-way intersection more accurately and more confidently than viewers of the same video without the superimposed gaze point. Our study demonstrates that gaze augmentation can be useful and encourages further study in real-world collaborative scenarios.

GazeTorch: Enabling Gaze Awareness in Collaborative Physical Tasks (Late-Breaking Works: Collaborative Technologies) / Akkil, Deepak / James, Jobin Mathew / Isokoski, Poika / Kangas, Jari / Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.1151-1158
ACM Digital Library Link
Summary: We present GazeTorch, a novel interface that provides gaze awareness during remote collaboration on physical tasks. GazeTorch uses a spotlight to display gaze information of the remote helper on the physical task space of the worker. We conducted a preliminary user study to evaluate users' subjective opinions on the quality of collaboration, using GazeTorch and a camera-only setup. Our preliminary results suggest that the participants felt GazeTorch made collaboration easier, made referencing and identifying objects effortless, and improved the worker's confidence that the task was completed accurately. We conclude by presenting some novel application scenarios for the concept of augmenting real-time gaze information in the physical world.

PursuitAdjuster: an exploration into the design space of smooth pursuit-based widgets (Video & demo abstracts) / Špakov, Oleg / Isokoski, Poika / Kangas, Jari / Akkil, Deepak / Majaranta, Päivi / Proceedings of the 2016 Symposium on Eye Tracking Research & Applications 2016-03-14 p.287-290
ACM Digital Library Link
Summary: In a study with 12 participants we compared two smooth pursuit-based widgets and one dwell-time-based widget in adjusting a continuous value. The circular smooth pursuit widget was found to be about as efficient as the dwell-based widget in our color-matching task. The scroll-bar-shaped smooth pursuit widget exhibited lower performance and lower user ratings.
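The abstract does not spell out how a smooth pursuit widget decides that the user is following it; a common approach in the pursuit-interaction literature is to correlate the gaze trajectory with the motion of the widget's moving element. The sketch below illustrates that idea under assumed names (`pursuit_match`, the 0.8 threshold, and the window length are illustrative choices, not values from the paper):

```python
import math

def pearson(a, b):
    """Pearson correlation between two equal-length sample sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb) if sa and sb else 0.0

def pursuit_match(gaze_x, gaze_y, target_x, target_y, threshold=0.8):
    """Report that the gaze window is 'following' the moving widget
    element when both axes of the gaze trajectory correlate strongly
    with the element's motion. Threshold is an illustrative value."""
    return (pearson(gaze_x, target_x) > threshold and
            pearson(gaze_y, target_y) > threshold)
```

A circular widget would move its element along a circle and, while `pursuit_match` holds over a sliding window, keep adjusting the continuous value in the direction of motion.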

Feedback for Smooth Pursuit Gaze Tracking Based Control / Kangas, Jari / Špakov, Oleg / Isokoski, Poika / Akkil, Deepak / Rantala, Jussi / Raisamo, Roope / Proceedings of the 2016 Augmented Human International Conference 2016-02-25 p.6
ACM Digital Library Link
Summary: Smart glasses, like Google Glass or Microsoft HoloLens, can be used as interfaces that expand human perceptual, cognitive, and actuation capabilities in many everyday situations. Conventional manual interaction techniques, however, are not convenient with smart glasses, whereas eye trackers can be built into the frames. This makes gaze tracking a natural input technology for smart glasses. Not much is known about interaction techniques for gaze-aware smart glasses. This paper adds to this knowledge by comparing feedback modalities (visual, auditory, haptic, none) in a continuous adjustment technique for smooth pursuit gaze tracking. Smooth pursuit-based gaze tracking has been shown to be a flexible and calibration-free method for spontaneous interaction situations. Continuous adjustment, on the other hand, is a technique that is needed in many everyday situations, such as adjusting the volume of a sound system or the intensity of a light source. We measured user performance and preference in a task where participants matched the shades of two gray rectangles. The results showed no statistically significant differences in performance, but clear user preference and acceptability for haptic and audio feedback.

Glance Awareness and Gaze Interaction in Smartwatches (WIP Theme: Gesture and Multimodal) / Akkil, Deepak / Kangas, Jari / Rantala, Jussi / Isokoski, Poika / Špakov, Oleg / Raisamo, Roope / Extended Abstracts of the ACM CHI'15 Conference on Human Factors in Computing Systems 2015-04-18 v.2 p.1271-1276
ACM Digital Library Link
Summary: Smartwatches are widely available and increasingly adopted by consumers. The most common way of interacting with smartwatches is either touching a screen or pressing buttons on the sides. However, such techniques require using both hands. We propose glance awareness and active gaze interaction as alternative techniques to interact with smartwatches. We describe an experiment conducted to understand user preferences for visual and haptic feedback on a "glance" at the wristwatch. Following the glance, the users interacted with the watch using gaze gestures. Our results showed that user preferences differed depending on the complexity of the interaction. No clear preference emerged for complex interaction. For simple interaction, haptics was the preferred glance feedback modality.

Gaze gestures and haptic feedback in mobile devices (Force input and haptic feedback) / Kangas, Jari / Akkil, Deepak / Rantala, Jussi / Isokoski, Poika / Majaranta, Päivi / Raisamo, Roope / Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.1 p.435-438
ACM Digital Library Link
Summary: Anticipating the emergence of gaze-tracking-capable mobile devices, we are investigating the use of gaze as an input modality in handheld mobile devices. We conducted a study combining gaze gestures with vibrotactile feedback. Gaze gestures were used as an input method in a mobile device, and vibrotactile feedback as a new alternative way to confirm interaction events. Our results show that vibrotactile feedback significantly improved the use of gaze gestures. The tasks were completed faster and rated easier and more comfortable when vibrotactile feedback was provided.
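The abstract does not detail how gaze gestures are recognized; a typical minimal scheme quantizes the gaze path into directional strokes and triggers feedback when a stroke completes. The sketch below is illustrative only (the function name `strokes` and the 80-pixel movement threshold are assumptions, not details from the paper):

```python
def strokes(points, min_dist=80):
    """Quantize a sequence of (x, y) gaze points into left/right/up/down
    strokes. Movements shorter than min_dist pixels are treated as noise
    and ignored; consecutive identical directions are merged."""
    dirs = []
    px, py = points[0]
    for x, y in points[1:]:
        dx, dy = x - px, y - py
        if dx * dx + dy * dy < min_dist * min_dist:
            continue  # too small to count as a deliberate stroke
        if abs(dx) >= abs(dy):
            d = 'right' if dx > 0 else 'left'
        else:
            d = 'down' if dy > 0 else 'up'
        if not dirs or dirs[-1] != d:
            dirs.append(d)
        px, py = x, y
    return dirs
```

In a setup like the one studied, each completed stroke (each element appended to the list) would be the natural moment to fire a vibrotactile pulse confirming the event.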

Glasses with haptic feedback of gaze gestures (Works-in-progress) / Rantala, Jussi / Kangas, Jari / Akkil, Deepak / Isokoski, Poika / Raisamo, Roope / Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.2 p.1597-1602
ACM Digital Library Link
Summary: We introduce eyeglasses that present haptic feedback when using gaze gestures for input. The glasses utilize vibrotactile actuators to provide gentle stimulation to three locations on the user's head. We describe two initial user studies that were conducted to evaluate the ease of recognizing feedback locations and participants' preferences for combining the feedback with gaze gestures. The results showed that feedback from a single actuator was the easiest to recognize and also preferred when used with gaze gestures. We conclude by presenting future use scenarios that could benefit from gaze gestures and haptic feedback.

TraQuMe: a tool for measuring the gaze tracking quality (Poster abstracts) / Akkil, Deepak / Isokoski, Poika / Kangas, Jari / Rantala, Jussi / Raisamo, Roope / Proceedings of the 2014 Symposium on Eye Tracking Research & Applications 2014-03-26 p.327-330
ACM Digital Library Link
Summary: Consistent measuring and reporting of gaze data quality is important in research that involves eye trackers. We have developed TraQuMe: a generic system to evaluate gaze data quality. The quality measurement is fast, and the interpretation of the results is aided by graphical output. Numeric data is saved for reporting aggregate metrics for the whole experiment. We tested TraQuMe in the context of a novel hidden calibration procedure that we developed to aid experiments in which participants should not know that their gaze is being tracked. The quality of tracking data after the hidden calibration procedure was very close to that obtained with the Tobii T60 tracker's built-in 2-point, 5-point, and 9-point calibrations.
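The abstract does not specify TraQuMe's exact metrics; two standard gaze-quality measures in the eye-tracking literature are accuracy (mean offset of gaze samples from a known target) and precision (dispersion of successive samples). The sketch below computes both under those conventional definitions; the function name and units are illustrative, not taken from the tool:

```python
import math

def gaze_quality(samples, target):
    """Given gaze samples [(x, y), ...] recorded while the user fixates
    a known target point, return (accuracy, precision):
    accuracy  = mean Euclidean offset from the target,
    precision = RMS of sample-to-sample distances (tracker noise)."""
    tx, ty = target
    offsets = [math.hypot(x - tx, y - ty) for x, y in samples]
    accuracy = sum(offsets) / len(offsets)
    steps = [math.hypot(x2 - x1, y2 - y1)
             for (x1, y1), (x2, y2) in zip(samples, samples[1:])]
    precision = math.sqrt(sum(s * s for s in steps) / len(steps))
    return accuracy, precision
```

A tool in this vein would show such numbers per fixation point on screen (the graphical output the abstract mentions) and log them for aggregate reporting.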