
Companion Proceedings of the 2014 International Conference on Intelligent User Interfaces

Fullname: Companion Publication of the 19th International Conference on Intelligent User Interfaces
Editors: Tsvi Kuflik; Oliviero Stock; Joyce Chai; Antonio Krüger
Location: Haifa, Israel
Dates: 2014-Feb-24 to 2014-Feb-27
Volume: 2
Publisher: ACM
Standard No: ISBN 978-1-4503-2729-9; hcibib: IUI14-2
Papers: 24
Pages: 88
  1. IUI 2014-02-24 Volume 2
    1. Demonstrations
    2. Workshop summaries
    3. Doctoral consortiums

IUI 2014-02-24 Volume 2

Demonstrations

Deploying recommender system for the masses pp. 1-4
  David Ben Shimon; Michael Friedman; Johannes Hoerle; Alexander Tsikinovsky; Roland Gude; Rodion Aluchanov
Many small and mid-sized e-businesses wish to integrate a recommender system into their website. Integrating an existing recommender system into a website often requires certain expertise and programming effort, and thus incurs substantial investment that may not be justified by the added value of the recommender system. This demo presents a solution for integrating a recommender system as a service into an existing e-business without any programming effort. The integration method is analogous to Google AdSense integration, and the business model is adapted from the advertising world. Initial feedback from real website owners indicates that such integration greatly benefits both sides: the website owner and the Recommender System (RS) provider.
Microcosm: visual discovery, exploration and analysis of social communities pp. 5-8
  Haggai Roitman; Ariel Raviv; Shay Hummel; Shai Erera; David Konopniki
Social communities play an important role in many domains. While a lot of attention has been given to developing efficient methods for detecting and analyzing social communities, providing intuitive search interfaces for end-users who wish to discover and explore such communities remains a great challenge. To fill this gap, in this demonstration we present Microcosm: a holistic solution for visual discovery, exploration and analysis of social communities.
Enhancing understanding of safety aspects in embedded systems through an interactive visual tool pp. 9-12
  Ragaad AlTarawneh; Jens Bauer; Shah Rukh Humayoun; Achim Ebert; Peter Liggesmeyer
In this work, we present a demonstration of a visual interactive tool called ESSAVis that helps engineers collaborate in understanding the failure mechanisms of complex embedded systems. ESSAVis provides a 2Dplus3D visual user interface that intuitively integrates the different data sets related to embedded-system failure mechanisms. The tool accepts a component fault tree (CFT) model describing a specific hazard in the underlying system, and a CAD model describing the geometry of the system components. In this paper, we present the different interaction options of ESSAVis that are used to intuitively extract safety aspects of the underlying embedded system.
Multi-finger AR typing interface for mobile devices pp. 13-16
  Satoshi Sagara; Masakazu Higuchi; Takashi Komuro
In this paper, we propose a user interface that enables multi-finger typing in the space behind a mobile device. Using augmented reality (AR), a virtual keyboard is superimposed on the rear camera image, and the hand region of the camera image is superimposed on that image in turn, which makes it possible to perform input operations as if there were a real keyboard. The system recognizes only key-pressing actions and does not recognize the hand or fingers, which enables stable recognition and multi-finger input. Further, key typing anywhere on a plane, or in the air, is possible. A demonstration using an experimental device showed that multi-finger input with a virtual keyboard displayed on the screen was realized.
See-through mobile AR system for natural 3D interaction pp. 17-20
  Yuko Unuma; Takehiro Niikura; Takashi Komuro
In this paper, we propose an interaction system that displays see-through images on a mobile display and allows a user to interact with virtual objects overlaid on the see-through image using the hand. In this system, a camera that tracks the user's viewpoint is attached to the front of the mobile display, and a depth camera that captures color and depth images of the user's hand and the background scene is attached to its back. Natural interaction with virtual objects using the hand is realized by displaying images so that the appearance of the space seen through the mobile display is consistent with that of the real space from the user's viewpoint. We implemented two applications on the system and showed its usefulness for various AR applications.
Mobile personal healthcare mediated by virtual humans pp. 21-24
  Anton Leuski; Rasiga Gowrisankar; Todd Richmond; Ari Shapiro; Yuyu Xu; Andrew Feng
We demonstrate Ally -- a prototype interface for a consumer-level medical diagnostic device. It is an interactive virtual character, a Virtual Human (VH), that listens to the user's concerns, collects and processes sensor data, offers advice, guides the user through self-administered medical tests, and answers the user's questions. The primary focus of this demo is on the VH; we describe and demonstrate the technologies for language analysis, dialogue management, response generation and presentation. The sensing and medical decision-making components are simulated in the current system, but possible applications and extensions are discussed.
Creative user centric inspirational search pp. 25-28
  Fotis Paraskevopoulos; Maria Taramigkou; Efthimios Bothos; Dimitris Apostolou; Gregoris Mentzas
This demo paper describes CRUISE (Creative User Centric Inspirational Search), which aims to leverage user inspiration in information-seeking activities. CRUISE is an interactive exploratory search tool that combines diversification of content and sources with a user interface design that visualizes cues from the social chatter generated on microblogging services such as Twitter, and lets users interactively explore the available information space. The tool is based on the observation that users often use social chatter to follow links and initiate information-seeking activities that can lead to unexpected discoveries, which in turn can inspire them.
Demo: making plans scrutable with argumentation and natural language generation pp. 29-32
  Nava Tintarev; Roman Kutlak
Autonomous systems perform tasks without human guidance. Techniques for making autonomous systems scrutable and, hence, more transparent are required to support humans working with such systems. The Scrutable Autonomous Systems (SAsSy) demo shows a novel way of combining argumentation and natural language generation to produce a human-understandable explanation dialogue. By interacting with SAsSy, users are able to ask why a certain plan was selected for execution and why other alternatives were not, as well as to modify information in the system.
A facial affect mapping engine pp. 33-36
  Leonardo Impett; Peter Robinson; Tadas Baltrusaitis
Facial expressions play a crucial role in human interaction. Interactive digital games can help teach people both to express and to recognise them. Such interactive games can benefit from the ability to alter user expressions dynamically and in real time. In this demonstration, we present the Facial Affect Mapping Engine (FAME), a framework for mapping and manipulating facial expressions across images and video streams. Our system is fully automatic, runs in real time, and does not require any specialist hardware. FAME presents new possibilities for the designers of intelligent interactive digital games.
Visualizing sentiment: do you see what I mean? pp. 37-40
  Alan J. Wecker; Einat Minkov; Osnat Mokryn; Joel Lanir; Tsvi Kuflik
Many tools exist for extracting and visualizing key information from a corpus of text documents. Often, however, one would like to assess the sentiment and feelings that arise from a single document. This paper describes an interactive service that visualizes the sentiment of a specific document. The service enables the user to visualize the sentiment polarity of each paragraph to get a detailed impression; to quickly detect the polarity of emotional words; to identify subjective sentences within the text; and to see the grade level of language used in each sentence. Participants in an initial qualitative evaluation found the service fast and useful.
SUBVERTISER: mocking ads through mobile phones pp. 41-44
  Lorenzo Gatti; Marco Guerini; Oliviero Stock; Carlo Strapparava
As advertisements on street posters get more and more aggressive, our basic cognitive defense -- aimed at not perceiving those messages -- is no longer enough. One advanced defensive technique is based on transforming the perceived message into something different from what was originally meant. The demo is based on a smartphone application that creatively modifies the linguistic expression in a virtual copy of a poster. The mobile system is inspired by the counter-cultural art practice of "subvertising", and aims at providing an aesthetic pleasure that relaxes the user's cognitive tension.

Workshop summaries

SmartObjects: third workshop on interacting with smart objects pp. 45-46
  Dirk Schnelle-Walka; Jochen Huber; Stefan Radomski; Oliver Brdiczka; Kris Luyten; Max Mühlhäuser
The increasing number of smart objects in our everyday life shapes how we interact beyond the desktop. In this workshop we discuss how interaction with these smart objects should be designed from various perspectives.
Personalized access to cultural heritage (PATCH2014): the future of experiencing cultural heritage pp. 47-48
  Johan Oomen; Lora Aroyo; Cristina Gena; Alan Wecker
Since 2007, the PATCH workshop series has successfully gathered researchers and professionals from various countries and institutions to discuss digital access to cultural heritage and, specifically, the personalization aspects of this process. Thanks to this rich history, the reach of the PATCH workshop across various research communities is extensive.
IDGEI 2014: 2nd international workshop on intelligent digital games for empowerment and inclusion pp. 49-50
  Lucas Paletta; Bjoern W. Schuller; Peter Robinson; Nicolas Sabouret
Digital Games for Empowerment and Inclusion have the potential to improve our society by preparing particular groups of people to meet social challenges in their everyday lives, and to do so in an enjoyable way through games. These games are developing rapidly to exploit new algorithms for computational intelligence, supported by the increasing availability of computing power, to help analyze players' behavior, monitor their motivation and interest, and adapt the progress of the games accordingly. Intelligent Digital Games for Empowerment and Inclusion (IDGEI) explore the use of machine intelligence in serious digital games. Here, we summarize the second international workshop on IDGEI, held at the 2014 International Conference on Intelligent User Interfaces (IUI).
Sketch: pen and touch recognition pp. 51-52
  Richard C. Davis; Aaron Adler
Sketch recognition has technically been around for 40 years, but it has come and gone several times due to the difficulty of the problem. With the rise of touch- and pen-enabled phones and tablets, sketch recognition is regaining popularity and public presence, and more people are becoming aware of and interested in this difficult but valuable problem. It is important to engage the sketch recognition community at this time to encourage the flourishing of this topic.

Doctoral consortiums

Developing sketch recognition and interaction techniques for intelligent surfaceless sketching user interfaces pp. 53-56
  Paul Taele; Tracy Hammond
As commercial motion-tracking sensors achieve greater reliability and ubiquity, intelligent sketching user interfaces can expand beyond traditional surface environments toward richer surfaceless sketching interactions. However, relevant techniques for automatically recognizing sketches in surfaceless interaction spaces are either largely constrained, because existing gesture recognition techniques support only limited gesture input vocabularies, or unexplored, because existing sketch recognition techniques are adapted specifically for surface environments. This dissertation research therefore proposes to investigate techniques for developing intelligent surfaceless sketching user interfaces. The core research work will focus on investigating automated recognition techniques for better understanding the content of surfaceless sketches, and on determining optimal interaction techniques for improving related intuitive sketching cues in those surfaceless interaction spaces.
Toward emotion regulation via physical interaction pp. 57-60
  Alwin de Rooij
Emotions can be regulated to fit a task in order to enhance task performance. Motor expressions can help regulate emotion. This paper briefly reports ongoing work on the design of physical interactions based on motor expressions that can help regulate emotion to fit a task. We argue that to be effective, such interactions must be made meaningful in relation to ongoing appraisal processes, and that such interactions can help regulate emotion via congruence, suppression, or incompatibility. We present previous work on the validation of these arguments within the context of supporting idea generation, and develop a roadmap for research that aims to translate these results to the design of physical interactions under device constraints. The research will enable designers of interactive technology to develop physical interactions that help regulate emotion with the aim to help people get the most out of their own capabilities.
Exploratory search interfaces: blending relevance, diversity, relationships and categories pp. 61-64
  Sivan Yogev
Exploratory search of scientific literature plays an essential part in a researcher's work. Efforts to provide interfaces supporting this task have made significant progress, but the field is open to further evolution. In this paper I present four basic design concepts identified in exploratory search interfaces: relevance, diversity, relationships and categories, and propose a novel browsing layout featuring a unique combination of these concepts.
Socially-aware interfaces for supporting co-located interaction pp. 65-68
  Gianluca Schiavo
Ambient intelligence refers to a vision of technology where physical environments are sensitive and responsive to people. One of the challenges to realize this vision is to leverage information available in the social context. My doctoral research focuses on how to design interfaces that support co-located multi-user interactions taking into account individual and group nonverbal behavior, such as proxemics, gaze direction and body movements. In particular, the research activities are twofold: to understand which nonverbal cues and social signals reflect engagement, cooperation and cohesion in co-located group activities and to design systems that can handle and manage this social information. I present an integrated research approach for designing multi-user interactions based on social signal processing and I discuss the progress-to-date toward the development of systems that can sense and respond to social context.
Recognition of student intentions in a virtual reality training environment pp. 69-72
  Yecheng Gu; Sergey Sosnovsky
This paper introduces a novel method for detecting and modeling intentions of students performing training tasks in a Virtual Reality (VR) environment enhanced with intelligent tutoring capabilities. Our VR-setup provides students with an immersive user interface, but produces noisy and low-level input, from which we need to recognize higher-level cognitive information about the student. The complexity of this task is amplified by the requirements of the target domain (child pedestrian safety), where students need to train complex skills in dynamic settings. We present an approach for this task, which combines the logic-based Event Calculus (EC) and probabilistic modeling.
Silent speech decoder using adaptive collection pp. 73-76
  Mariko Matsumoto
We investigated a classification method for silent speech using brain-computer interfaces (BCIs). We recorded event-related potentials (ERPs) while four subjects imagined the vocalization of two Japanese vowels, remaining silent and immobile. We used an adaptive collection (AC) method that adaptively selects suitable output signals of common spatial pattern (CSP) filters, and their time durations, for classification. Classification accuracies (CAs) were 73-92% for the pairwise classification /a/ vs. /u/ using 63 channels, significantly better than in a previous study.
A non-command interface for automatic document provision during meetings pp. 77-80
  Hugo Lopez-Tovar; John Dowell
This research presents the concept of a non-command interface for a smart room that automatically detects when people talk about a document and whether it is present, a fundamental prerequisite for providing missing documents without explicit requests and without distracting from the main discourse. A study of how observers judge document usage in meetings is presented as a baseline, and the conceptual framework is briefly explained. Finally, an exploratory experiment is reported. These elements demonstrate the feasibility of the research and define the techniques needed to build the agent.
Supporting carers through intelligent technology pp. 81-84
  Kirsten Smith
Informal carers lack adequate practical and emotional support. This PhD investigates how a software agent could be used to help maintain a carer's personal social network by mediating communication and facilitating the provision of emotional and practical support. The agent should use features of the carer and their social network to provide a personalized support interface.
Wearable audio journal and mobile application to capture automatic thoughts in patients undergoing cognitive behavioral therapy pp. 85-88
  Devyani Jain; Manikandan Hariharan Kala
By replacing the hand-written 'thought records' used by cognitive behavioral therapy (CBT) patients with a wearable audio journal that works in tandem with a smartphone, we can help patients capture their automatic thoughts at the moment of occurrence and facilitate later analysis of the data together with the therapist. Speech provides richer clues about the patient's emotional state of mind and thus could help enable better therapy.