
Proceedings of the 2014 ICMI Workshop on Emotion Representation and Modelling in Human-Computer Interaction Systems

Fullname:Proceedings of the 2nd ICMI Workshop on Emotion Representation and Modelling in Human-Computer Interaction Systems
Editors:Kim Hartmann; Björn W. Schuller; Klaus R. Scherer; Ronald Böck
Location:Istanbul, Turkey
Dates:2014-Nov-16
Publisher:ACM
Standard No:ISBN: 978-1-4503-0124-4; ACM DL: Table of Contents; hcibib: ERM4HCI14
Papers:6
Pages:36
Links:Workshop Website | Conference Website
  1. Emotion Detection
  2. Human-Machine Interaction

Emotion Detection

An Initial Analysis of Structured Video Interviews by Using Multimodal Emotion Detection BIBAFull-Text 1-6
  Lei Chen; Su-Youn Yoon; Chee Wee Leong; Michelle Martin; Min Ma
Recently, online video interviews have been increasingly used in the employment process. Although several automatic techniques have emerged to analyze interview videos, so far only simple emotion analyses have been attempted, e.g., counting the number of smiles on an interviewee's face. In this paper, we report an initial study employing advanced multimodal emotion detection approaches to measure performance on an interview task that elicits emotion. On an acted interview corpus we created, we performed evaluations using a Speech-based Emotion Recognition (SER) system as well as an off-the-shelf facial expression analysis toolkit (FACET). While the results suggest the promise of using FACET for emotion detection, the benefits of employing the SER system are somewhat limited.
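A minimal late-fusion sketch of the kind of multimodal combination this abstract implies: per-segment emotion probabilities from a speech-based recognizer and from a facial analysis tool are averaged with modality weights. The label set, score arrays, and weights are all hypothetical (the paper does not publish its code); the heavier face weight merely echoes the finding that FACET was the more promising modality.

```python
# Hypothetical late fusion of speech- and face-based emotion scores.
# Assumes each system outputs per-segment class probabilities.
import numpy as np

EMOTIONS = ["neutral", "happy", "sad", "angry"]  # assumed label set

def late_fusion(speech_probs: np.ndarray,
                face_probs: np.ndarray,
                w_speech: float = 0.4,
                w_face: float = 0.6) -> list[str]:
    """Weighted average of per-segment probabilities from two modalities.

    speech_probs, face_probs: arrays of shape (n_segments, n_emotions).
    """
    fused = w_speech * speech_probs + w_face * face_probs
    return [EMOTIONS[i] for i in fused.argmax(axis=1)]

# Example: three interview segments scored by both systems.
speech = np.array([[0.7, 0.1, 0.1, 0.1],
                   [0.2, 0.6, 0.1, 0.1],
                   [0.3, 0.2, 0.4, 0.1]])
face = np.array([[0.5, 0.3, 0.1, 0.1],
                 [0.1, 0.8, 0.05, 0.05],
                 [0.2, 0.1, 0.6, 0.1]])
print(late_fusion(speech, face))  # -> ['neutral', 'happy', 'sad']
```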
A Neural Network Based Approach to Social Touch Classification BIBAFull-Text 7-12
  Siewart van Wingerden; Tobias J. Uebbing; Merel M. Jung; Mannes Poel
Touch is an important modality in social interaction: for instance, touch can communicate emotions and can intensify emotions communicated through other modalities. In this paper we explore the use of Neural Networks (NNs) for the classification of social touch. The exploration and assessment of NNs is based on the Corpus of Social Touch established by Jung et al. The corpus was split into a training set (65%) and a test set (35%); the training set was used to find the optimal parameters for the NN and to train the final model. Different feature sets were also investigated: the basic feature set included in the corpus, energy-histogram features, and dynamical features. Using all features led to the best test-set performance, 64%, with an NN consisting of one hidden layer of 46 neurons. The confusion matrix showed the expected high confusion between pat-tap and grab-squeeze. A leave-one-subject-out approach led to a performance of 54%, which is comparable with the results of Jung et al.
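A minimal sketch of the classification setup the abstract describes: one hidden layer of 46 neurons and a 65%/35% train/test split. It uses scikit-learn's MLPClassifier as a stand-in for the paper's own NN implementation; the Corpus of Social Touch features and its 14 gesture labels are replaced by placeholder data.

```python
# Sketch of a one-hidden-layer NN touch-gesture classifier, assuming
# precomputed touch features X and gesture labels y (placeholders here).
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score, confusion_matrix

rng = np.random.default_rng(0)
X = rng.normal(size=(600, 54))      # placeholder touch features
y = rng.integers(0, 14, size=600)   # placeholder labels for 14 gestures

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.35, stratify=y, random_state=0)  # 65%/35% split

clf = MLPClassifier(hidden_layer_sizes=(46,), max_iter=1000, random_state=0)
clf.fit(X_tr, y_tr)

pred = clf.predict(X_te)
print("test accuracy:", accuracy_score(y_te, pred))
print(confusion_matrix(y_te, pred))  # inspect e.g. pat/tap confusion
```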
Emotion Expression and Conversation Assessment in First Acquaintance Dialogues BIBAFull-Text 13-18
  Patrizia Paggio
This paper addresses how the expression of emotions by speakers affects the way dialogue participants experience an interaction. The hypothesis investigated here is that positive emotion expression has a positive effect, and vice versa. Facial expression of emotions was coded in a corpus of first acquaintance dialogues using an adapted version of the PAD scheme. After the interaction, dialogue participants were asked to score their experience along a number of parameters. The effect of emotion expression on these scores is investigated, and it is found that the Pleasure dimension affects the assessment of the conversation by both participants. Different interaction effects with Arousal and Dominance are found for speakers and listeners, respectively.
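A minimal sketch of the kind of analysis this abstract reports: regressing a participant's conversation-assessment score on coded Pleasure, Arousal, and Dominance (PAD) values. The data below is synthetic and the model is a plain OLS stand-in; the actual corpus, coding scheme, and statistical models are those described in the paper.

```python
# Synthetic PAD-to-assessment regression, assuming per-segment PAD codes
# and a per-participant assessment score (both placeholders here).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 80                                  # placeholder: coded dialogue segments
pad = rng.uniform(-1, 1, size=(n, 3))   # Pleasure, Arousal, Dominance codes
# Synthetic scores in which Pleasure drives the assessment (as found above).
score = 3.0 + 1.2 * pad[:, 0] + rng.normal(0, 0.5, size=n)

X = sm.add_constant(pad)                # intercept + P, A, D predictors
fit = sm.OLS(score, X).fit()
print(fit.summary())                    # per-dimension coefficients, p-values
```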

Human-Machine Interaction

A Design Platform for Emotion-Aware User Interfaces BIBAFull-Text 19-24
  Eunjung Lee; Gyu-Wan Kim; Byung-Soo Kim; Mi-Ae Kang
Machine recognition of emotion has become one of the main targets for developing next-generation user interaction in computer systems. While affect recognition technology has achieved substantial progress recently, the application of user emotion recognition to software user interfaces is in its early stages. In this paper, we describe the development of an emotion-aware user interface with a focus on visual appearance. Further, we propose an emotion-aware UI-authoring platform that helps designers create emotion-aware visual effects. To demonstrate its feasibility, we developed a prototype framework built around the authoring tool DAT4UX. The tool can integrate the resulting design into a mobile application equipped with an emotion recognition facility. A proof-of-concept application featuring an emotion-aware interface was developed using the tool.
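A minimal sketch of the core idea behind an emotion-aware visual interface: the UI selects its visual properties from the currently recognized emotion. The theme table, emotion labels, and fields are purely illustrative; DAT4UX's actual design format is not detailed in this abstract.

```python
# Illustrative emotion-to-theme mapping; all names here are hypothetical.
from dataclasses import dataclass

@dataclass
class Theme:
    background: str    # hex color
    accent: str
    animation_ms: int  # transition speed

THEMES = {
    "happy":   Theme("#FFF7E0", "#FFB300", 200),
    "sad":     Theme("#E8EEF4", "#5C7FA3", 600),
    "angry":   Theme("#FDEDED", "#C62828", 300),
    "neutral": Theme("#FFFFFF", "#607D8B", 400),
}

def apply_emotion(emotion: str) -> Theme:
    """Fall back to a neutral theme when the recognizer is unsure."""
    return THEMES.get(emotion, THEMES["neutral"])

print(apply_emotion("happy"))
```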
A Model to Incorporate Emotional Sensitivity into Human Computer Interactions BIBAFull-Text 25-30
  Sweety Ramnani; Ravi Prakash Gorthi
Human-machine interaction has received much attention in recent years. The 'link' that interfaces and facilitates human-machine interaction is being equipped with ever more modalities (such as natural language processing and intelligent reasoning) so as to shift the cognitive load of understanding from the human to the machine. This makes machines able to respond in more appropriate and relevant ways, and it also reduces, to the extent possible, the human effort needed to make the system understand and act. This work contributes to the design and implementation of such an interface by presenting a unified model that takes human-machine interaction to an advanced level of understanding: an intelligent interface that not only incorporates emotional intelligence into machines but also identifies contextual information and recognizes intention (for a given user's textual response), leading to strategic, adaptive dialogue management. The motivation for the proposed system is to provide an inexpensive, unobtrusive means for humans to interact with computer applications in a meaningful way that requires minimal effort. The work is dedicated to providing a baseline mechanism for achieving stress-free, satisfying interaction between humans and machines. Our system is tested against existing IVR technology, and the corresponding results are reported.
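A minimal sketch of the adaptation loop this abstract describes: detect the emotional tone of a user's textual response, then switch the dialogue strategy accordingly. The lexicon, labels, and replies are illustrative stand-ins; the paper's model additionally covers contextual information and intention recognition, which are omitted here.

```python
# Toy emotionally sensitive dialogue step; lexicon and strategies are
# hypothetical, not the paper's actual model.
NEGATIVE = {"angry", "frustrated", "useless", "terrible", "annoyed"}

def detect_tone(utterance: str) -> str:
    words = set(utterance.lower().split())
    return "negative" if words & NEGATIVE else "neutral"

def next_move(utterance: str) -> str:
    if detect_tone(utterance) == "negative":
        # Adaptive strategy: acknowledge frustration, offer escalation.
        return "I'm sorry this has been frustrating. Shall I connect you to an agent?"
    return "Okay. What would you like to do next?"

print(next_move("this menu is useless and I am frustrated"))
```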
Detection of Emotional Events utilizing Support Vector Methods in an Active Learning HCI Scenario BIBAFull-Text 31-36
  Patrick Thiam; Sascha Meudt; Markus Kächele; Günther Palm; Friedhelm Schwenker
In recent years the fields of affective computing and emotion recognition have experienced a steady increase in attention, and the creation and analysis of multi-modal corpora in particular has been the focus of intense research. Plausible annotation of this data, however, is an enormous problem. Emotion annotation is very time consuming, cumbersome, and sensitive to the individual annotator. Furthermore, emotional reactions are often very sparse in HCI scenarios, resulting in a large annotation overhead to gather the interesting moments of a recording, which in turn are highly relevant for building powerful features, classifiers, and fusion architectures. Active learning techniques improve the annotation process by asking the annotator to label only the relevant instances of a given dataset. In this work, an unsupervised one-class Support Vector Machine is used to build a background model of non-emotional sequences on a novel HCI dataset. The human annotator is iteratively asked to label instances that are not well explained by the background model, which renders them candidates for interesting events such as emotional reactions that diverge from the norm. The outcome of the active learning procedure is a reduced dataset, only 14% of the size of the original, that contains most of the significant information: in this case, more than 75% of the emotional events.
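A minimal sketch of the active-learning step described above: fit a one-class SVM as a background model of "non-emotional" behaviour, then hand the annotator the instances the model explains worst. The data is synthetic, and nu, gamma, and the batch size are illustrative choices rather than the paper's settings.

```python
# One-class SVM background model plus outlier-first annotation queries.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(2)
background = rng.normal(0, 1, size=(950, 20))  # mostly non-emotional frames
events = rng.normal(4, 1, size=(50, 20))       # rare, divergent reactions
X = np.vstack([background, events])

ocsvm = OneClassSVM(nu=0.05, kernel="rbf", gamma="scale").fit(X)
scores = ocsvm.decision_function(X)            # low score = poorly explained

# Query the annotator with the least-explained instances first.
query_order = np.argsort(scores)
batch = query_order[:140]                      # ~14% of the data, as above
print("instances sent to the annotator:", batch[:10])
```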