HCI Bibliography : Search Results
Database updated: 2016-05-10 Searches since 2006-12-01: 32,284,103
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server 2015-05-12 and again 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: Gillies_M* Results: 9 Sorted by: Date
Human-Centred Machine Learning / Gillies, Marco / Fiebrink, Rebecca / Tanaka, Atau / Garcia, Jérémie / Bevilacqua, Frédéric / Heloir, Alexis / Nunnari, Fabrizio / Mackay, Wendy / Amershi, Saleema / Lee, Bongshin / d'Alessandro, Nicolas / Tilmanne, Joëlle / Kulesza, Todd / Caramiaux, Baptiste Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.3558-3565
ACM Digital Library Link
Summary: Machine learning is one of the most important and successful techniques in contemporary computer science. It involves the statistical inference of models (such as classifiers) from data. It is often conceived in a very impersonal way, with algorithms working autonomously on passively collected data. However, this viewpoint hides considerable human work of tuning the algorithms, gathering the data, and even deciding what should be modeled in the first place. Examining machine learning from a human-centered perspective includes explicitly recognising this human work, as well as reframing machine learning workflows based on situated human working practices, and exploring the co-adaptation of humans and systems. A human-centered understanding of machine learning in human context can lead not only to more usable machine learning tools, but to new ways of framing learning computationally. This workshop will bring together researchers to discuss these issues and suggest future research questions aimed at creating a human-centered approach to machine learning.

Emotional and Functional Challenge in Core and Avant-garde Games / Cole, Tom / Cairns, Paul / Gillies, Marco Proceedings of the 2015 ACM SIGCHI Annual Symposium on Computer-Human Interaction in Play 2015-10-05 p.121-126
ACM Digital Library Link
Summary: Digital games are a wide, diverse and fast developing art form, and it is important to analyse games that are pushing the medium forward to see what design lessons can be learned. However, there are no established criteria to determine which games show these more progressive qualities.
    Grounded theory methodology was used to analyse the language used in games reviews by critics of both 'core gamer' titles and titles with more avant-garde properties. This showed there were two kinds of challenge being discussed -- emotional and functional -- which appear to be, at least partially, mutually exclusive. Reviews of 'core' and 'avant-garde' games had different measures of purchase value, primary emotions, and modalities of language used to discuss the role of audiovisual qualities. Emotional challenge, ambiguity and solitude are suggested as useful devices for eliciting emotion from the player and for use in developing more 'avant-garde' games, as well as providing a basis for further lines of inquiry.

Applying the CASSM Framework to Improving End User Debugging of Interactive Machine Learning / Gillies, Marco / Kleinsmith, Andrea / Brenton, Harry Proceedings of the 2015 International Conference on Intelligent User Interfaces 2015-03-29 v.1 p.181-185
ACM Digital Library Link
Summary: This paper presents an application of the CASSM (Concept-based Analysis of Surface and Structural Misfits) framework to interactive machine learning for a bodily interaction domain. We developed software to enable end users to design full body interaction games involving interaction with a virtual character. The software used a machine learning algorithm to classify postures based on examples provided by users. A longitudinal study showed that training the algorithm was straightforward, but that debugging errors was very challenging. A CASSM analysis showed that there were fundamental mismatches between the users' concepts and the working of the learning system. This resulted in a new design which aimed to better align both the learning algorithm and user interface with users' concepts. This work provides an example of how HCI methods can be applied to machine learning in order to improve its usability and provide new insights into its use.
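To illustrate the kind of example-based posture classification the summary describes, here is a minimal nearest-neighbour sketch in Python. This is an assumption, not the paper's actual algorithm; the feature vectors and labels are invented:

```python
import math

def train(examples):
    """Store labelled posture examples: a list of (feature_vector, label)."""
    return list(examples)

def classify(model, posture):
    """Return the label of the closest stored example (1-nearest-neighbour)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(model, key=lambda ex: dist(ex[0], posture))[1]

# End users provide a handful of example postures per label.
model = train([
    ((0.0, 1.0), "arms_up"),
    ((0.1, 0.9), "arms_up"),
    ((1.0, 0.0), "arms_out"),
])
print(classify(model, (0.05, 0.95)))  # -> arms_up
```

Training here really is just storing examples, which matches the study's finding that training is easy; the hard part the CASSM analysis targets is understanding why a classification like the one above goes wrong.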

Fluid gesture interaction design: Applications of continuous recognition for the design of modern gestural interfaces / Zamborlin, Bruno / Bevilacqua, Frederic / Gillies, Marco / D'Inverno, Mark ACM Transactions on Interactive Intelligent Systems 2014-01 v.3 n.4 p.22
ACM Digital Library Link
Summary: This article presents Gesture Interaction DEsigner (GIDE), an innovative application for gesture recognition. Instead of recognizing gestures only after they have been entirely completed, as happens in classic gesture recognition systems, GIDE exploits the full potential of gestural interaction by tracking gestures continuously and synchronously, allowing users to both control the target application moment to moment and also receive immediate and synchronous feedback about system recognition states. By this means, they quickly learn how to interact with the system in order to develop better performances. Furthermore, rather than learning the predefined gestures of others, GIDE allows users to design their own gestures, making interaction more natural and also allowing the applications to be tailored to users' specific needs. We describe our system that demonstrates these new qualities -- that combine to provide fluid gesture interaction design -- through evaluations with a range of performers and artists.
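The continuous, frame-by-frame following that distinguishes GIDE from recognize-on-completion systems can be sketched roughly as follows. This toy one-dimensional follower is an invented stand-in, not GIDE's actual recognizer: it advances a cursor through a recorded gesture template as each live sample arrives, so progress feedback is available at every frame rather than only at the end:

```python
def follow(template, cursor, sample, window=3):
    """Advance a cursor through a recorded gesture template as each
    new sample arrives; return (new_cursor, progress in 0..1)."""
    best = cursor
    for i in range(cursor, min(cursor + window, len(template))):
        if abs(template[i] - sample) < abs(template[best] - sample):
            best = i
    return best, best / (len(template) - 1)

template = [0.0, 0.25, 0.5, 0.75, 1.0]   # a recorded 1-D gesture
cursor = 0
for sample in [0.1, 0.3, 0.6, 0.9]:      # live input, frame by frame
    cursor, progress = follow(template, cursor, sample)
    # `progress` can drive the target application at every frame
```

After the four samples above the follower has reached the end of the template (progress 1.0), but crucially it also reported partial progress at each intermediate frame.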

Customizing by doing for responsive video game characters / Kleinsmith, Andrea / Gillies, Marco International Journal of Human-Computer Studies 2013-07 v.71 n.7/8 p.775-784
Keywords: Interactive machine learning
Keywords: Embodied design
Keywords: Body expressions
Keywords: Video game characters
Link to Article at sciencedirect
Summary: This paper presents a game in which players can customize the behavior of their characters using their own movements while playing the game. Players' movements are recorded with a motion capture system. The player then labels the movements and uses them as input to a machine learning algorithm that generates a responsive behavior model. This interface supports a more embodied approach to character design that we call "Customizing by Doing". We present a user study which shows that using their own movements made the users feel more engaged with the game and the design process, due in large part to a feeling of personal ownership of the movement.
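The record-label-train workflow in the summary can be sketched as below. This is a deliberately simple stand-in for the paper's learning algorithm, with invented labels and two-dimensional "poses" in place of motion-capture data: each labelled clip is averaged into a prototype, and the character responds with the behaviour whose prototype best matches the live pose:

```python
def train(labelled_clips):
    """Average each player's labelled motion clips into one prototype
    pose per behaviour label (a minimal stand-in for the learned model)."""
    prototypes = {}
    for label, frames in labelled_clips.items():
        dims = len(frames[0])
        prototypes[label] = tuple(
            sum(f[d] for f in frames) / len(frames) for d in range(dims)
        )
    return prototypes

def respond(prototypes, pose):
    """Pick the behaviour whose prototype is closest to the live pose."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda lab: dist2(prototypes[lab], pose))

# Movements the player recorded and then labelled in-game.
clips = {
    "wave":   [(0.9, 0.1), (1.0, 0.0)],
    "crouch": [(0.0, 0.9), (0.1, 1.0)],
}
model = train(clips)
print(respond(model, (0.8, 0.2)))  # -> wave
```

The point of the design is that the training data comes from the player's own body, which is what the study links to the feeling of ownership.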

Exploring choreographers' conceptions of motion capture for full body interaction / Gillies, Marco / Worgan, Max / Peppe, Hestia / Robinson, Will / Kov, Nina Proceedings of the 25th BCS Conference on Human-Computer Interaction 2011-07-04 p.205-210
ACM Digital Library Link
Summary: We present the results of a group interview of choreographers aimed at understanding their conceptions of how movement can be used in live performance. This understanding was intended to inform research into full body interaction for live performance and other more general full body interfaces. The results of the interview suggest a new way of conceiving of interaction with digital technology: neither as a representation of movement, nor as an interface that responds to movement, but as a means of transforming movement. This transformed movement can then serve as a starting point for a dancer's responses to transformations of their own movement, thus setting up an improvisational feedback loop.

Piavca: a framework for heterogeneous interactions with virtual characters / Gillies, Marco / Pan, Xueni / Slater, Mel Virtual Reality 2010-12 v.14 n.4 p.221-228
Link to Digital Content at Springer
Summary: This paper presents a virtual character animation system for real-time multimodal interaction in an immersive virtual reality setting. Human to human interaction is highly multimodal, involving features such as verbal language, tone of voice, facial expression, gestures and gaze. This multimodality means that, in order to simulate social interaction, our characters must be able to handle many different types of interaction and many different types of animation, simultaneously. Our system is based on a model of animation that represents different types of animations as instantiations of an abstract function representation. This makes it easy to combine different types of animation. It also encourages the creation of behavior out of basic building blocks, making it easy to create and configure new behaviors for novel situations. The model has been implemented in Piavca, an open source character animation system.
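The idea of representing animations as instantiations of an abstract function and building behaviours from combinable blocks can be sketched as follows. This is a simplification under invented names, not Piavca's actual API: an "animation" is any function from time to a pose, and combinators like blending and sequencing produce new animations of the same type:

```python
def blend(a, b, w):
    """Combine two animations (functions t -> pose) by weighted average."""
    return lambda t: tuple((1 - w) * x + w * y for x, y in zip(a(t), b(t)))

def sequence(a, b, switch):
    """Play animation a until time `switch`, then animation b."""
    return lambda t: a(t) if t < switch else b(t - switch)

# Two toy 'animations': each maps time to a (joint1, joint2) pose.
nod = lambda t: (t, 0.0)
wave = lambda t: (0.0, t)

both = blend(nod, wave, 0.5)       # simultaneous combination
then = sequence(nod, wave, 1.0)    # one behaviour after another
print(both(1.0))  # -> (0.5, 0.5)
print(then(1.5))  # -> (0.0, 0.5)
```

Because every combinator returns another time-to-pose function, heterogeneous animation types (procedural gaze, keyframed gesture, facial expression) can be mixed uniformly, which is the property the summary highlights.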

EMMA: an automated intelligent actor in e-drama / Zhang, Li / Gillies, Marco / Barnden, John Proceedings of the 2008 International Conference on Intelligent User Interfaces 2008-01-13 p.409-412
ACM Digital Library Link
Summary: We report work on adding an improvisational AI actor and 3D emotional animation to an existing e-drama program, a system for dramatic improvisation in simple virtual scenarios. The improvisational AI actor has an affect-detection component, aimed at detecting affective aspects of human-controlled characters' textual input. It also makes an appropriate response to stimulate the improvisation based on this affective understanding. A distinctive feature of our work is a focus on the metaphorical ways in which affect is conveyed. Moreover, we also describe how the detected affective states activate the animation engine to produce emotional gestures for human-controlled characters. Finally, we report user testing conducted for the AI actor. Our work contributes to the conference themes on affective user interfaces, natural language processing and emotionally believable gesture generation.
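The pipeline from textual affect detection to gesture activation can be sketched as below. This keyword-lookup sketch is an invented simplification: the lexicon, gesture names and mapping are assumptions, and a literal word match is far weaker than the metaphorical affect detection the paper emphasizes:

```python
# Invented cue-word lexicon and gesture mapping, for illustration only.
AFFECT_LEXICON = {
    "angry": "anger", "hate": "anger",
    "happy": "joy",   "great": "joy",
    "sad": "sadness", "cry": "sadness",
}
GESTURES = {"anger": "clench_fists", "joy": "open_arms", "sadness": "slump"}

def detect_affect(utterance):
    """Return the first affect whose cue word appears in the text,
    or 'neutral' if none match."""
    for word in utterance.lower().split():
        if word in AFFECT_LEXICON:
            return AFFECT_LEXICON[word]
    return "neutral"

def animate(utterance):
    """Map the detected affective state to a character gesture."""
    return GESTURES.get(detect_affect(utterance), "idle")

print(animate("I hate waiting here"))  # -> clench_fists
```

In the actual system the detected state drives a 3D animation engine rather than returning a string, but the shape of the coupling (text in, affect label, gesture out) is the same.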

Applying Direct Manipulation Interfaces to Customizing Player Character Behaviour / Gillies, Marco Proceedings of the 2006 International Conference on Entertainment Computing 2006-09-20 p.175-186
Link to Digital Content at Springer
Summary: The ability to customize a player's avatar (their graphical representation) is one of the most popular features of online games and graphical chat environments. Though customizing appearance is a common feature in most games, creating tools for customizing a character's behaviour is still a difficult problem. We propose a methodology, based on direct manipulation, that allows players to specify the type of behaviour they would like in a given context. This methodology is iterative, with the player performing a number of different customizations in different contexts. Players are also able to continue customizing their character during play, with commands that can have long-term or permanent effects.
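The iterative context-by-context customization the summary describes might be sketched as follows. This minimal sketch uses invented names and reduces "direct manipulation in a context" to recording a chosen behaviour for that context, with unseen contexts falling back to a default:

```python
class CustomisableCharacter:
    """Stores behaviours the player has demonstrated for specific
    contexts; unseen contexts fall back to a default behaviour."""

    def __init__(self, default="idle"):
        self.default = default
        self.rules = {}

    def customize(self, context, behaviour):
        # The player directly manipulates the character in one context;
        # the chosen behaviour is recorded for reuse in that context.
        self.rules[context] = behaviour

    def behave(self, context):
        return self.rules.get(context, self.default)

avatar = CustomisableCharacter()
avatar.customize("greeting_friend", "wave")    # one customization session
avatar.customize("greeting_stranger", "nod")   # another, later in play
print(avatar.behave("greeting_friend"))   # -> wave
print(avatar.behave("combat"))            # -> idle
```

Because `customize` can be called at any time, the same mechanism covers both up-front design and the in-play, long-term customization commands the paper mentions.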