HCI Bibliography : Search Results
Database updated: 2016-05-10 Searches since 2006-12-01: 32,905,753
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server 2015-05-12 and again 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: Jylha_A* Results: 13 Sorted by: Date
Designing a Willing-to-Use-in-Public Hand Gestural Interaction Technique for Smart Glasses Everyday Objects as Interaction Surfaces / Hsieh, Yi-Ta / Jylhä, Antti / Orso, Valeria / Gamberini, Luciano / Jacucci, Giulio Proceedings of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.1 p.4203-4215
ACM Digital Library Link
Summary: Smart glasses suffer from obtrusive or cumbersome interaction techniques. Studies show that people are not willing to publicly use, for example, voice control or mid-air gestures in front of the face. Some techniques also hamper the high degree of freedom afforded by the glasses. In this paper, we derive design principles for socially acceptable, yet versatile, interaction techniques for smart glasses based on a survey of related work. We propose an exemplary design, based on a haptic glove integrated with smart glasses, as an embodiment of the design principles. The design is further refined into three interaction scenarios: text entry, scrolling, and point-and-select. Through a user study conducted in a public space, we show that the interaction technique is considered unobtrusive and socially acceptable. Furthermore, the performance of the technique in text entry is comparable to state-of-the-art techniques. We conclude by reflecting on the advantages of the proposed design.

A Wearable Multimodal Interface for Exploring Urban Points of Interest Oral Session 6: Mobile and Wearable / Jylhä, Antti / Hsieh, Yi-Ta / Orso, Valeria / Andolina, Salvatore / Gamberini, Luciano / Jacucci, Giulio Proceedings of the 2015 International Conference on Multimodal Interaction 2015-11-09 p.175-182
ACM Digital Library Link
Summary: Locating points of interest (POIs) in cities is typically facilitated by visual aids such as paper maps, brochures, and mobile applications. However, these techniques require visual attention, which ideally should be on the surroundings. Non-visual techniques for navigating towards specific POIs typically lack support for free exploration of the city or more detailed guidance. To overcome these issues, we propose a multimodal, wearable system for alerting the user of nearby recommended POIs. The system, built around a tactile glove, provides audio-tactile cues when a new POI is in the vicinity, and more detailed information and guidance if the user expresses interest in this POI. We evaluated the system in a field study, comparing it to a visual baseline application. The encouraging results show that the glove-based system helps keep the attention on the surroundings and that its performance is on the same level as that of the baseline.

Design challenges in motivating change for sustainable urban mobility / Gabrielli, Silvia / Forbes, Paula / Jylhä, Antti / Wells, Simon / Sirén, Miika / Hemminki, Samuli / Nurmi, Petteri / Maimone, Rosa / Masthoff, Judith / Jacucci, Giulio Computers in Human Behavior 2014-12 v.41 n.0 p.416-423
Keywords: Persuasive sustainability
Keywords: User studies
Keywords: Behavior change
Keywords: Social media
Keywords: Urban mobility interventions
Link to Article at sciencedirect
Summary: In recent years, the design and deployment of persuasive interventions for inducing sustainable urban mobility behaviors has become a very active research field, leveraging the pervasive use of social media and mobile apps in citizens' daily lives. Several challenges in designing and assessing motivational features for effective and long-lasting behavior change in this area have also been identified, such as the focus of most solutions on targeting and prescribing individual (versus collective) mobility choices, as well as a general lack of large-scale evaluations of the impact of these solutions on citizens' lives. This paper reports lessons learnt from three parallel and complementary user studies, where motivational features for sustainable urban mobility, including social influence strategies delivered through social media, were prototyped, tested and refined. By reflecting on our results and design experiences so far, we aim to provide better guidance for future development of more effective solutions supporting citizens' adoption of sustainable mobility behaviors in urban settings.

How carat affects user behavior: implications for mobile battery awareness applications Battery life and energy harvesting / Athukorala, Kumaripaba / Lagerspetz, Eemil / von Kügelgen, Maria / Jylhä, Antti / Oliner, Adam J. / Tarkoma, Sasu / Jacucci, Giulio Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.1 p.1029-1038
ACM Digital Library Link
Summary: Mobile devices have limited battery life, and numerous battery management applications are available that aim to improve it. This paper examines a large-scale mobile battery awareness application, called Carat, to see how it changes user behavior with long-term use. We conducted a survey of current Carat Android users and analyzed their interaction logs. The results show that long-term Carat users save more battery, charge their devices less often, learn to manage their battery with less help from Carat, have a better understanding of how Carat works, and may enjoy competing against other users. Based on these findings, we propose a set of guidelines for mobile battery awareness applications: battery awareness applications should make the reasoning behind their recommendations understandable to the user, be tailored to retain long-term users, take the audience into account when formulating feedback, and distinguish third-party and system applications.

BubblesDial: exploring large display content graphs on small devices Visualization techniques / Bergstrom-Lehtovirta, Joanna / Eklund, Tommy / Jylhä, Antti / Kuikkaniemi, Kai / An, Chao / Jacucci, Giulio Proceedings of the 2013 International Conference on Mobile and Ubiquitous Multimedia 2013-12-02 p.1
ACM Digital Library Link
Summary: Large interfaces are fixed to a certain use context, for example to a physical smart space. Mobile counterparts of public interfaces can allow the user to continue interacting with the content after leaving the space. However, wall applications make use of a large display surface, and fitting the same user interface to the constraints of a mobile screen is challenging. Starting with Bubble Wall -- an information exploration application for multitouch walls -- we developed interfaces for browsing the same content graphs on mobile devices. A comparison study in exploration and navigation tasks was conducted with two mobile interfaces reproducing the large screen interactions: BubbleSpace, a more faithful redesign, and BubblesDial, which reduces the interactions for a better fit to a small screen. BubblesDial scored significantly better in the usability and performance evaluation, especially after priming with the Bubble Wall. We also present implications for the redesign of these large content graph interactions for mobile use.

Comparing eye and gesture pointing to drag items on large screens Poster / Kosunen, Ilkka / Jylhä, Antti / Ahmed, Imtiaj / An, Chao / Chech, Luca / Gamberini, Luciano / Cavazza, Marc / Jacucci, Giulio Proceedings of the 2013 ACM International Conference on Interactive Tabletops and Surfaces 2013-10-06 p.425-428
ACM Digital Library Link
Summary: Large screens are populating a variety of settings, motivating research on appropriate interaction techniques. While gesture input has been popularized by depth cameras, we contribute a comparison study showing how eye pointing is a valuable substitute for gesture pointing in dragging tasks. We compare eye pointing combined with gesture selection to gesture pointing and selection. Results clearly show that eye pointing combined with a selection gesture allows more accurate and faster dragging.

MatkaHupi: a persuasive mobile application for sustainable mobility Poster, demo, & video presentations / Jylhä, Antti / Nurmi, Petteri / Sirén, Miika / Hemminki, Samuli / Jacucci, Giulio Adjunct Proceedings of the 2013 International Joint Conference on Pervasive and Ubiquitous Computing 2013-09-08 v.2 p.227-230
ACM Digital Library Link
Summary: With the advances in smartphone technologies, sustainable mobility has become an active research topic in the field of ubiquitous computing. We present a persuasive mobile application that automatically tracks the transportation modes and CO2 emissions of the user's trips and utilizes this information to present a set of actionable mobility challenges to the user. A longitudinal pilot experiment with the system showed that subjects perceived the concept of challenges positively and yielded constructive findings to inform further development of the application, especially related to personalized challenges.

SiMPE: 8th workshop on speech and sound in mobile and pervasive environments Workshops / Nanavati, Amit A. / Rajput, Nitendra / Srivastava, Saurabh / Erkut, Cumhur / Jylhä, Antti / Rudnicky, Alexander I. / Serafin, Stefania / Turunen, Markku Proceedings of 2013 Conference on Human-computer interaction with mobile devices and services 2013-08-27 p.626-629
ACM Digital Library Link
Summary: The SiMPE workshop series started in 2006 with the goal of enabling speech processing on mobile and embedded devices. The SiMPE 2012 workshop extended the notion of audio to non-speech "Sounds" and thus the expansion became "Speech and Sound". SiMPE 2010 and 2011 brought together researchers from the speech and the HCI communities. Speech User interaction in cars was a focus area in 2009. Multimodality got more attention in SiMPE 2008. In SiMPE 2007, the focus was on developing regions.
    With SiMPE 2013, the 8th in the series, we continue to explore the area of speech along with sound. Akin to language processing and text-to-speech synthesis in the voice-driven interaction loop, sensors can track continuous human activities such as singing, walking, or shaking the mobile phone, and non-speech audio can facilitate continuous interaction. The technologies underlying speech processing and sound processing are quite different, and these communities have been working mostly independently of each other. And yet, for multimodal interactions on mobile devices, it is perhaps natural to ask whether and how speech and sound can be mixed and used more effectively and naturally.

Rhythmic walking interactions with auditory feedback: an exploratory study / Jylhä, Antti / Serafin, Stefania / Erkut, Cumhur Proceedings of the 2012 Audio Mostly Conference: A Conference on Interaction with Sound 2012-09-26 p.68-75
ACM Digital Library Link
Summary: Walking is a natural rhythmic activity that has become of interest as a means of interacting with software systems such as computer games. Therefore, designing multimodal walking interactions calls for further examination. This exploratory study presents a system capable of different kinds of interactions based on varying the temporal characteristics of the output, using the sound of human walking as the input. The system either provides a direct synthesis of a walking sound based on the detected amplitude envelope of the user's footstep sounds, or provides a continuous synthetic walking sound as a stimulus for the walking human, either with a fixed tempo or a tempo adapting to the human gait. In a pilot experiment, the different interaction modes are studied with respect to their effect on the walking tempo and the experience of the subjects. The results tentatively outline different user profiles in interacting with such a system.

Auditory feedback in an interactive rhythmic tutoring system / Jylhä, Antti / Erkut, Cumhur Proceedings of the 2011 Audio Mostly Conference: A Conference on Interaction with Sound 2011-09-07 p.109-115
ACM Digital Library Link
Summary: We present the recent developments in the design of audio-visual feedback in iPalmas, the interactive Flamenco rhythm tutor. Based on an evaluation of the original implementation, we have re-designed the interface to better support the user in learning and performing rhythmic patterns. The system measures the performance parameters of the user and provides auditory feedback on the performance, with different sounds corresponding to different performance attributes. The design of these sounds is informed by several attributes derived from the evaluation. We propose informative, non-intrusive, and archetypal sounds to be used in the system.

A Structured Design and Evaluation Model with Application to Rhythmic Interaction Displays / Erkut, Cumhur / Jylhä, Antti / Discioglu, Reha NIME 2011: New Interfaces for Musical Expression 2011-05-30 p.477-480
Keywords: Rhythmic interaction, multimodal displays, sonification, UML
www.nime.org/proceedings/2011/nime2011_477.pdf
Summary: We present a generic, structured model for design and evaluation of musical interfaces. This model is development oriented, and it is based on the fundamental function of the musical interfaces, i.e., to coordinate the human action and perception for musical expression, subject to human capabilities and skills. To illustrate the particulars of this model and present it in operation, we consider the previous design and evaluation phase of iPalmas, our testbed for exploring rhythmic interaction. Our findings inform the current design phase of iPalmas visual and auditory displays, where we build on what has resonated with the test users, and explore further possibilities based on the evaluation results.

Simulation of rhythmic learning: a case study / Jylhä, Antti / Erkut, Cumhur / Pesonen, Matti / Ekman, Inger Proceedings of the 2010 Audio Mostly Conference: A Conference on Interaction with Sound 2010-09-15 p.20
ACM Digital Library Link
Summary: Simulation of human interaction with computational systems can inform their design and provide means for designing new, intelligent systems capturing some of the essence of human behavior. We describe a system simulating a situation where a virtual tutor teaches rhythms to a human learner. In this simulation, we virtualize the human behavior related to the learning of new rhythms. We inform the design of the system based on an experiment in which a virtual tutor taught Flamenco hand clapping patterns to human subjects. Based on the findings on interaction with the system and learning of the patterns, we are simulating this learning situation with a virtual learning clapper. We also discuss the future work to be undertaken for more realistic, agent-based simulation of rhythmic interaction.

A hand clap interface for sonic interaction with the computer Interactivity: touch & feel / Jylhä, Antti / Erkut, Cumhur Proceedings of ACM CHI 2009 Conference on Human Factors in Computing Systems 2009-04-04 v.2 p.3175-3180
Keywords: audio interfaces, hand clapping, human-computer interaction, sonic interaction design
ACM Digital Library Link
Summary: We present a hand clapping interface for sonic interaction with the computer. The current implementation has been built on the Pure Data (PD) software. The interface makes use of the cyclic nature of hand clapping and recognition of the clap type, and enables interactive control over different applications. Three prototype applications for the interface are presented: a virtual crowd of clappers, controlling the tempo of music, and a simple sampler. Preliminary tests indicate that rather than having total control via the interface, the user negotiates with the computer to control the tempo.