HCI Bibliography: Search Results
Database updated: 2016-05-10 Searches since 2006-12-01: 32,876,329
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server on 2015-05-12 and again on 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: Muller_F* Results: 9 Sorted by: Date
FreeTop: Finding Free Spots for Projective Augmentation Late-Breaking Works: Engineering of Interactive Systems / Riemann, Jan / Khalilbeigi, Mohammadreza / Schmitz, Martin / Doeweling, Sebastian / Müller, Florian / Mühlhäuser, Max Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.1598-1606
ACM Digital Library Link
Summary: Augmenting the physical world using projection technologies or head-worn displays is becoming increasingly popular in research and commercial applications. However, a common problem is interference between the physical surface's texture and the projection. In this paper, we present FreeTop, a combined approach to finding areas suitable for projection that considers multiple aspects influencing projection quality, such as visual texture and physical surface structure. FreeTop can be used in stationary and mobile settings to locate free areas suitable for projective augmentation and touch interaction in arbitrary physical settings.

ProxiWatch: Enhancing Smartwatch Interaction through Proximity-based Hand Input Late-Breaking Works: Novel Interactions / Müller, Florian / Günther, Sebastian / Dezfuli, Niloofar / Khalilbeigi, Mohammadreza / Mühlhäuser, Max Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.2617-2624
ACM Digital Library Link
Summary: Smartwatches allow ubiquitous and mobile interaction with digital content. Because of their small screen sizes, traditional interaction techniques are often not applicable. In this work, we show how the degree of freedom offered by the elbow joint, i.e., flexion and extension, can be leveraged as an additional one-handed input modality for smartwatches. By moving the watch towards or away from the body, the user can provide input to the smartwatch without a second hand. We present the results of a controlled experiment focusing on the human capabilities for proximity-based interaction. Based on the results, we propose guidelines for designing proximity-based smartwatch interfaces and present ProxiWatch, a one-handed, proximity-based input modality for smartwatches, alongside a prototypical implementation.
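The core idea of the abstract above, mapping watch-to-body distance onto discrete input states, can be illustrated with a minimal sketch. This is a hypothetical example, not the authors' implementation; the zone boundaries and the `zone_for_distance` helper are assumptions for illustration only.

```python
# Hypothetical sketch (not the paper's implementation): mapping a measured
# watch-to-body distance onto discrete input zones, as a proximity-based
# smartwatch interface might do. Zone bounds are illustrative assumptions.

def zone_for_distance(distance_cm, zones):
    """Return the index of the zone containing distance_cm.

    zones is a list of (lower, upper) bounds in centimetres, ordered
    from closest to the body (flexed elbow) to fully extended.
    Returns None if the distance falls outside all zones.
    """
    for i, (lower, upper) in enumerate(zones):
        if lower <= distance_cm < upper:
            return i
    return None

# Example: three input zones between a flexed and an extended elbow.
ZONES = [(10, 25), (25, 40), (40, 60)]
```

In a real interface, each zone index would select a menu item or mode, and the boundaries would be calibrated per user, as suggested by the controlled experiment the abstract describes.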

Liquido: Embedding Liquids into 3D Printed Objects to Sense Tilting and Motion Late-Breaking Works: Novel Interactions / Schmitz, Martin / Leister, Andreas / Dezfuli, Niloofar / Riemann, Jan / Müller, Florian / Mühlhäuser, Max Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.2688-2696
ACM Digital Library Link
Summary: Tilting and motion are widely used as interaction modalities in smart objects such as wearables and smartphones (e.g., to detect posture or shaking). They are often sensed with accelerometers. In this paper, we propose embedding liquids into 3D printed objects during printing to sense various tilting and motion interactions via capacitive sensing. This method reduces the assembly effort after printing and is a low-cost, easy-to-apply way of extending the input capabilities of 3D printed objects. We contribute two liquid sensing patterns and a practical printing process using a standard dual-extrusion 3D printer and commercially available materials. We validate the method through a series of evaluations and provide a set of interactive example applications.

SCWT: A Joint Workshop on Smart Connected and Wearable Things Workshops / Schnelle-Walka, Dirk / Limonad, Lior / Grosse-Puppendahl, Tobias / Lanir, Joel / Müller, Florian / Mecella, Massimo / Luyten, Kris / Kuflik, Tsvi / Brdiczka, Oliver / Mühlhäuser, Max Companion Proceedings of the 2016 International Conference on Intelligent User Interfaces 2016-03-07 v.2 p.3-5
ACM Digital Library Link
Summary: The increasing number of smart objects in our everyday life shapes how we interact beyond the desktop. In this workshop we discuss how advanced interactions with smart objects in the context of the Internet-of-Things should be designed from various perspectives, such as HCI and AI as well as industry and academia.

ShoeSoleSense: demonstrating a wearable foot interface for locomotion in virtual environments Video showcase presentations / Matthies, Denys / Müller, Franz / Anthes, Christoph / Kranzlmüller, Dieter Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.2 p.183-184
ACM Digital Library Link
Summary: User input in a virtual environment (VE) is usually accomplished through simple finger interactions, such as walking through a 3D scene by pressing a button. These interactions are not well suited to movement in a VE. Moving through scenes such as safety training applications by walking in place, while reserving hand or finger input for other purposes, enables a more realistic feeling. Existing solutions, such as multi-directional treadmills, are still expensive and need additional fixation of the body. Others, such as external tracking with statically installed cameras in CAVE-like installations, also have limitations in terms of occlusion. The built prototype, an insole, directly measures the pressure under the feet and hence enables a detection of movements, which is wirelessly forwarded to the scene manager server.
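The pressure-to-movement pipeline the abstract describes can be sketched in a few lines. This is a hypothetical illustration, not the authors' implementation; the thresholds and the `count_steps` helper are assumed for the example.

```python
# Hypothetical sketch (not the paper's implementation): detecting
# walking-in-place steps from normalised insole pressure samples by
# thresholding with hysteresis, then counting rising edges. Detected
# steps would then be forwarded (e.g., wirelessly) to a scene manager.

def count_steps(pressure_samples, press_threshold=0.6, release_threshold=0.3):
    """Count steps as transitions from released to pressed state.

    pressure_samples: pressure readings normalised to [0, 1].
    Using two thresholds (hysteresis) avoids double-counting
    noisy samples that hover around a single threshold.
    """
    steps = 0
    pressed = False
    for p in pressure_samples:
        if not pressed and p >= press_threshold:
            pressed = True
            steps += 1
        elif pressed and p <= release_threshold:
            pressed = False
    return steps
```

A real system would run this per insole region and in real time, but the hysteresis idea is the same.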

ShoeSoleSense: proof of concept for a wearable foot interface for virtual and real environments 3D interaction / Matthies, Denys J. C. / Müller, Franz / Anthes, Christoph / Kranzlmüller, Dieter Proceedings of the 2013 ACM Symposium on Virtual Reality Software and Technology 2013-10-06 p.93-96
ACM Digital Library Link
Summary: ShoeSoleSense is a proof-of-concept, novel body-worn interface: an insole that enables location-independent, hands-free interaction through the feet. Forgoing hand or finger interaction is especially beneficial when the user is engaged in real-world tasks. In virtual environments, movement through scenes such as safety training applications is often conducted via finger input, which is not very suitable. To enable a more intuitive interaction, alternative control concepts utilize gesture control, which is usually tracked by statically installed cameras in CAVE-like installations; since tracking coverage is limited, problems such as occlusion may also occur. The introduced prototype provides a novel control concept for virtual reality as well as real-life applications. Demonstrated functions include movement control in a virtual reality installation, such as moving straight, turning, and jumping. Furthermore, the prototype provides additional feedback by heating the feet and vibrating in dedicated areas on the surface of the insole.

Leveraging the palm surface as an eyes-free tv remote control Work-in-progress / Dezfuli, Niloofar / Khalilbeigi, Mohammadreza / Huber, Jochen / Müller, Florian / Mühlhäuser, Max Extended Abstracts of ACM CHI'12 Conference on Human Factors in Computing Systems 2012-05-05 v.2 p.2483-2488
ACM Digital Library Citation
Summary: User input on television typically requires a mediator device such as a handheld remote control. While this is a well-established interaction paradigm, a handheld device has serious drawbacks: it can easily be misplaced due to its mobility, and in the case of a touch-screen interface it also requires additional visual attention. Emerging interaction paradigms like 3D mid-air gestures using novel depth sensors such as Microsoft's Kinect aim to overcome these limitations, but are known, for instance, to be tiring. In this paper, we propose to leverage the palm as an interactive surface for TV remote control. Our contribution is two-fold: (1) we explored the conceptual design space in an exploratory study; (2) based on these results, we investigated the accuracy and effectiveness of such an interface in a controlled experiment. Our results show that the palm has the potential to be leveraged for device-less and eyes-free TV interaction without any third-party mediator device.

VisMeB: A Visual Metadata Browser 6: Video papers / Limbach, Tobias / Reiterer, Harald / Klein, Peter / Muller, Frank Proceedings of IFIP INTERACT'03: Human-Computer Interaction 2003-09-01 p.993
[video 21.4 MB]

Visualizing Metadata: LevelTable vs. GranularityTable in the SuperTable/Scatterplot Framework Human factors and ergonomics / Limbach, T. / Klein, P. / Muller, F. / Reiterer, H. Proceedings of the Tenth International Conference on Human-Computer Interaction 2003-06-22 v.2 p.1106-1110