HCI Bibliography : Search Results
Database updated: 2016-05-10 Searches since 2006-12-01: 32,810,933
Hosted by ACM SIGCHI
The HCI Bibliography was moved to a new server 2015-05-12 and again 2016-01-05, substantially degrading the environment for making updates.
There are no plans to add to the database.
Please send questions or comments to director@hcibib.org.
Query: Hodges_S* Results: 33 Sorted by: Date
Records: 1 to 25 of 33
Expressy: Using a Wrist-worn Inertial Measurement Unit to Add Expressiveness to Touch-based Interactions Touch Interaction / Wilkinson, Gerard / Kharrufa, Ahmed / Hook, Jonathan / Pursglove, Bradley / Wood, Gavin / Haeuser, Hendrik / Hammerla, Nils Y. / Hodges, Steve / Olivier, Patrick Proceedings of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.1 p.2832-2844
ACM Digital Library Link
Summary: Expressiveness, which we define as the extent to which rich and complex intent can be conveyed through action, is a vital aspect of many human interactions. For instance, paint on canvas is said to be an expressive medium, because it affords the artist the ability to convey multifaceted emotional intent through intricate manipulations of a brush. To date, touch devices have failed to offer users a level of expressiveness in their interactions that rivals that experienced by the painter and those completing other skilled physical tasks. We investigate how data about hand movement -- provided by a motion sensor, similar to those found in many smart watches or fitness trackers -- can be used to expand the expressiveness of touch interactions. We begin by introducing a conceptual model that formalizes a design space of possible expressive touch interactions. We then describe and evaluate Expressy, an approach that uses a wrist-worn inertial measurement unit to detect and classify qualities of touch interaction that extend beyond those offered by today's typical sensing hardware. We conclude by describing a number of sample applications, which demonstrate the enhanced, expressive interaction capabilities made possible by Expressy.

Video Showcase: Using Expressy to Showcase Expressiveness in Touch-based Interactions Video Showcase Presentations / Wilkinson, Gerard / Green, David Philip / Wood, Gavin / Kharrufa, Ahmed / Hook, Jonathan / Pursglove, Bradley / Haeuser, Hendrik / Hammerla, Nils Y. / Hodges, Steve / Olivier, Patrick Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.11
ACM Digital Library Link
Summary: We present a video demonstration of how information about hand movements, generated from a wrist-worn IMU (inertial measurement unit), can be used to provide expressiveness to touch-based interactions. The IMU identifies features that were not previously accessible, such as instantaneous force, wrist roll and pitch. We demonstrate a range of applications that have been extended using Expressy, a system we describe in more detail in the full paper [1]. Tap force allows users to express the intent behind an interaction before touch. Wrist roll and pitch enrich the touch during the interaction. Flick force and wrist roll allow users to follow up their touch interaction.

Interactivity: Using Expressy to Demonstrate Expressiveness in Touch-based Interactions Interactivity Demos / Wilkinson, Gerard / Wood, Gavin / Hook, Jonathan / Nappey, Tom / Kharrufa, Ahmed / Pursglove, Bradley / Haeuser, Hendrik / Hammerla, Nils Y. / Hodges, Steve / Olivier, Patrick Extended Abstracts of the ACM CHI'16 Conference on Human Factors in Computing Systems 2016-05-07 v.2 p.3800-3803
ACM Digital Library Link
Summary: We present an interactivity demonstration of Expressy, a system that augments existing touchscreen devices with a variety of continuous expressive interaction capabilities using movement data from a wrist-worn IMU. Our demonstration comprises a set of applications that show how the expressive touch interaction capabilities offered by Expressy can enable intuitive and meaningful interactions in contexts including productivity, entertainment and lifestyle apps. This demo submission accompanies a full paper describing a conceptual model of expressive touch interaction and the implementation and evaluation of Expressy.

Using IMUs to Identify Supervisors on Touch Devices HCI for Education / Kharrufa, Ahmed / Nicholson, James / Dunphy, Paul / Hodges, Steve / Briggs, Pam / Olivier, Patrick Proceedings of IFIP INTERACT'15: Human-Computer Interaction, Part II 2015-09-14 v.2 p.565-583
Keywords: IMU; Association; Authentication; Touch interaction; UI design
Link to Digital Content at Springer
Summary: In addition to their popularity as personal devices, tablets are becoming increasingly prevalent in work and public settings. In many of these application domains a supervisor user -- such as the teacher in a classroom -- oversees the function of one or more devices. Access to supervisory functions is typically controlled through the use of a passcode, but experience shows that keeping this passcode secret can be problematic. We introduce SwipeID, a method of identifying supervisor users across a set of touch-based devices by correlating data from a wrist-worn inertial measurement unit (IMU) and a corresponding touchscreen interaction. This approach naturally supports access at the time and point of contact and does not require any additional hardware on the client devices. We describe the design of our system and the challenge-response protocols we have considered. We then present an evaluation study to demonstrate feasibility. Finally we highlight the potential for our scheme to extend to different application domains and input devices.

ConductAR: an augmented reality based tool for iterative design of conductive ink circuits DIY tools and strategies / Narumi, Koya / Hodges, Steve / Kawahara, Yoshihiro Proceedings of the 2015 International Conference on Ubiquitous Computing 2015-09-07 p.791-800
ACM Digital Library Link
Summary: Recent advances in materials science have resulted in a range of commercially viable and easy-to-use conductive inks which novices, hobbyists, educators, students and researchers are now using to design and build interactive circuits quickly. Despite the ease with which practitioners can construct working circuits, one of the major limitations of designing circuits on-the-fly is the difficulty of detecting and understanding errors in prototype circuits. As well as short- and open-circuits, which often prevent a circuit from working at all, more subtle issues like high resistance traces can result in poor performance. Many users can't readily work out how to successfully modify their circuits, and they often don't have the tools or expertise to measure the relevant circuit parameters. In this paper we present ConductAR, a tool which can recognize and analyze hand-drawn, printed and hybrid conductive ink patterns. An on-screen augmented reality style interaction helps users to understand and enhance circuit operation. A key element of ConductAR is its ability to calculate the resistance of a circuit using a camera attached to an off-the-shelf PC or tablet. Our sparse coding technique is fast enough to support rapid iterative prototyping on real circuits using a conductive ink marker and/or eraser as shown in Figure 1. The system thereby enhances the feasibility of circuit prototyping with conductive ink.

Circuit Eraser: A Tool for Iterative Design with Conductive Ink WIP Theme: Users and UI Design / Narumi, Koya / Shi, Xinyang / Hodges, Steve / Kawahara, Yoshihiro / Shimizu, Shinya / Asami, Tohru Extended Abstracts of the ACM CHI'15 Conference on Human Factors in Computing Systems 2015-04-18 v.2 p.2307-2312
ACM Digital Library Link
Summary: Recent advances in materials science have resulted in a range of commercially viable and easy-to-use conductive inks which many practitioners are now using for the rapid design and realization of interactive circuits. Despite the ease with which hobbyists, educators and researchers can construct working circuits, a major limitation of prototyping with conductive ink is the difficulty of altering a design which has already been printed, and in particular removing areas of ink. In this paper we present Circuit Eraser, a simple yet effective tool which enables users to 'delete' existing conductive patterns. Through experimentation we have found an effective combination of materials which result in the removal of only the thin surface layer composed of ink particles, with minimal damage to the surface coating of the paper. This important characteristic ensures it is possible to re-apply conductive ink as part of an on-going design iteration. In addition to a lab-based evaluation of our Circuit Eraser which we present here, we have also used our technique in several practical applications and we illustrate one of these, namely the iterative design of a radio-frequency antenna.

PrintSense: a versatile sensing technique to support multimodal flexible surface interaction On and above the surface / Gong, Nan-Wei / Steimle, Jürgen / Olberding, Simon / Hodges, Steve / Gillian, Nicholas Edward / Kawahara, Yoshihiro / Paradiso, Joseph A. Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.1 p.1407-1410
ACM Digital Library Link
Summary: We present a multimodal on-surface and near-surface sensing technique for planar, curved and flexible surfaces. Our technique leverages temporal multiplexing of signals coming from a universal interdigitated electrode design, which is printed as a single conductive layer on a flexible substrate. It supports sensing of touch and proximity input, and moreover is capable of capturing several levels of pressure and flexing. We leverage recent developments in conductive inkjet printing as a way to prototype electrode patterns, and combine this with our hardware module for supporting the full range of sensing methods. As the technique is low-cost and easy to implement, it is particularly well-suited for prototyping touch- and hover-based user interfaces, including curved and deformable ones.

Circuit stickers: peel-and-stick construction of interactive electronic prototypes DIY and hacking / Hodges, Steve / Villar, Nicolas / Chen, Nicholas / Chugh, Tushar / Qi, Jie / Nowacka, Diana / Kawahara, Yoshihiro Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.1 p.1743-1746
ACM Digital Library Link
Summary: We present a novel approach to the construction of electronic prototypes which can support a variety of interactive devices. Our technique, which we call circuit stickers, involves adhering physical interface elements such as LEDs, sounders, buttons and sensors onto a cheap and easy-to-make substrate which provides electrical connectivity. This assembly may include control electronics and a battery for standalone operation, or it can be interfaced to a microcontroller or PC. In this paper we illustrate different points in the design space and demonstrate the technical feasibility of our approach. We have found circuit stickers to be versatile and low-cost, supporting quick and easy construction of physically flexible interactive prototypes. Building extra copies of a device is straightforward. We believe this technology has potential for design exploration, research prototyping, education and for hobbyist projects.

Making 3D printed objects interactive using wireless accelerometers Works-in-progress / Hook, Jonathan / Nappey, Thomas / Hodges, Steve / Wright, Peter / Olivier, Patrick Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems 2014-04-26 v.2 p.1435-1440
ACM Digital Library Link
Summary: We present an approach that allows designers and others to quickly and easily make 3D printed objects interactive, without the need for hardware or software expertise and with little modification to an object's physical design. With our approach, a designer simply attaches or embeds small three-axis wireless accelerometer modules into the moving parts of a 3D printed object. A simple graphical user interface is then used to configure the system to interpret the movements of these accelerometers as if they were common physical controls such as buttons or dials. The designer can then associate events generated by these controls with a range of interactive behavior, including web browser and media player control.

Instant inkjet circuits: lab-based inkjet printing to support rapid prototyping of UbiComp devices Hardware / Kawahara, Yoshihiro / Hodges, Steve / Cook, Benjamin S. / Zhang, Cheng / Abowd, Gregory D. Proceedings of the 2013 International Joint Conference on Pervasive and Ubiquitous Computing 2013-09-08 v.1 p.363-372
ACM Digital Library Link
Summary: This paper introduces a low cost, fast and accessible technology to support the rapid prototyping of functional electronic devices. Central to this approach of 'instant inkjet circuits' is the ability to print highly conductive traces and patterns onto flexible substrates such as paper and plastic films cheaply and quickly. In addition to providing an alternative to breadboarding and conventional printed circuits, we demonstrate how this technique readily supports large area sensors and high frequency applications such as antennas. Unlike existing methods for printing conductive patterns, conductivity emerges within a few seconds without the need for special equipment. We demonstrate that this technique is feasible using commodity inkjet printers and commercially available ink, for an initial investment of around US$300. Having presented this exciting new technology, we explain, for the first time, the tools and techniques we have found useful. Our main research contribution is to characterize the performance of instant inkjet circuits and illustrate a range of possibilities that are enabled by way of several example applications which we have built. We believe that this technology will be of immediate appeal to researchers in the ubiquitous computing domain, since it supports the fabrication of a variety of functional electronic device prototypes.

An interactive belt-worn badge with a retractable string-based input mechanism Papers: displays everywhere / Pohl, Norman / Hodges, Steve / Helmes, John / Villar, Nicolas / Paek, Tim Proceedings of ACM CHI 2013 Conference on Human Factors in Computing Systems 2013-04-27 v.1 p.1465-1468
ACM Digital Library Link
Summary: In this paper we explore a new type of wearable computing device, an interactive identity badge. An embedded LCD presents dynamic information to the wearer and interaction is facilitated by sensing movement of the retractable string which attaches the unit to the wearer's belt. This form-factor makes it possible to interact using a single hand, providing lightweight and immediate access to a variety of information when it's not convenient to pick up, unlock and interact directly with a device like a smartphone. In this paper we present our prototype interactive badge, demonstrate the underlying technology and describe a number of usage scenarios and interaction techniques.

Exploring physical prototyping techniques for functional devices using .NET gadgeteer Demos / Hodges, Steve / Taylor, Stuart / Villar, Nicolas / Scott, James / Helmes, John Proceedings of the 2013 International Conference on Tangible and Embedded Interaction 2013-02-10 p.271-274
ACM Digital Library Link
Summary: In this paper we present a number of different physical construction techniques for prototyping functional electronic devices. Some of these approaches are already well established whilst others are more novel; our aim is to briefly summarize some of the main categories and to illustrate them with real examples. Whilst a number of different tools exist for building working device prototypes, for consistency the examples we present here are all built using the Microsoft .NET Gadgeteer platform. Although this naturally constrains the scope of this study, it also facilitates a basic comparison of the different techniques. Our ultimate aim is to enable others in the field to learn from our experiences and the techniques we present.

Interactive Environment-Aware Handheld Projectors for Pervasive Computing Spaces HCI / Molyneaux, David / Izadi, Shahram / Kim, David / Hilliges, Otmar / Hodges, Steve / Cao, Xiang / Butler, Alex / Gellersen, Hans Proceedings of Pervasive 2012: International Conference on Pervasive Computing 2012-06-18 p.197-215
Keywords: Handheld projection; geometry and spatial awareness; interaction
Link to Digital Content at Springer
Summary: This paper presents two novel handheld projector systems for indoor pervasive computing spaces. These projection-based devices are "aware" of their environment in ways not demonstrated previously. They offer both spatial awareness, where the system infers location and orientation of the device in 3D space, and geometry awareness, where the system constructs the 3D structure of the world around it, which can encompass the user as well as other physical objects, such as furniture and walls. Previous work in this area has predominantly focused on infrastructure-based spatial-aware handheld projection and interaction. Our prototypes offer greater levels of environment awareness, but achieve this using two opposing approaches; the first infrastructure-based and the other infrastructure-less sensing. We highlight a series of interactions including direct touch, as well as in-air gestures, which leverage the shadow of the user for interaction. We describe the technical challenges in realizing these novel systems; and compare them directly by quantifying their location tracking and input sensing capabilities.

.NET Gadgeteer: A Platform for Custom Devices Development Tools and Devices / Villar, Nicolas / Scott, James / Hodges, Steve / Hammil, Kerry / Miller, Colin Proceedings of Pervasive 2012: International Conference on Pervasive Computing 2012-06-18 p.216-233
Link to Digital Content at Springer
Summary: .NET Gadgeteer is a new platform conceived to make it easier to design and build custom electronic devices and systems for a range of ubiquitous and mobile computing scenarios. It consists of three main elements: solder-less modular electronic hardware; object-oriented managed software libraries accessed using a high-level programming language and established development environment; and 3D design and construction tools designed to facilitate a great deal of control over the form factor of the resulting electronic devices. Each of these elements is designed to be accessible to a wide range of people with varying backgrounds and levels of experience and at the same time provide enough flexibility to allow experts to build relatively sophisticated devices and complex systems in less time than they are used to. In this paper we describe the .NET Gadgeteer system in detail for the first time, explaining a number of key design decisions and reporting on its use by new users and experts alike.

Shake'n'sense: reducing interference for overlapping structured light depth cameras Sensory interaction modalities / Butler, D. Alex / Izadi, Shahram / Hilliges, Otmar / Molyneaux, David / Hodges, Steve / Kim, David Proceedings of ACM CHI 2012 Conference on Human Factors in Computing Systems 2012-05-05 v.1 p.1933-1936
ACM Digital Library Link
Summary: We present a novel yet simple technique that mitigates the interference caused when multiple structured light depth cameras point at the same part of a scene. The technique is particularly useful for Kinect, where the structured light source is not modulated. Our technique requires only mechanical augmentation of the Kinect, without any need to modify the internal electronics, firmware or associated host software. It is therefore simple to replicate. We show qualitative and quantitative results highlighting the improvements made to interfering Kinect depth signals. The camera frame rate is not compromised, which is a problem in approaches that modulate the structured light source. Our technique is non-destructive and does not impact depth values or geometry. We discuss uses for our technique, in particular within instrumented rooms that require simultaneous use of multiple overlapping fixed Kinect cameras to support whole room interactions.

KinectFusion: real-time 3D reconstruction and interaction using a moving depth camera 3D / Izadi, Shahram / Kim, David / Hilliges, Otmar / Molyneaux, David / Newcombe, Richard / Kohli, Pushmeet / Shotton, Jamie / Hodges, Steve / Freeman, Dustin / Davison, Andrew / Fitzgibbon, Andrew Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011-10-16 v.1 p.559-568
ACM Digital Library Link
Summary: KinectFusion enables a user holding and moving a standard Kinect camera to rapidly create detailed 3D reconstructions of an indoor scene. Only the depth data from Kinect is used to track the 3D pose of the sensor and reconstruct geometrically precise 3D models of the physical scene in real-time. The capabilities of KinectFusion, as well as the novel GPU-based pipeline, are described in full. Uses of the core system for low-cost handheld scanning, and geometry-aware augmented reality and physics-based interactions are shown. Novel extensions to the core GPU pipeline demonstrate object segmentation and user interaction directly in front of the sensor, without degrading camera tracking or reconstruction. These extensions are used to enable real-time multi-touch interactions anywhere, allowing any planar or non-planar reconstructed physical surface to be appropriated for touch.

Vermeer: direct interaction with a 360° viewable 3D display 3D / Butler, Alex / Hilliges, Otmar / Izadi, Shahram / Hodges, Steve / Molyneaux, David / Kim, David / Kong, Danny Proceedings of the 2011 ACM Symposium on User Interface Software and Technology 2011-10-16 v.1 p.569-576
ACM Digital Library Link
Summary: We present Vermeer, a novel interactive 360° viewable 3D display. Like prior systems in this area, Vermeer provides viewpoint-corrected, stereoscopic 3D graphics to simultaneous users, 360° around the display, without the need for eyewear or other user instrumentation. Our goal is to overcome an issue inherent in these prior systems which -- typically due to moving parts -- restrict interactions to outside the display volume. Our system leverages a known optical illusion to demonstrate, for the first time, how users can reach into and directly touch 3D objects inside the display volume. Vermeer is intended to be a new enabling technology for interaction, and we therefore describe our hardware implementation in full, focusing on the challenges of combining this optical configuration with an existing approach for creating a 360° viewable 3D display. Initially we demonstrate direct in-volume interaction by sensing user input with a Kinect camera placed above the display. However, by exploiting the properties of the optical configuration, we also demonstrate novel prototypes for fully integrated input sensing alongside simultaneous display. We conclude by discussing limitations, implications for interaction, and ideas for future work.

Leveraging conductive inkjet technology to build a scalable and versatile surface for ubiquitous sensing Novel ubiquitous technologies / Gong, Nan-Wei / Hodges, Steve / Paradiso, Joseph A. Proceedings of the 2011 International Conference on Ubiquitous Computing 2011-09-17 p.45-54
ACM Digital Library Link
Summary: In this paper we describe the design and implementation of a new versatile, scalable and cost-effective sensate surface. The system is based on a new conductive inkjet technology, which allows capacitive sensor electrodes and different types of RF antennas to be cheaply printed onto a roll of flexible substrate that may be many meters long. By deploying this surface on (or under) a floor it is possible to detect the presence and whereabouts of users through both passive and active capacitive coupling schemes. We have also incorporated GSM and NFC electromagnetic radiation sensing and piezoelectric pressure and vibration detection. We report on a number of experiments which evaluate sensing performance based on a 2.5m x 0.3m hardware test-bed. We describe some potential applications for this technology and highlight a number of improvements we have in mind.

PreHeat: controlling home heating using occupancy prediction Home and away / Scott, James / Brush, A. J. Bernheim / Krumm, John / Meyers, Brian / Hazas, Michael / Hodges, Stephen / Villar, Nicolas Proceedings of the 2011 International Conference on Ubiquitous Computing 2011-09-17 p.281-290
ACM Digital Library Link
Summary: Home heating is a major factor in worldwide energy use. Our system, PreHeat, aims to more efficiently heat homes by using occupancy sensing and occupancy prediction to automatically control home heating. We deployed PreHeat in five homes, three in the US and two in the UK. In UK homes, we controlled heating on a per-room basis to enable further energy savings. We compared PreHeat's prediction algorithm with a static program over an average 61 days per house, alternating days between these conditions, and measuring actual gas consumption and occupancy. In UK homes PreHeat both saved gas and reduced MissTime (the time that the house was occupied but not warm). In US homes, PreHeat decreased MissTime by a factor of 6-12, while consuming a similar amount of gas. In summary, PreHeat enables more efficient heating while removing the need for users to program thermostat schedules.

Interactive generator: a self-powered haptic feedback device Touch 1: tactile & haptics / Badshah, Akash / Gupta, Sidhant / Cohn, Gabe / Villar, Nicolas / Hodges, Steve / Patel, Shwetak N. Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems 2011-05-07 v.1 p.2051-2054
ACM Digital Library Link
Summary: We present Interactive Generator (InGen), a self-powered wireless rotary input device capable of generating haptic or force feedback without the need for any external power source. Our approach uses a modified servomotor to perform three functions: (1) generating power for wireless communication and embedded electronics, (2) sensing the direction and speed of rotation, and (3) providing force feedback during rotation. While InGen is rotating, the device is capable of providing the sensation of detents or bumps, changes in stiffness, and abrupt stops using only power that is harvested during interaction. We describe the device in detail, demonstrate an initial 'TV remote control' application, and end with a discussion of our experiences developing the prototype and application. To the best of our knowledge, InGen is the first self-powered device that also provides haptic feedback during operation. More broadly, this work demonstrates a new class of input systems that uses human-generated power to provide feedback to the user and wirelessly communicate sensed information.

Prototyping with Microsoft .NET gadgeteer Studios and workshops / Villar, Nicolas / Scott, James / Hodges, Steve Proceedings of the 5th International Conference on Tangible and Embedded Interaction 2011-01-22 p.377-380
ACM Digital Library Link
Summary: Microsoft .NET Gadgeteer is a new prototyping platform that makes it easier to construct, program and shape new kinds of computing objects. It comprises modular hardware, software libraries and 3D CAD support. Together, these elements support the key activities involved both in the rapid prototyping and the small-scale production of custom embedded, interactive and connected devices. We propose to organize and run a studio at TEI 2011 where participants are introduced to the platform and its capabilities. Participants will work in groups, assembling electronic modules, writing software and designing a case or enclosure for their device. The end-result will be that each group develops a fully functional device, which can be exhibited at the TEI demo session.

The peppermill: a human-powered user interface device Bridging the physical and digital worlds / Villar, Nicolas / Hodges, Steve Proceedings of the 4th International Conference on Tangible and Embedded Interaction 2010-01-24 p.29-32
Keywords: human-powered electronics, input devices
ACM Digital Library Link
Summary: A human-powered user interface device sources its power from the physical effort required to operate it. This paper describes a technique by which a geared DC motor and a simple circuit can be used to enable interaction-powered rotary input devices. When turned, the circuit provides a temporary power source for an embedded device, and doubles as a sensor that provides information about the direction and rate of input. As a proof of concept, we have developed a general-purpose wireless input device -- called the Peppermill -- and illustrate its capabilities by using it as a remote control for a multimedia-browsing application.

The peppermill: a human-powered user interface device Demonstrations / Villar, Nicolas / Hodges, Steve Proceedings of the 4th International Conference on Tangible and Embedded Interaction 2010-01-24 p.29-32
Keywords: human-powered electronics, input devices
ACM Digital Library Link
Summary: A human-powered user interface device sources its power from the physical effort required to operate it. This paper describes a technique by which a geared DC motor and a simple circuit can be used to enable interaction-powered rotary input devices. When turned, the circuit provides a temporary power source for an embedded device, and doubles as a sensor that provides information about the direction and rate of input. As a proof of concept, we have developed a general-purpose wireless input device -- called the Peppermill -- and illustrate its capabilities by using it as a remote control for a multimedia-browsing application.

Mouse 2.0: multi-touch meets the mouse Hold me, squeeze me / Villar, Nicolas / Izadi, Shahram / Rosenfeld, Dan / Benko, Hrvoje / Helmes, John / Westhues, Jonathan / Hodges, Steve / Ofek, Eyal / Butler, Alex / Cao, Xiang / Chen, Billy Proceedings of the 2009 ACM Symposium on User Interface Software and Technology 2009-10-04 p.33-42
Keywords: desktop computing, input devices, mouse, multi-touch, novel hardware, surface computing
ACM Digital Library Link
Summary: In this paper we present novel input devices that combine the standard capabilities of a computer mouse with multi-touch sensing. Our goal is to enrich traditional pointer-based desktop interactions with touch and gestures. To chart the design space, we present five different multi-touch mouse implementations. Each explores a different touch sensing strategy, which leads to differing form-factors and hence interactive possibilities. In addition to the detailed description of hardware and software implementations of our prototypes, we discuss the relative strengths, limitations and affordances of these novel input devices as informed by the results of a preliminary user study.

Interactions in the air: adding further depth to interactive tabletops Waiter, can you please bring me a fork? / Hilliges, Otmar / Izadi, Shahram / Wilson, Andrew D. / Hodges, Steve / Garcia-Mendoza, Armando / Butz, Andreas Proceedings of the 2009 ACM Symposium on User Interface Software and Technology 2009-10-04 p.139-148
Keywords: 3D, 3D graphics, computer vision, depth-sensing cameras, holoscreen, interactive surfaces, surfaces, switchable diffusers, tabletop
ACM Digital Library Link
Summary: Although interactive surfaces have many unique and compelling qualities, the interactions they support are by their very nature bound to the display surface. In this paper we present a technique for users to seamlessly switch between interacting on the tabletop surface to above it. Our aim is to leverage the space above the surface in combination with the regular tabletop display to allow more intuitive manipulation of digital content in three-dimensions. Our goal is to design a technique that closely resembles the ways we manipulate physical objects in the real-world; conceptually, allowing virtual objects to be 'picked up' off the tabletop surface in order to manipulate their three dimensional position or orientation. We chart the evolution of this technique, implemented on two rear projection-vision tabletops. Both use special projection screen materials to allow sensing at significant depths beyond the display. Existing and new computer vision techniques are used to sense hand gestures and postures above the tabletop, which can be used alongside more familiar multi-touch interactions. Interacting above the surface in this way opens up many interesting challenges. In particular it breaks the direct interaction metaphor that most tabletops afford. We present a novel shadow-based technique to help alleviate this issue. We discuss the strengths and limitations of our technique based on our own observations and initial user feedback, and provide various insights from comparing and contrasting our tabletop implementations.