Project Jacquard: Interactive Digital Textiles at Scale
Everyday Objects as Interaction Surfaces
Poupyrev, Ivan / Gong, Nan-Wei / Fukuhara, Shiho / Karagozler, Mustafa Emre / Schwesig, Carsten / Robinson, Karen E.
Proceedings of the ACM CHI'16 Conference on Human Factors in Computing
Systems
2016-05-07
v.1
p.4216-4227
© Copyright 2016 ACM
Summary: Project Jacquard presents manufacturing technologies that enable deploying
invisible ubiquitous interactivity at scale. We propose novel interactive
textile materials that can be manufactured inexpensively using existing textile
weaving technology and equipment.
The development of touch-sensitive textiles begins with the design and
engineering of a new highly conductive yarn. The yarns and textiles can be
produced by standard textile manufacturing processes and can be dyed to any
color, made with a number of materials, and designed to a variety of
thicknesses and textures to be consistent with garment designers' needs.
We describe the development of yarn, textiles, garments, and user
interactivity; we present the opportunities and challenges of creating a
manufacturable interactive textile for wearable computing.
"I don't Want to Wear a Screen": Probing Perceptions of and Possibilities
for Dynamic Displays on Clothing
Body and Fashion
Devendorf, Laura / Lo, Joanne / Howell, Noura / Lee, Jung Lin / Gong, Nan-Wei / Karagozler, M. Emre / Fukuhara, Shiho / Poupyrev, Ivan / Paulos, Eric / Ryokai, Kimiko
Proceedings of the ACM CHI'16 Conference on Human Factors in Computing
Systems
2016-05-07
v.1
p.6028-6039
© Copyright 2016 ACM
Summary: This paper explores the role dynamic textile displays play in relation to
personal style: What does it mean to wear computationally responsive clothing
and why would one be motivated to do so? We developed a novel textile display
technology, called Ebb, and created several woven and crochet fabric swatches
that explored clothing-specific design possibilities. We engaged fashion
designers and non-designers in imagining how Ebb would integrate into their
design practice or personal style of dressing. Participants evaluated the
appeal and utility of clothing-based displays according to a very different set
of criteria than traditional screen-based computational displays. Specifically,
the slowness, low-resolution, and volatility of Ebb tended to be seen as assets
as opposed to technical limitations in the context of personal style.
Additionally, participants envisioned various ways that ambiguous, ambient, and
abstract displays of information could prompt new experiences in their everyday
lives. Our paper details the complex relationships between display and personal
style and offers a new design metaphor and extension of Gaver et al.'s original
descriptions of ambiguity in order to guide the design of clothing-based
displays for everyday life.
Demo hour
Demo hour
Karagozler, M. Emre / Poupyrev, Ivan / Fedder, Gary K. / Suzuki, Yuri / Yao, Lining / Niiyama, Ryuma / Ou, Jifei / Follmer, Sean / Ishii, Hiroshi / Brosz, John / Nacenta, Miguel A. / Pusch, Richard / Carpendale, Sheelagh / Hurter, Christophe / Rekimoto, Jun
interactions
2014-05
v.21
n.3
p.6-9
© Copyright 2014 ACM
Summary: UIST is a premier forum for innovations in the software and hardware of
human-computer interfaces. The UIST demo program enables attendees to
experience firsthand the most interesting next-generation user interface
technologies. The UIST 2013 demo program featured technologies ranging from
energy-harvesting interactive paper to pneumatically actuated materials,
providing attendees a vivid preview of some of the interactive systems that
might shape our daily lives in the future. -- Per Ola Kristensson and T. Scott
Saponas, UIST 2013 Demo Chairs
3D printed interactive speakers
DIY and hacking
Ishiguro, Yoshio / Poupyrev, Ivan
Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems
2014-04-26
v.1
p.1733-1742
© Copyright 2014 ACM
Summary: We propose technology for designing and manufacturing interactive 3D printed
speakers. With the proposed technology, sound reproduction can easily be
integrated into various objects at the design stage and little assembly is
required. The speaker can take the shape of anything from an abstract spiral to
a rubber duck, opening new opportunities in product design. Furthermore, both
audible sound and inaudible ultrasound can be produced with the same design,
allowing for identifying and tracking 3D printed objects in space using common
integrated microphones. The design of 3D printed speakers is based on
electrostatic loudspeaker technology first explored in the early 1930s but not
broadly applied until now. These speakers are simpler than common
electromagnetic speakers, while allowing for sound reproduction at 60 dB levels
with arbitrary directivity ranging from focused to omnidirectional. Our
research of 3D printed speakers contributes to the growing body of work
exploring functional 3D printing in interactive applications.
Paper generators: harvesting energy from touching, rubbing and sliding
Video showcase presentations
Dauner, Joanna Maria / Karagozler, Mustafa Emre / Poupyrev, Ivan
Proceedings of ACM CHI 2014 Conference on Human Factors in Computing Systems
2014-04-26
v.2
p.161-162
© Copyright 2014 ACM
Summary: We present a new energy harvesting technology that generates electrical
energy from a user's interaction with paper-like materials. The energy
harvesters are flexible, light, and inexpensive, and they utilize a user's
gestures such as tapping, touching, rubbing and sliding to generate electrical
energy.
The harvested energy is then used to actuate LEDs, e-paper displays and
various other devices to create novel interactive applications, such as
enhancing books and other printed media with interactivity.
AIREAL: tactile interactive experiences in free air
Adjunct 2: sponsor demonstrations
Sodhi, Rajinder / Glisson, Matthew / Poupyrev, Ivan
Adjunct Proceedings of the 2013 ACM Symposium on User Interface Software and
Technology
2013-10-08
v.2
p.25-26
© Copyright 2013 ACM
Summary: AIREAL is a novel haptic technology that delivers effective and expressive
tactile sensations in free air, without requiring the user to wear a physical
device. Combined with interactive computer graphics, AIREAL enables users to
feel virtual 3D objects, experience free air textures and receive haptic
feedback on gestures performed in free space. AIREAL relies on air vortex
generation directed by an actuated flexible nozzle to provide effective tactile
feedback with a 75-degree field of view and an 8.5 cm resolution at a distance
of 1 meter. AIREAL is a scalable, inexpensive and practical free air haptic
technology that can be used in a broad range of applications, including gaming,
mobile applications, and gesture interaction among many others. This paper
reports the details of the AIREAL design and control, experimental evaluations
of the device's performance, as well as an exploration of the application space
of free air haptic displays. Although we used vortices, we believe that the
results reported are generalizable and will inform the design of haptic
displays based on alternative principles of free air tactile actuation.
Lumitrack: low cost, high precision, high speed tracking with projected
m-sequences
Hardware
Xiao, Robert / Harrison, Chris / Willis, Karl D. D. / Poupyrev, Ivan / Hudson, Scott E.
Proceedings of the 2013 ACM Symposium on User Interface Software and
Technology
2013-10-08
v.1
p.3-12
© Copyright 2013 ACM
Summary: We present Lumitrack, a novel motion tracking technology that uses projected
structured patterns and linear optical sensors. Each sensor unit is capable of
recovering 2D location within the projection area, while multiple sensors can
be combined for up to six-degree-of-freedom (DOF) tracking. Our structured
light approach is based on special patterns, called m-sequences, in which any
consecutive sub-sequence of m bits is unique. Lumitrack can utilize both
digital and static projectors, as well as scalable embedded sensing
configurations. The resulting system enables high-speed, high-precision, and
low-cost motion tracking for a wide range of interactive applications. We
detail the hardware, operation, and performance characteristics of our
approach, as well as a series of example applications that highlight its
immediate feasibility and utility.
Paper generators: harvesting energy from touching, rubbing and sliding
Hardware
Karagozler, Mustafa Emre / Poupyrev, Ivan / Fedder, Gary K. / Suzuki, Yuri
Proceedings of the 2013 ACM Symposium on User Interface Software and
Technology
2013-10-08
v.1
p.23-30
© Copyright 2013 ACM
Summary: We present a new energy harvesting technology that generates electrical
energy from a user's interactions with paper-like materials. The energy
harvesters are flexible, light, and inexpensive, and they utilize a user's
gestures such as tapping, touching, rubbing and sliding to generate electrical
energy. The harvested energy is then used to actuate LEDs, e-paper displays and
various other devices to create novel interactive applications, such as
enhancing books and other printed media with interactivity.
PAPILLON: designing curved display surfaces with printed optics
Tangible and fabrication
Brockmeyer, Eric / Poupyrev, Ivan / Hudson, Scott
Proceedings of the 2013 ACM Symposium on User Interface Software and
Technology
2013-10-08
v.1
p.457-462
© Copyright 2013 ACM
Summary: We present a technology for designing curved display surfaces that can both
display information and sense two dimensions of human touch. It is based on 3D
printed optics, where the surface of the display is constructed as a bundle of
printed light pipes that direct images from an arbitrary planar image source
to the surface of the display. This effectively decouples the display surface
and image source, allowing designers to iterate on displays without requiring
changes to the complex electronics and optics of the device. In addition, the
same optical elements also direct light from the surface of the display back to
the image sensor allowing for touch input and proximity detection of a hand
relative to the display surface. The resulting technology is effective in
designing compact, efficient displays of a small size; this has been applied in
the design of interactive animated eyes.
Tactile rendering of 3D features on touch surfaces
Haptics
Kim, Seung-Chan / Israr, Ali / Poupyrev, Ivan
Proceedings of the 2013 ACM Symposium on User Interface Software and
Technology
2013-10-08
v.1
p.531-538
© Copyright 2013 ACM
Summary: We present a tactile-rendering algorithm for simulating 3D geometric
features, such as bumps, on touch screen surfaces. This is achieved by
modulating friction forces between the user's finger and the touch screen,
instead of physically moving the touch surface. We propose that the percept of
a 3D bump is created when local gradients of the rendered virtual surface are
mapped to lateral friction forces. To validate this approach, we first
establish a psychophysical model that relates the perceived friction force to
the controlled voltage applied to the tactile feedback device. We then use this
model to demonstrate that participants are three times more likely to prefer
gradient force profiles over other commonly used rendering profiles. Finally,
we present a generalized algorithm and conclude the paper with a set of
applications using our tactile rendering technology.
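The gradient-to-friction mapping the abstract describes can be sketched in one dimension (a hypothetical example, not the paper's code; the bump shape, gain `k`, and function names are ours): the lateral friction force commanded at finger position x is proportional to the local slope of the virtual surface.

```python
import math

# 1D sketch of gradient-based tactile rendering: lateral friction force
# F(x) ~ k * dh/dx, so the finger feels resistance going "uphill" on a
# virtual bump and assistance going "downhill". All constants illustrative.

def bump_height(x, center=0.5, width=0.1, amp=1.0):
    """Gaussian bump standing in for the virtual 3D feature."""
    return amp * math.exp(-((x - center) ** 2) / (2 * width ** 2))

def lateral_force(x, k=1.0, dx=1e-5):
    """Friction-force command from the numerical height gradient."""
    grad = (bump_height(x + dx) - bump_height(x - dx)) / (2 * dx)
    return k * grad

# Positive (opposing) force approaching the bump, zero at the apex,
# negative (assisting) force past it.
print(round(lateral_force(0.5), 6))  # 0.0 at the apex
```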
Revel: programming the sense of touch
Video showcase presentations
Bau, Olivier / Poupyrev, Ivan / Le Goc, Mathieu / Galliot, Laureline / Glisson, Matthew
Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing
Systems
2013-04-27
v.2
p.2785-2786
© Copyright 2013 ACM
Summary: REVEL is a new wearable tactile technology that modifies the user's tactile
perception of the physical world. Current tactile technologies enhance objects
and devices with various actuators to create rich tactile sensations, limiting
the experience to the interaction with instrumented devices. In contrast, REVEL
can add artificial tactile sensations to almost any surface or object with very
little if any instrumentation of the environment. As a result, REVEL can
provide dynamic tactile sensations on touch screens as well as everyday objects
and surfaces in the environment, such as furniture, walls, wooden and plastic
objects, and even human skin. REVEL can be used in many new and exciting
applications, including adding tactile feedback to projected content, enhancing
the environment with tactile guidance for the visually impaired or providing
personal tactile feedback for multi-user touch surfaces.
Displays take new shape: an agenda for future interactive surfaces
Workshop summaries
Steimle, Jürgen / Benko, Hrvoje / Cassinelli, Alvaro / Ishii, Hiroshi / Leithinger, Daniel / Maes, Pattie / Poupyrev, Ivan
Extended Abstracts of ACM CHI'13 Conference on Human Factors in Computing
Systems
2013-04-27
v.2
p.3283-3286
© Copyright 2013 ACM
Summary: This workshop provides a forum for discussing emerging trends in interactive
surfaces that leverage alternative display types and form factors to enable
more expressive interaction with information. The goal of the workshop is to
push the current discussion forward towards a synthesis of emerging
visualization and interaction concepts in the area of improvised, minimal,
curved and malleable interactive surfaces. By doing so, we aim to generate an
agenda for future research and development in interactive surfaces.
Infusing the physical world into user interfaces
Keynote 2
Poupyrev, Ivan
Proceedings of the 2012 International Conference on Multimodal Interfaces
2012-10-22
p.229-230
© Copyright 2012 ACM
Summary: Advances in new materials and manufacturing techniques are rapidly blending
the computational and physical worlds. With every new turn in technology
development -- e.g., discovering a novel "smart" material, inventing a more
efficient manufacturing process or designing a faster microprocessor -- there
are new and exciting ways to take user interfaces away from the screen and
blend them into our living spaces and everyday objects, making them more
responsive, intelligent and adaptive. As the world around us becomes
increasingly infused with technology, the user interfaces and computers
themselves will disappear into the background, blending into the physical world
around us. Thus, the old tried-and-true paradigms for designing interaction and
interfaces must be re-evaluated, re-designed and, in some cases, even discarded
to take advantage of the new possibilities that these cutting-edge technologies
provide. While the challenges and opportunities are distinct, the fundamental
goal remains the same: to provide for the effortless and effective consumption,
control and transmission of information at any time and in any place, while
delivering a unique experience that is only possible with these emerging
technologies.
In this talk I will present work produced by me and the research group
that I have been directing at Disney Research Pittsburgh. We are addressing
these exciting challenges. This talk will cover projects investigating tactile
and haptics interfaces, deformable computing devices, augmented reality
interfaces and novel touch sensing techniques, as well as biologically-inspired
interfaces, among others. The presentation will cover both projects conducted
while at Sony Corporation and more recent research efforts in the Interaction
Group at Walt Disney Research, Pittsburgh.
Capacitive fingerprinting: exploring user differentiation by sensing
electrical properties of the human body
Tactile & grip
Harrison, Chris / Sato, Munehiko / Poupyrev, Ivan
Proceedings of the 2012 ACM Symposium on User Interface Software and
Technology
2012-10-07
v.1
p.537-544
© Copyright 2012 ACM
Summary: At present, touchscreens can differentiate multiple points of contact, but
not who is touching the device. In this work, we consider how the electrical
properties of humans and their attire can be used to support user
differentiation on touchscreens. We propose a novel sensing approach based on
Swept Frequency Capacitive Sensing, which measures the impedance of a user to
the environment (i.e., ground) across a range of AC frequencies. Different
people have different bone densities and muscle mass, wear different footwear,
and so on. This, in turn, yields different impedance profiles, which allows for
touch events, including multitouch gestures, to be attributed to a particular
user. This has many interesting implications for interactive design. We
describe and evaluate our sensing approach, demonstrating that the technique
has considerable promise. We also discuss limitations, how these might be
overcome, and next steps.
Printed optics: 3D printing of embedded optical elements for interactive
devices
Fabrication & hardware
Willis, Karl / Brockmeyer, Eric / Hudson, Scott / Poupyrev, Ivan
Proceedings of the 2012 ACM Symposium on User Interface Software and
Technology
2012-10-07
v.1
p.589-598
© Copyright 2012 ACM
Summary: We present an approach to 3D printing custom optical elements for
interactive devices labelled Printed Optics. Printed Optics enable sensing,
display, and illumination elements to be directly embedded in the casing or
mechanical structure of an interactive device. Using these elements, unique
display surfaces, novel illumination techniques, custom optical sensors, and
embedded optoelectronic components can be digitally fabricated for rapid, high
fidelity, highly customized interactive devices. Printed Optics is part of our
long-term vision for interactive devices that are 3D printed in their entirety.
In this paper we explore the possibilities for this vision afforded by
fabrication of custom optical elements using today's 3D printing technology.
Touché: touch and gesture sensing for the real world
Demos
Poupyrev, Ivan / Harrison, Chris / Sato, Munehiko
Proceedings of the 2012 International Conference on Ubiquitous Computing
2012-09-05
p.536
© Copyright 2012 ACM
Summary: Touché proposes a novel Swept Frequency Capacitive Sensing technique
that can not only detect a touch event, but also recognize complex
configurations of the human hands and body. Such contextual information
significantly enhances touch interaction in a broad range of applications, from
conventional touchscreens to unique contexts and materials. For example, in our
explorations we add touch and gesture sensitivity to the human body and
liquids. We demonstrate the rich capabilities of Touché with five
example setups from different application domains and conduct experimental
studies that show gesture classification accuracies of 99% are achievable with
our technology.
Touché: enhancing touch interaction on humans, screens, liquids, and
everyday objects
Brain & body
Sato, Munehiko / Poupyrev, Ivan / Harrison, Chris
Proceedings of ACM CHI 2012 Conference on Human Factors in Computing Systems
2012-05-05
v.1
p.483-492
© Copyright 2012 ACM
Summary: Touché proposes a novel Swept Frequency Capacitive Sensing technique
that can not only detect a touch event, but also recognize complex
configurations of the human hands and body. Such contextual information
significantly enhances touch interaction in a broad range of applications, from
conventional touchscreens to unique contexts and materials. For example, in our
explorations we add touch and gesture sensitivity to the human body and
liquids. We demonstrate the rich capabilities of Touché with five
example setups from different application domains and conduct experimental
studies that show gesture classification accuracies of 99% are achievable with
our technology.
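The recognition stage of Swept Frequency Capacitive Sensing can be sketched as follows (a hypothetical illustration: the profiles are synthetic and the nearest-neighbor matcher stands in for the paper's actual classifier, whose details may differ). Each touch yields an impedance profile, one value per swept frequency, and gestures are recognized by matching that profile against labeled examples:

```python
import math

def distance(p, q):
    """Euclidean distance between two impedance profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def classify(profile, labeled):
    """Nearest-neighbor match over stored, labeled profiles."""
    return min(labeled, key=lambda item: distance(profile, item[1]))[0]

# Synthetic 4-point profiles (impedance magnitude at 4 swept frequencies);
# real systems sweep many more frequencies.
training = [
    ("one_finger", [0.9, 0.7, 0.4, 0.2]),
    ("pinch",      [0.8, 0.5, 0.5, 0.4]),
    ("full_grasp", [0.6, 0.3, 0.2, 0.1]),
]

print(classify([0.88, 0.68, 0.41, 0.22], training))  # one_finger
```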
Surround haptics: tactile feedback for immersive gaming experiences
Interactivity presentations
Israr, Ali / Kim, Seung-Chan / Stec, Jan / Poupyrev, Ivan
Extended Abstracts of ACM CHI'12 Conference on Human Factors in Computing
Systems
2012-05-05
v.2
p.1087-1090
© Copyright 2012 ACM
Summary: In this paper we propose an architecture for rendering rich and
high-resolution haptic feedback on the user's body while playing interactive
games. The haptic architecture consists of three main elements, namely, haptic
engine, haptic API/codec, and haptic display. The haptic engine extracts events
from the game, assigns haptic feedback to these events, and sends coded packets
to haptic API/codec. The haptic API/codec translates the coded packets and
computes driving signals based on carefully evaluated algorithms derived from
psychophysical modeling of tactile perception. The driving signals are then
routed to the haptic display embedded with an array of vibratory transducers. A
user feels high resolution and refined tactile sensations on the body through
the display. We have integrated the Surround Haptics system with a driving
simulation game to provide an enjoyable gaming experience.
Tactile feedback on flat surfaces for the visually impaired
Work-in-progress
Israr, Ali / Bau, Olivier / Kim, Seung-Chan / Poupyrev, Ivan
Extended Abstracts of ACM CHI'12 Conference on Human Factors in Computing
Systems
2012-05-05
v.2
p.1571-1576
© Copyright 2012 ACM
Summary: In this paper we introduce a mobile, generic, and inexpensive visuo-tactile
sensory substitution device for the visually impaired. The device helps users
to explore the world around them, by pointing it towards objects of the
environment and rendering tactile information to the objects sensed by a
camera. With the help of two visually impaired participants, we conducted three
preliminary experiments and evaluated the performance of the device in
detecting, reaching and exploring tasks. Both participants were able to detect,
explore and reach for a given object of interest in a controlled room setting
using only the tactile information rendered on the flat panel of the device.
The implication of results and future directions for tactile assistive devices
are discussed.
SideBySide: ad-hoc multi-user interaction with handheld projectors
Mobile
Willis, Karl D. D. / Poupyrev, Ivan / Hudson, Scott E. / Mahler, Moshe
Proceedings of the 2011 ACM Symposium on User Interface Software and
Technology
2011-10-16
v.1
p.431-440
© Copyright 2011 ACM
Summary: We introduce SideBySide, a system designed for ad-hoc multi-user interaction
with handheld projectors. SideBySide uses device-mounted cameras and hybrid
visible/infrared light projectors to track multiple independent projected
images in relation to one another. This is accomplished by projecting invisible
fiducial markers in the near-infrared spectrum. Our system is completely
self-contained and can be deployed as a handheld device without instrumentation
of the environment. We present the design and implementation of our system
including a hybrid handheld projector to project visible and infrared light,
and techniques for tracking projected fiducial markers that move and overlap.
We introduce a range of example applications that demonstrate the applicability
of our system to real-world scenarios such as mobile content exchange, gaming,
and education.
Motionbeam: a metaphor for character interaction with handheld projectors
Non-flat Displays
Willis, Karl D. D. / Poupyrev, Ivan / Shiratori, Takaaki
Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems
2011-05-07
v.1
p.1031-1040
© Copyright 2011 ACM
Summary: We present the MotionBeam metaphor for character interaction with handheld
projectors. Our work draws from the tradition of pre-cinema handheld projectors
that use direct physical manipulation to control projected imagery. With our
prototype system, users interact and control projected characters by moving and
gesturing with the handheld projector itself. This creates a unified
interaction style where input and output are tied together within a single
device. We introduce a set of interaction principles and present prototype
applications that provide clear examples of the MotionBeam metaphor in use.
Finally we describe observations and insights from a preliminary user study
with our system.
Tactile brush: drawing on skin with a tactile grid display
Touch 1: tactile & haptics
Israr, Ali / Poupyrev, Ivan
Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems
2011-05-07
v.1
p.2019-2028
© Copyright 2011 ACM
Summary: Tactile Brush is an algorithm that produces smooth, two-dimensional tactile
moving strokes with varying frequency, intensity, velocity and direction of
motion. The design of the algorithm is derived from the results of
psychophysical investigations of two tactile illusions -- apparent tactile
motion and phantom sensations. Combined together they allow for the design of
high-density two-dimensional tactile displays using sparse vibrotactile arrays.
In a series of experiments and evaluations we demonstrate that Tactile Brush is
robust and can reliably generate a wide variety of moving tactile sensations
for a broad range of applications.
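The phantom (funneling) illusion underlying such sparse-array rendering can be sketched with an energy-preserving intensity split (a hedged illustration of the general model as we understand it; treat the function names and constants as ours): a virtual actuator between two physical ones is rendered by dividing amplitude so that perceived energy is conserved.

```python
import math

# Phantom-sensation sketch: a virtual vibration at fractional position b
# between two physical actuators is rendered with amplitudes
#   A1 = sqrt(1 - b) * Av,   A2 = sqrt(b) * Av,
# so that A1**2 + A2**2 == Av**2 (perceived energy is preserved).

def phantom_amplitudes(b, a_virtual=1.0):
    """Split a virtual amplitude across two actuators, b in [0, 1]."""
    a1 = math.sqrt(1.0 - b) * a_virtual
    a2 = math.sqrt(b) * a_virtual
    return a1, a2

a1, a2 = phantom_amplitudes(0.25)  # virtual point 1/4 of the way across
print(round(a1 * a1 + a2 * a2, 6))  # 1.0  (energy conserved)
```

Sweeping b from 0 to 1 over time, while also staggering actuator onsets, is what produces the smooth moving-stroke percept the abstract describes.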
Tactile display for the visually impaired using TeslaTouch
Interactivity 1
Xu, Cheng / Israr, Ali / Poupyrev, Ivan / Bau, Olivier / Harrison, Chris
Proceedings of ACM CHI 2011 Conference on Human Factors in Computing Systems
2011-05-07
v.2
p.317-322
© Copyright 2011 ACM
Summary: TeslaTouch is a technology that provides tactile sensation to moving fingers
on touch screens. Based on TeslaTouch, we have developed applications for the
visually impaired to interpret and create 2D tactile information. In this
paper, we demonstrate these applications, present observations from the
interaction, and discuss TeslaTouch's potential in supporting communication
among visually impaired individuals.
Sensing through structure: designing soft silicone sensors
Sensing and interaction
Slyper, Ronit / Poupyrev, Ivan / Hodgins, Jessica
Proceedings of the 5th International Conference on Tangible and Embedded
Interaction
2011-01-22
p.213-220
© Copyright 2011 ACM
Summary: We present a method for designing and constructing rugged and soft
multi-point sensors. Interactions applied to a soft material are reduced to
structural units of deformation. These structures can then be embedded and
instrumented anywhere inside a soft sensor. This simplification lets us design
complex, durable sensors in easily manufacturable ways. In particular, we
present a construction method of layering electronics between silicone pours to
easily create sensors for arbitrary combinations of these deformations. We
present several prototype sensors and discuss applications including toys,
games, and therapy.
Second international workshop on organic user interfaces
Studios and workshops
Girouard, Audrey / Vertegaal, Roel / Poupyrev, Ivan
Proceedings of the 5th International Conference on Tangible and Embedded
Interaction
2011-01-22
p.381-384
© Copyright 2011 ACM
Summary: Advances in display, sensor and actuator technology are changing the field
of TEI, and opening new research areas. While modern interfaces have been
designed for traditional planar and static display devices, next-generation UIs
allow digital objects to change their shape and embed displays anywhere.
Fitting into the paradigm of Organic User Interfaces, these developments
require us to reexamine and reevaluate some of the basic design principles and
interaction styles currently used. This Second International Workshop on
Organic User Interfaces will bring together experts to discuss, brainstorm and
prototype the next generation of user interfaces.