
Proceedings of the 2013 International Conference on Tangible and Embedded Interaction

Fullname: Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction
Editors: Sergi Jordà; Narcis Parés
Location: Barcelona, Spain
Dates: 2013-Feb-10 to 2013-Feb-13
Standard No: ISBN 978-1-4503-1898-3; ACM DL: Table of Contents; hcibib: TEI13
Links: Conference Website
  1. Physical objective
  2. Learning and education
  3. Material world
  4. Cultural perspectives
  5. Compare and contrast
  6. Getting mobile
  7. Gesture & toolkits
  8. Specific user groups
  9. Demos
  10. Graduate student consortium
  11. Arts track
  12. Studios

Physical objective

SMSlingshot: an expert amateur DIY case study, pp. 9-16
  Patrick Tobias Fischer; Eva Hornecker; Christian Zoellner
This paper discusses the design process of VR/Urban's public tangible interface SMSlingshot, a real-time system for urban interventions on Media Façades, which we have exhibited in the last few years around the world. In this case study we investigate how the design collaboration between technologists and industrial designers contributed to the success of the urban intervention. The design process of this 'product' has many DIY aspects, with professional industrial designers and technologists becoming expert amateurs, often dealing with problems that pushed them outside of their professional comfort zone. Don't be afraid of being an amateur!
Ninja track: design of electronic toy variable in shape and flexibility, pp. 17-24
  Yuichiro Katsumoto; Satoru Tokuhisa; Masa Inakage
In this paper, we introduce a design for an electronic toy that is variable in shape and flexibility by using a structural object called "Ninja Track," a belt-shaped object that consists of ABS parts hinged both longitudinally and transversely. When lying flat, Ninja Track is adequately flexible. When the user folds Ninja Track at the longitudinal hinges, it loses its flexibility and becomes a rigid stick. We have created two types of electronic toys as applications of this structure. During the toy prototyping process, we discovered five interactional considerations for Ninja Track. While showing the toys at public exhibitions over a period of two weeks, we discovered a few problems with Ninja Track, and we have implemented solutions to them.
Affective touch gesture recognition for a furry zoomorphic machine, pp. 25-32
  Anna Flagg; Karon MacLean
Over the last decade, the surprising fact has emerged that machines can possess therapeutic power. Due to the many healing qualities of touch, one route to such power is through haptic emotional interaction, which requires sophisticated touch sensing and interpretation. We explore the development of touch recognition technologies in the context of a furry artificial lap-pet, with the ultimate goal of creating therapeutic interactions by sensing human emotion through touch. In this work, we build upon a previous design for a new type of fur-based touch sensor. Here, we integrate our fur sensor with a piezoresistive fabric location/pressure sensor, and adapt the combined design to cover a curved creature-like object. We then use this interface to collect synchronized time-series data from the two sensors, and perform machine learning analysis to recognize 9 key affective touch gestures. In a study of 16 participants, our model averages 94% recognition accuracy when trained on individuals, and 86% when applied to the combined set of all participants. The model can also recognize which participant is touching the prototype with 79% accuracy. These results promise a new generation of emotionally intelligent machines, enabled by affective touch gesture recognition.
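The recognition pipeline Flagg and MacLean describe (time-series sensor data, feature extraction, supervised classification) can be approximated in miniature. The sketch below is an illustrative simplification, not the authors' method: the two gesture simulators, the three summary features, and the nearest-centroid classifier are all assumptions chosen for brevity, standing in for real fur/pressure sensor streams and a stronger learner.

```python
import math
import random

def features(samples):
    """Summarize a pressure time series: mean level, variance, and a
    spikiness measure (mean absolute frame-to-frame change)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    jerk = sum(abs(b - a) for a, b in zip(samples, samples[1:])) / (n - 1)
    return (mean, var, jerk)

def nearest_centroid(train, query):
    """train: {label: [feature tuples]}. Classify query by Euclidean
    distance to each label's mean feature vector."""
    best, best_d = None, float("inf")
    for label, feats in train.items():
        dim = len(feats[0])
        centroid = [sum(f[i] for f in feats) / len(feats) for i in range(dim)]
        d = math.dist(centroid, query)
        if d < best_d:
            best, best_d = label, d
    return best

random.seed(1)

def stroke():  # hypothetical "stroke": slow, smooth pressure variation
    return [0.5 + 0.3 * math.sin(t / 10) + random.gauss(0, 0.01) for t in range(60)]

def pat():     # hypothetical "pat": rapid on/off pressure bursts
    return [(0.9 if (t // 5) % 2 == 0 else 0.05) + random.gauss(0, 0.01)
            for t in range(60)]

train = {"stroke": [features(stroke()) for _ in range(10)],
         "pat":    [features(pat()) for _ in range(10)]}
print(nearest_centroid(train, features(stroke())))  # stroke
print(nearest_centroid(train, features(pat())))     # pat
```

Training per individual, as in the paper's 94% condition, amounts to building `train` from one participant's demonstrations only.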
TabletopCars: interaction with active tangible remote controlled cars, pp. 33-40
  Chi Tai Dang; Elisabeth André
In this paper, we report on the development of the competitive tangible tabletop game TabletopCars, which combines the virtual world with the physical world. We brought together micro-scale radio-controlled cars as active tangibles with an interactive tabletop surface to realize the game. Furthermore, we included Microsoft Kinect depth sensing as an interaction mode for embedded and embodied interaction. Our aim was to investigate the possibilities that emerge through the augmentation capabilities of interactive tabletops for creating novel game concepts and the interaction modes that novel input devices facilitate. This work presents TabletopCars as a testbed for embedded and embodied interaction and describes the system in detail. Finally, we report on a preliminary user study where users controlled the active tangible micro-scale cars through hand gestures.

Learning and education

A multimodal approach to examining 'embodiment' in tangible learning environments, pp. 43-50
  Sara Price; Carey Jewitt
Tangible and multitouch technologies offer new opportunities for physically interacting with objects and digital representations, foregrounding the role of the body in interaction and learning. Developing effective methodologies for examining real time cognition and action, and the relationship between embodiment, interaction and learning is challenging. This paper draws on multimodality, with its emphasis on examining the use of multiple semiotic resources for meaning making, to examine the differential use of semiotic resources by pairs of students interacting with a tangible learning environment. Specifically the analysis details the role of body position, gaze, manipulation and speech in shaping interaction. The analysis illustrates the interaction between these modes and 'multimodal action flow', particularly in terms of pace, rhythm and interaction structure, and the implications of this for interaction and the meaning making process.
The digital dream lab: tabletop puzzle blocks for exploring programmatic concepts, pp. 51-56
  Hyunjoo Oh; Anisha Deshmane; Feiran Li; Ji Yeon Han; Matt Stewart; Michael Tsai; Xing Xu; Ian Oakley
Tangible interaction links digital data with physical forms to support embodied use. Puzzle pieces, with their inherent physical syntax of connectable elements, provide a powerful and expressive metaphor on which to construct such tangible systems. Prior work has explored this potential in the domain of edutainment systems for children aimed at tasks such as learning logic, programming or organizational skills. Although this work is promising, it has largely focused on relatively advanced concepts and children of ages 7-12 years. The work presented in this paper adopts the same perspective but focuses on young children (5 and under) and a simpler range of concepts relating to the clustering and manipulation of data. To achieve this it presents the design (including results from a series of six formative field studies) and implementation of the Digital Dream Lab tabletop puzzle block system. This system, intended for installation in a museum, engages young children (aged 4-5) to explore simple programmatic concepts and the link between the physical and virtual world. The paper closes with design recommendations for future work targeting this goal, setting and age group.
FireFlies: physical peripheral interaction design for the everyday routine of primary school teachers, pp. 57-64
  Saskia Bakker; Elise van den Hoven; Berry Eggen
This paper presents a research-through-design study into interactive systems for a primary school setting to support teachers' everyday tasks. We developed an open-ended interactive system called FireFlies, which is intended to be interacted with in the periphery of the teacher's attention and thereby become an integral part of everyday routines. FireFlies uses light-objects and audio as a (background) information display. Furthermore, teachers can manipulate the light and audio through physical interaction. A working prototype of FireFlies was deployed in four different classrooms for six weeks. Qualitative results reveal that all teachers found a relevant way of working with FireFlies, which they applied every day of the evaluation. After the study had ended and the systems were removed from the schools, the teachers kept reaching for the devices and mentioned they missed FireFlies, which shows that it had become part of their everyday routine.
Comparing motor-cognitive strategies for spatial problem solving with tangible and multi-touch interfaces, pp. 65-72
  Alissa N. Antle; Sijie Wang
We present the results from a mixed methods comparison of a tangible and a multi-touch interface for a spatial problem solving task. We applied a modified version of a previous framework to code video of hand-based events. This enabled us to investigate motor-cognitive strategies as well as traditional performance and preference constructs. Sixteen adult participants completed jigsaw puzzles using both interfaces. Our results suggest that the 3D manipulation space, eyes-free tactile feedback, and the offline workspace afforded by the tangible interface enabled more efficient and effective motor-cognitive strategies. We discuss the implications of these findings for interface design; including suggestions for spatial and visual structures that may support epistemic strategies, and hybrid interfaces where tangible handles may be used as structural anchors as well as controls and representational objects.

Material world

A design space for ephemeral user interfaces, pp. 75-82
  Tanja Döring; Axel Sylvester; Albrecht Schmidt
In this paper, we present the novel concept of ephemeral user interfaces. Ephemeral user interfaces contain at least one user interface (UI) element that is intentionally created to last for a limited time only and typically incorporate materials that evoke a rich and multisensory perception, such as water, fire, soap bubbles or plants. We characterize the term "ephemeral user interface" and, based on a review of existing user interfaces that fall into this research area but have not been discussed under one common term before, we present a design space for ephemeral user interfaces providing a terminology for (a) materials for ephemeral UI elements, (b) interaction and (c) aspects of ephemerality. This paper contributes to the ongoing research on materiality of user interfaces as well as on conceptualizing visionary interaction styles with novel materials.
Microcontrollers as material: crafting circuits with paper, conductive ink, electronic components, and an "untoolkit", pp. 83-90
  David A. Mellis; Sam Jacoby; Leah Buechley; Hannah Perner-Wilson; Jie Qi
Embedded programming is typically made accessible through modular electronics toolkits. In this paper, we explore an alternative approach, combining microcontrollers with craft materials and processes as a means of bringing new groups of people and skills to technology production. We have developed simple and robust techniques for drawing circuits with conductive ink on paper, enabling off-the-shelf electronic components to be embedded directly into interactive artifacts. We have also developed a set of hardware and software tools -- an instance of what we call an "untoolkit" -- to provide an accessible toolchain for the programming of microcontrollers. We evaluated our techniques in a number of workshops, one of which is detailed in the paper. Four broader themes emerge: accessibility and appeal, the integration of craft and technology, microcontrollers vs. electronic toolkits, and the relationship between programming and physical artifacts. We also expand more generally on the idea of an untoolkit, offering a definition and some design principles, and suggest potential areas for future research.
Empowering materiality: inspiring the design of tangible interactions, pp. 91-98
  Magdalena Schmid; Sonja Rümelin; Hendrik Richter
Tangible user interfaces utilize our ability to interact with everyday objects in order to manipulate virtual data. Designers and engineers usually follow the rule "form follows function": they support an existing interaction with a purpose-built interface. Still, we do not fully exploit the expressiveness of forms, materials and shapes of the nondigital objects we interact with. Therefore, we propose to invert the design process: we empower materiality to inspire the implementation of tangible interactions. Glass objects were chosen as an example of culturally and structurally rich objects: in a three-month workshop, these glass objects were transformed into interactive artefacts. In the paper, we present three resulting contributions: First, we describe our inverted design process as a tool for the stimulation of multidisciplinary development. Second, we derive a list of material-induced interactions. Third, we suggest form-related interactions as a means of designing future tangible interfaces.
Actuating mood: design of the textile mirror, pp. 99-106
  Felecia Davis; Asta Roseway; Erin Carroll; Mary Czerwinski
In his 1962 short story, "The Thousand Dreams of Stellavista," sci-fi author J. G. Ballard describes a future in which "psychotropic" homes exist and are designed to "feel and react" to the emotions of their occupants [1]. Today with the rise of affective computing, and advancements in e-textiles, smart materials and sensor technologies, we must consider the ramifications of technology that could actively mirror, alter and transform our feelings through the materials that make up our buildings and environments. This work provides discussion and insight around the binding of material and sensor technologies with affect. We investigate how emotions could be mapped to our environment through textiles. We present two online surveys designed to enable people to map emotions to textiles. We then use the results of these surveys to inform and inspire the design of the Textile Mirror prototype, a 60x92 cm wall panel that is composed of industrial felt interlaced with Nitinol wire, and is designed to shift its textural structure upon receiving emotional signals from its viewer.

Cultural perspectives

From big data to insights: opportunities and challenges for TEI in genomics, pp. 109-116
  Orit Shaer; Ali Mazalek; Brygg Ullmer; Miriam Konkel
The combination of advanced genomic technologies and computational tools enables researchers to conduct large-scale experiments that answer biological questions in unprecedented ways. However, interaction tools in this area currently remain immature. We propose that tangible, embedded, and embodied interaction (TEI) offers unique opportunities for enhancing discovery and learning in genomics. Also, designing for problems in genomics can help move forward the theory and practice of TEI. We present challenges and key questions for TEI research in genomics, lessons learned from three case studies, and potential areas of focus for TEI research and design.
The role of cultural forms in tangible interaction design, pp. 117-124
  Michael S. Horn
I suggest an approach to tangible interaction design that builds on social and cultural foundations. Specifically, I propose that designers can evoke cultural forms as a means to tap into users' existing cognitive, physical, and emotional resources. The emphasis is less on improving the usability of an interface and more on improving the overall experience around an interactive artifact by cueing productive patterns of social activity. My use of the term cultural form is derived from the work of Geoffrey Saxe and his form-function shift framework. This framework describes a process through which individuals appropriate cultural forms and restructure them to serve new functions in light of shifting goals and expectations. I describe Saxe's framework and then illustrate the use of cultural forms in design with three examples.
Magical realities in interaction design, pp. 125-128
  Majken Kirkegaard Rasmussen
The field of interaction design is littered with examples of artefacts that seemingly do not adhere to well-known physical causalities and our innate expectations of how artefacts should behave in the world, thereby creating the impression of a magical reality, where things can float in mid-air, the usually inanimate TV can become animate, two separate objects can become physically connected, and we can move objects with our mind. The paper presents Subbotsky's [21] four types of magical causalities: mind-over-matter magic, animation magic, nonpermanence magic and sympathetic magic, as a way to reflect upon the magical realities constructed by technological artefacts.
Embodiment: auditory visual enhancement of interactive environments, pp. 129-136
  Richard Salmon; Garth Paine
This paper reflects upon two experimental projects, implemented as auditory visual (AV) augmentation of the Articulated Head (AH) [9] which is an interactive robotic installation exhibited in the Power House Museum (PHM) [25], Ultimo, Sydney, Australia. Research participant Video Cued Recall (VCR) interviews and subsequent Interpretative Phenomenological Analysis (IPA) [28] indicate that the experimental projects did have some impact upon audience engagement. We discuss mediating considerations and constraints, which explicate confounding and compromising aspects of the experimental designs and their presentation to the interacting audience.
   Within the context of embodiment, the critical importance that dimensional layout and display have upon the effectiveness of audio visual aids and the strength of spatio-temporal contextualizing cues in relatively unconstrained interactive public exhibition spaces is considered.
   Conclusions contribute a refined experimental project design, aimed at expediting more encouraging participant reportage of the enhancement of engagement in Human Computer Interaction (HCI) with this, and similar types of interactive installation exhibits.

Compare and contrast

On interface closeness and problem solving, pp. 139-146
  Thomas J. Donahue; G. Michael Poor; Martez E. Mott; Laura Marie Leventhal; Guy Zimmerman; Dale Klopfer
Prior research suggests that "closer" interface styles, such as touch and tangible, would yield poorer performance on problem solving tasks as a result of their more natural interaction style. However, virtually no empirical investigations have been conducted to test this assumption. In this paper we describe an empirical study, comparing three interfaces, varying in closeness (mouse, touchscreen, and tangible) on a novel abstract problem solving task. We found that the tangible interface was significantly slower than both the mouse and touch interfaces. However, the touch and tangible interfaces were significantly more efficient than the mouse interface in problem solving across a number of measures. Overall, we found that the touch interface condition offered the best combination of speed and efficiency; in general, the closer interfaces offer significant benefit over the traditional mouse interface on abstract problem solving.
Supporting offline activities on interactive surfaces, pp. 147-154
  Augusto Esteves; Michelle Scott; Ian Oakley
This paper argues that inherent support for offline activities -- activities that are not sensed by the system -- is one of the strongest benefits of tangible interaction over more traditional interface paradigms. By conducting two studies with single and paired users on a simple tangible tabletop scheduling application, this paper explores how tabletop interfaces could be designed to better support such offline activities. To focus its exploration, it looks at offline activities in terms of how they support cognitive work, such as aiding exploration of problem spaces or lowering task complexity. This paper concludes with insights relating to the form, size, and location for spaces that afford offline actions, and also the design of tangible tokens themselves.
Reach across the boundary: evidence of physical tool appropriation following virtual practice, pp. 155-158
  Ali Mazalek; Timothy N. Welsh; Michael Nitsche; Connor Reid; Paul Clifton; Fred Leighton; Kai Tan
Our research explores the connection between physical and virtual tools. This work is based on research from the cognitive sciences showing that physical and virtual tool use extends our brain's representation of peripersonal space to include the tool. These findings led us to investigate if tool appropriation transfers from virtual to physical tools. The present paper reports the results of a study that revealed that manipulating a tool in the virtual space is sufficient to induce appropriation of a similar physical tool. These results have implications for interaction design in training and simulation applications.
Drag and drop the apple: the semantic weight of words and images in touch-based interaction, pp. 159-166
  Ilhan Aslan; Martin Murer; Verena Fuchsberger; Andrew Fugard; Manfred Tscheligi
In this paper we report a user study to investigate the effect of semantic weight in a touch-based drag and drop task. The study was motivated by our own interest in exploring potential factors that influence touch behavior and supported by results in related neuroscience research. The question we intended to answer is: "Do people drag the representation of a smaller and lighter real-world object (e.g. an apple) differently than the representation of a heavier and larger real-world object (e.g. a car)?". Participants were asked to perform a drag and drop task repeatedly on a tablet device. Dragged objects were the same physical size on screen, but represented real-world objects that were either heavy and large or light and small. We studied two representation modalities (i.e. image and text). In both representation modalities, semantically heavier objects were dragged significantly faster than semantically lighter objects.
Physical games or digital games?: comparing support for mental projection in tangible and virtual representations of a problem-solving task, pp. 167-174
  Augusto Esteves; Elise van den Hoven; Ian Oakley
This paper explores how different interfaces to a problem-solving task affect how users perform it. Specifically, it focuses on a customized version of the game of Four-in-a-row and compares play on a physical, tangible game board with that conducted in mouse and touch-screen driven virtual versions. This is achieved through a repeated measures study involving a total of 36 participants and which explicitly assesses aspects of cognitive work through measures of task time, subjective workload, the projection of mental constructs onto external structures and the occurrence of explanatory epistemic actions. The results highlight the relevance of projection and epistemic action to this problem-solving task and suggest that the different interface forms afford instantiation of these activities in different ways. The tangible version of the system supports the most rapid execution of these actions and future work on this topic should explore the unique advantages of tangible interfaces in supporting epistemic actions.

Getting mobile

Unifone: designing for auxiliary finger input in one-handed mobile interactions, pp. 177-184
  David Holman; Andreas Hollatz; Amartya Banerjee; Roel Vertegaal
We present Unifone, a prototype mobile device that explores the use of auxiliary finger input in one-handed mobile interaction. Using force-distributed pressure sensing along the side of the device, we examine how squeeze-based gestures impact four common mobile interactions: scrolling, map navigating, text formatting, and application switching. We evaluated the use of Unifone in these tasks using one-handed interactions by the non-dominant hand. Our user study shows that one-handed isometric gestures perform best when they augment rather than restrict or alter the primary pointing action of the thumb and, generally, are suitable for coarse isometric pressure input.
Tickle: a surface-independent interaction technique for grasp interfaces, pp. 185-192
  Katrin Wolf; Robert Schleicher; Sven Kratz; Michael Rohs
We present a wearable interface that consists of motion sensors. As the interface can be worn on the user's finger (as a ring) or fixed to it (with nail polish), the device controlled by finger gestures can be any generic object, provided it has an interface for receiving the sensor's signal. We implemented four gestures: tap, release, swipe, and pitch, all of which can be executed with a finger of the hand holding the device. In a user study we tested gesture appropriateness for the index finger at the back of a handheld tablet that offered three different form factors on its rear: flat, convex, and concave (undercut). For all three shapes, gesture performance was equally good; however, pitch performed better than swipe on all surfaces. The proposed interface is a step towards the idea of ubiquitous computing and the vision of seamless interactions with grasped objects. As an initial application scenario we implemented a camera control that allows the brightness to be configured using our tested gestures on a common SLR device.
FlexView: an evaluation of depth navigation on deformable mobile devices, pp. 193-200
  Jesse Burstyn; Amartya Banerjee; Roel Vertegaal
We present FlexView, a set of interaction techniques for Z-axis navigation on touch-enabled flexible mobile devices. FlexView augments touch input with bend to navigate through depth-arranged content. To investigate Z-axis navigation with FlexView, we measured document paging efficiency using touch, against two forms of bend input: bending the side of the display (leafing) and squeezing the display (squeezing). In addition to moving through the Z-axis, the second experiment added X-Y navigation in a pan-and-zoom task. Pinch gestures were compared to squeezing and leafing for zoom operations, while panning was consistently performed using touch. Our experiments demonstrate that bend interaction is comparable to touch input for navigation through stacked content. Squeezing to zoom recorded the fastest times in the pan-and-zoom task. Overall, FlexView allows users to easily browse depth arranged information spaces without sacrificing traditional touch interactions.
cubble: a multi-device hybrid approach supporting communication in long-distance relationships, pp. 201-204
  Robert Kowalski; Sebastian Loehmann; Doris Hausen
Couples in long-distance relationships (LDR) want to keep in touch, share emotions and feel connected despite the geographical distance. Current approaches to solve this problem include dedicated objects, common communication channels and mobile applications (apps). To combine the advantages of all three approaches, this paper introduces a hybrid approach called cubble. cubble enables partners to share their emotions, simple messages and remote presence. The prototype offers color signals augmented with vibration patterns and thermal feedback. We performed qualitative user explorations, which showed that users favor the hybrid communication concept and felt that it fostered their intimate communication by providing emotional closeness.

Gesture & toolkits

GestureAgents: an agent-based framework for concurrent multi-task multi-user interaction, pp. 207-214
  Carles F. Julià; Nicolas Earnshaw; Sergi Jordà
While the HCI community has been putting a lot of effort into creating physical interfaces for collaboration, studying multi-user interaction dynamics and creating specific applications to support (and test) this kind of phenomena, it has not addressed the problems implied in having multiple applications share the same interactive space. Having an ecology of rich interactive programs sharing the same interfaces poses questions on how to deal with interaction ambiguity in a cross-application way and still allow different programmers the freedom to program rich unconstrained interaction experiences.
   This paper describes GestureAgents, a framework demonstrating several techniques that can be used to coordinate different applications in order to support concurrent multi-user multi-tasking interaction while still dealing with gesture ambiguity across multiple applications.
Enclosed: a component-centric interface for designing prototype enclosures, pp. 215-218
  Christian Weichel; Manfred Lau; Hans Gellersen
This paper explores the problem of designing enclosures (or physical cases) that are needed for prototyping electronic devices. We present a novel interface that uses electronic components as handles for designing the 3D shape of the enclosure. We use the .NET Gadgeteer platform as a case study of this problem, and implemented a proof-of-concept system for designing enclosures for Gadgeteer components. We show examples of enclosures designed and fabricated with our system.
Demonstration-based vibrotactile pattern authoring, pp. 219-222
  Kyungpyo Hong; Jaebong Lee; Seungmoon Choi
We propose a vibrotactile pattern authoring method based on the user's demonstration, where a device generates a vibrotactile pattern by imitating the touch input pattern of the user. The usability of our demonstration-based method is also compared with the conventional waveform-based authoring. The results showed that our demonstration-based authoring delivered comparable or superior performance to the waveform-based method, depending on the specific design tasks.
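The core idea of demonstration-based authoring, a device imitating the user's touch input, can be sketched as a small transformation from a recorded touch trace to a vibration amplitude envelope. This is an illustrative guess at one possible mapping, not the authors' implementation: the speed-to-amplitude rule, the function name, and the resampling rate are all assumptions.

```python
import math

def touch_to_vibration(trace, out_hz=100):
    """trace: list of (t_seconds, x, y) touch samples from a demonstration,
    with t starting at 0. Returns a list of vibration amplitudes in [0, 1]
    sampled at out_hz, with amplitude proportional to instantaneous finger
    speed (an assumed mapping for illustration)."""
    # instantaneous speed over each segment between consecutive samples
    segs = []
    for (t0, x0, y0), (t1, x1, y1) in zip(trace, trace[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue
        segs.append((t0, t1, math.hypot(x1 - x0, y1 - y0) / dt))
    if not segs:
        return []
    peak = max(s for _, _, s in segs) or 1.0
    # resample onto a uniform output grid, normalized so the fastest
    # movement in the demonstration maps to full vibration strength
    t_end = segs[-1][1]
    env = []
    for i in range(int(t_end * out_hz)):
        t = i / out_hz
        amp = 0.0
        for t0, t1, s in segs:
            if t0 <= t < t1:
                amp = s / peak
                break
        env.append(amp)
    return env

# a 0.5 s demonstration: slow drag, then a fast flick at the end
trace = [(0.0, 0, 0), (0.25, 10, 0), (0.5, 60, 0)]
env = touch_to_vibration(trace)
print(len(env), max(env))  # 50 1.0
```

The resulting envelope would then drive an actuator at the device's vibration frequency, which is the part the user evaluates against the waveform-based baseline in the study.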
exTouch: spatially-aware embodied manipulation of actuated objects mediated by augmented reality, pp. 223-228
  Shunichi Kasahara; Ryuma Niiyama; Valentin Heun; Hiroshi Ishii
As domestic robots and smart appliances become increasingly common, they require a simple, universal interface to control their motion. Such an interface must support simple selection of a connected device, highlight its capabilities and allow for intuitive manipulation. We propose "exTouch", an embodied spatially-aware approach to touching and controlling devices through an augmented reality mediated mobile interface. The "exTouch" system extends the user's touchscreen interactions into the real world by enabling spatial control over the actuated object. When users touch a device shown in live video on the screen, they can change its position and orientation through multi-touch gestures or by physically moving the screen in relation to the controlled object. We demonstrate that the system can be used for applications such as an omnidirectional vehicle, a drone, and moving furniture for a reconfigurable room.

Specific user groups

Designing interactive content with blind users for a perceptual supplementation system, pp. 229-236
  Matthieu Tixier; Charles Lenay; Gabrielle Le Bihan; Olivier Gapenne; Dominique Aubert
With the spread of ICT and the Internet during the last two decades, more and more tools rely on graphical interfaces that make them scarcely accessible to the visually impaired. The ITOIP project aims at developing the use of a tactile perceptual supplementation system called Tactos. A partnership with a visually impaired persons (VIP) association allowed us to conduct a participatory design approach intended to gather a first community of users around our system. This article reports on the design approach we implemented in order to develop usable and useful applications for VIP users. Through a rapid prototyping process we address the development of the use of our technology with blind user representatives. We present the interaction and use principles highlighted by the design of three Tactos applications: a tutorial, a street map exploration system and a country-level map application.
Wrapping up LinguaBytes, for now BIBAFull-Text 237-244
  Bart Hengeveld; Caroline Hummels; Hans van Balkom; Riny Voort; Jan de Moor
In this paper we present the final research prototype of LinguaBytes, a tangible interface aimed at stimulating the language development of non-speaking or hardly speaking children between 1 and 4 years old. LinguaBytes was developed in a three-year Research through Design process in which five incremental prototypes were designed, built and evaluated in real-life settings. We present the original starting points of the project, describe our method and illustrate the resulting end-design using example scenarios of use. We give an overview of the most significant findings after ten months of evaluation, after which we reflect on the original starting points and assess whether they hold up.
Touch-screens are not tangible: fusing tangible interaction with touch glass in readers for the blind BIBAFull-Text 245-253
  Yasmine N. El-Glaly; Francis Quek; Tonya Smith-Jackson; Gurjot Dhillon
In this paper we introduce the idea of making the touch surfaces of mobile devices (e.g. touch phones and tablets) truly tangible for Individuals with Blindness or Severe Visual Impairment (IBSVI). We investigate how to enable IBSVI to fuse tangible landmark patterns with the layout of a page and the location of lexical elements -- words, phrases, and sentences. We designed a tactile overlay that gives tangible feedback to IBSVI when using touch devices for reading. The overlay was tested in a usability study, and the results showed the role of tangibility in improving the accessibility of touch devices and supporting reading for IBSVI.

Demos

Designing tangible magnetic appcessories BIBAFull-Text 255-258
  Andrea Bianchi; Ian Oakley
Tangible interaction allows the control of digital information through physical artifacts -- virtual data is tied to real-world objects. Sensing and display technologies that enable this kind of functionality are typically complex. This represents a barrier to entry for researchers and also restricts where these interaction techniques can be deployed. Addressing these limitations, recent work has explored how the touch screens on mobile devices can be used as sensing and display platforms for tangible interfaces. This paper extends this work by exploring how magnets can be employed to achieve similar ends. To achieve this, it describes the design and construction of eight magnetic appcessories. These are cheap, robust physical interfaces that leverage magnets (and the magnetic sensing built into mobile devices) to support reliable and expressive tangible interactions with digital content.
Don't open that door: designing gestural interactions for interactive narratives BIBAFull-Text 259-266
  Paul Clifton; Jared Caldwell; Isaac Kulka; Riccardo Fassone; Jonathan Cutrell; Kevin Terraciano; Janet Murray; Ali Mazalek
"Don't Open That Door" is a gesture-based interactive narrative project set in the universe of the TV show Supernatural. The project leverages expectations of the horror genre and fan knowledge of the show to elicit expressive interactions and provide satisfying dramatic responses within a seamless scenario, in order to create dramatic agency for the interactor. We use verbal, audiovisual, reactive, and mimetic techniques to script the interactor. From our research, design process, and user observations, we gain insight into designing for dramatic agency and managing user expectations in gesture-based interactive systems.
Animate mobiles: proxemically reactive posture actuation as a means of relational interaction with mobile phones BIBAFull-Text 267-270
  Fabian Hemmert; Matthias Löwe; Anne Wohlauf; Gesche Joost
In this paper, we explore body language in mobile phones as a means of relational interaction. We describe a prototype that allows the simulation of proxemic reactions to the nearing hand of a user, ranging from affection to aversion, based on nearness-based input and shape change-based output. A user study is reported, which indicates that users were able to interpret the prototype's behavior drawing on animal parallels. It is concluded that proxemically reactive actuation may be a viable means of actively integrating the relationship between the user and the device into the interaction.
Exploring physical prototyping techniques for functional devices using .NET gadgeteer BIBAFull-Text 271-274
  Steve Hodges; Stuart Taylor; Nicolas Villar; James Scott; John Helmes
In this paper we present a number of different physical construction techniques for prototyping functional electronic devices. Some of these approaches are already well established whilst others are more novel; our aim is to briefly summarize some of the main categories and to illustrate them with real examples. Whilst a number of different tools exist for building working device prototypes, for consistency the examples we present here are all built using the Microsoft .NET Gadgeteer platform. Although this naturally constrains the scope of this study, it also facilitates a basic comparison of the different techniques. Our ultimate aim is to enable others in the field to learn from our experiences and the techniques we present.
In touch with space: embodying live data for tangible interaction BIBAFull-Text 275-278
  Trevor Hogan; Eva Hornecker
This paper introduces two devices: H3 ('Hydrogen cubed') and the Solar Radiation Dowsing Rod, both crossmodal, data-driven artefacts that represent live data streams using haptic-auditory feedback. The motivation for creating these artefacts was to offer casual users the opportunity to interact with data that would normally only be explored by experts, with the aim of stimulating curiosity, intrigue and awareness. In addition to describing the devices, we discuss the concept behind their design and initial observations from a user study.
C4: a creative-coding API for media, interaction and animation BIBAFull-Text 279-286
  Travis Kirton; Sebastien Boring; Dominikus Baur; Lindsay MacDonald; Sheelagh Carpendale
Although creative-coding programming languages have proliferated widely, the design of many toolkits and application programming interfaces (APIs) for expression and interactivity does not take full advantage of the unique space of mobile multitouch devices. In designing a new API for this space, we first consider five major problem spaces and present an architecture that attempts to address them, moving beyond the low-level manipulation of graphics by giving first-class status to media objects.
   We present the architecture and design of a new API, called C4, that takes advantage of Objective-C, a powerful yet more complicated lower-level language, while remaining simple and easy to use. We have also designed this API in such a way that the software applications that can be produced are efficient and light on system resources, culminating in a prototyping language suited for the rapid development of expressive mobile applications. The API clearly presents designs for a set of objects that are tightly integrated with multitouch capabilities of hardware devices. C4 allows the programmer to work with media as first-class objects; it also provides techniques for easily integrating touch and gestural interaction, as well as rich animations, into expressive interfaces.
   To illustrate C4 we present simple concrete examples of the API, a comparison of alternative implementation options, performance benchmarks, and two interactive artworks developed by independent artists. We also discuss observations of C4 as it was used during workshops and an extended 4-week residency.
SpectroFlexia: interactive stained glass as a flexible peripheral information display BIBAFull-Text 287-290
  Attalan Mailvaganam; Saskia Bakker
In this paper we present SpectroFlexia, a form of interactive stained glass that is designed to present information in the periphery of people's attention. SpectroFlexia is developed in an iterative design process which revealed a low-cost method of smoothly changing the color of light shining through translucent materials. Using this method, SpectroFlexia can display several types of digital information through the speed at which its colors change. In addition to providing peripheral information, SpectroFlexia is designed to serve a decorative function. An informal user exploration with an interactive prototype of SpectroFlexia indicated that smooth color transitions are a promising way of presenting peripheral information.
ToyVision: a toolkit to support the creation of innovative board-games with tangible interaction BIBAFull-Text 291-298
  Javier Marco; Sandra Baldassarri; Eva Cerezo
This paper presents "ToyVision": a software toolkit developed to facilitate the implementation of tangible games on vision-based tabletop devices. Compared to other toolkits for tabletops, which offer very limited, tag-centered tangible possibilities, ToyVision provides designers with tools for modeling and implementing innovative tangible playing pieces at a high level of abstraction from the hardware. For this purpose, a new abstraction layer (the Widget layer) has been added to an existing tabletop framework (ReacTIVision), providing the host application with highly processed data about each playing piece involved in the game. To support the framework application, a Graphic Assistant tool enables the designer to visually model all the playing pieces as tangible tokens that can be tracked and controlled by the framework software. As a practical example, the complete process of prototyping a tangible game is described.
LOLLio: exploring taste as playful modality BIBAFull-Text 299-302
  Martin Murer; Ilhan Aslan; Manfred Tscheligi
In this paper we describe an exploratory design study of the potential of taste as a playful interaction modality. We present the design and implementation of LOLLio -- an interactive lollipop that serves as a haptic input device and dynamically changes its taste. We conclude the paper with three basic principles for potential game designs, showing how the interactive lollipop we have built can foster novel, playful game experiences.
Hybrid interface design for distinct creative practices in real-time 3D filmmaking BIBAFull-Text 303-306
  Michael Nitsche; Friedrich Kirschner
TUIs have become part of a larger digital ecology. Thus, hybrid interface design that combines tangible with other interaction options is ever more important. This paper argues for the importance of such hybrid approaches for creative practices that use divergent collaborative processes. It presents the design, evolution, and implementation of such a hybrid interface for machinima film production. Finally, it provides initial reflection on the use and preliminary evaluation of the current system.
Display blocks: a set of cubic displays for tangible, multi-perspective data exploration BIBAFull-Text 307-314
  Pol Pla; Pattie Maes
This paper details the design and implementation of a new type of display technology. Display Blocks are a response to two major limitations of current displays: dimensional compression and physical-digital disconnect. Each Display Block consists of six organic light emitting diode (OLED) screens, arranged in a cubic form factor. We explore the possibilities that this type of display holds for data visualization, manipulation and exploration. To this end, we accompany our design with a set of initial applications that leverage the form factor of the displays. We hope that this work shows the promise of display technologies which use their form factor as a cue to understanding their content.
LiquiTouch: liquid as a medium for versatile tactile feedback on touch surfaces BIBAFull-Text 315-318
  Hendrik Richter; Felix Manke; Moriel Seror
On interactive surfaces such as touch screens, tabletops or interactive walls, the addition of active tactile feedback has been shown to greatly improve user performance and subjective evaluation of the interaction. However, common electromechanical solutions to enable tactile stimuli on flat touch displays entail the use of costly, complex and cumbersome actuator technology. This especially holds true for solutions which try to address the full complexity of our sense of touch, i.e. our ability to experience warmth, coolness, pressure, stickiness or smoothness. In this paper, we propose the use of liquid as a medium for versatile tactile feedback.
   We present LiquiTouch, a first prototype which emits actively generated water jets in order to communicate state, function and material properties of virtual touchscreen elements. We discuss the design implications and illustrate the potentials of using liquid to enrich and improve the interaction with touch surfaces.
Wo.Defy: wearable interaction design inspired by a Chinese 19th century suffragette movement BIBAFull-Text 319-322
  Thecla Schiphorst; Wynnie (Wing Yi) Chung; Emily Ip
This paper describes the design process of Wo.Defy, an interactive wearable kinetic garment inspired by the Self-Combing Sisters, a group of suffragette North Cantonese Chinese women of the late 19th and early 20th century, who challenged the traditional marital status of women through their choice of hair-styling and dress. The design and construction of the Wo.Defy interactive garment incorporates cultural and material references used by the Self-Combing Sisters. The garment responds to the wearer's physiological breathing patterns through physical kinetic movements in the form of motorized contracting floral doilies. Silk fibers and human hair are integrated into the garment as organic materials referencing personal and social memory. Wo.Defy contributes to the design discourse of Tangible Embodied Interaction by integrating cultural historical research into contemporary wearable design practice.
Radical clashes: what tangible interaction is made of BIBAFull-Text 323-326
  Jelle van Dijk; Camille Moussette; Stoffel Kuenen; Caroline Hummels
Driven by a critique of Ishii et al.'s recent vision of Radical Atoms, we call for a debate on the different conceptual paradigms underlying the TEI community and its activities. TEI was initiated to share and connect different perspectives, but we feel conceptual debate is lacking. To fuel this debate, we begin by comparing two paradigms, examining the Radical Atoms proposal and balancing it against our design-led perspective. Our aim with this paper is to revive the richness of TEI's multidisciplinary approach.
Volumetric linear gradient: methods for and applications of a simple volumetric display BIBAFull-Text 327-330
  Daniel Wessolek
We present Volumetric Linear Gradient, a simple volumetric display. The display consists of a fiber-optic cylinder with a diffused surface and a light emitting diode (LED) on each end. Each LED emits a different color, and the light is mixed within the light guide. The two light colors can be proportionally blended within the medium through pulse-width modulation of the two LEDs. The result is similar to moving the stop points of a gradient in graphic design software. This volumetric display can be used as an information display, giving ambient cues about temporal or quantitative data. The display can take different shapes and can be composed of several of these elements to create a more complex display. Possible applications include volumetric progress bars, energy indicators or, in combination with sensors, simple gravitational fluid metaphors.
HideOut: mobile projector interaction with tangible objects and surfaces BIBAFull-Text 331-338
  Karl D. D. Willis; Takaaki Shiratori; Moshe Mahler
HideOut is a mobile projector-based system that enables new applications and interaction techniques with tangible objects and surfaces. HideOut uses a device mounted camera to detect hidden markers applied with infrared-absorbing ink. The obtrusive appearance of fiducial markers is avoided and the hidden marker surface doubles as a functional projection surface. We present example applications that demonstrate a wide range of interaction scenarios, including media navigation tools, interactive storytelling applications, and mobile games. We explore the design space enabled by the HideOut system and describe the hidden marker prototyping process. HideOut brings tangible objects to life for interaction with the physical world around us.

Graduate student consortium

Eco-buzz: an interactive eco-feedback system based on cultural forms of play BIBAFull-Text 341-342
  Amartya Banerjee
I present the design of Eco-Buzz, an interactive system designed to engage families in informal learning activities in which they seek out hidden sources of energy consumption in their homes. The system combines an electro-magnetic field (EMF) detector with a mobile tablet computer. Bringing the Eco-Buzz device within range of an electrical current activates the detector; the output from the detector is used to provide feedback during the activity.
Spatial illusions on tangibles BIBAFull-Text 343-344
  Brian Eschrich
In this paper I present a system that uses image projection to create illusions of spatial geometry on tangible objects. It marks the starting point of my research: a description of the system and the prototype realized so far. In addition to discussing how to project a spatial illusion onto the surface of a tangible object, I also present a prototype setup, highlighting the problems and solutions of user and object tracking.
Designing tangible/free-form applications for navigation in audio/visual collections (by content-based similarity) BIBAFull-Text 345-346
  Christian Frisson
This paper focuses on one aspect of my doctoral studies, now in their final year: designing applications for navigating audio or video collections by content-based similarity, and choosing between tangible and free-form interfaces depending on the use case. One goal of this work is to determine which type of gestural interface best suits each chosen use case involving navigation in media collections: classifying sounds for electroacoustic music composition, derushing video, and improvising instant music through an installation that organizes and synchronizes audio loops. Prototype applications have been developed using the modular Media-Cycle framework for organizing media content by similarity. We conclude, preliminarily, that tangible interfaces are better suited to focused expert tasks and free-form interfaces to multi-user exploratory tasks, while a combination of both can create emergent practices.
RemoteBunnies: multi-agent phenomena mapping between physical environments BIBAFull-Text 347-348
  Paulo Guerra
This paper presents RemoteBunnies, a proof-of-concept design for multi-agent phenomena mapping between physical environments. RemoteBunnies relies on a sensor network and a tangible representational tool connected via an Extensible Messaging and Presence Protocol (XMPP) server. For demonstration purposes, the proposed system will be integrated to map field data of cottontail rabbits' foraging behavior onto an indoor environment.
Citizen drones: embedded crafts for remote sensing BIBAFull-Text 349-350
  Sibel Deren Guler
Many recent projects in the maker movement have implemented environmental monitoring agents. These projects tend to focus on showcasing technological advances rather than adapting monitoring agents to be compatible with common tools. I have investigated methods of low-tech citizen science and DIY sensing as a means of providing a simple, effective way for emerging economies and rural communities to share and record local environmental data. This gives a community the means to open a dialogue and enact changes relevant to environmental issues. Integration into daily life is a key factor in the success of these projects and can be achieved through an informed choice of materials and method of instruction. The outcomes of my research suggest that sensor-based technology is best embraced when it is embedded in a culturally relevant medium and introduced through free educational workshops.
Sound actuation and interaction BIBAFull-Text 351-352
  Jiffer Harriman
My work is focused on the intersection of science and the creative arts. I am driven to understand how computation can be used to create natural, intuitive interactions with the physical world. It is important for designers to understand the role of technology and to apply only that which contributes to an aesthetic, practical or artistic goal, so as not to draw attention away from what is salient about an interaction.
   This interdisciplinary work requires an understanding of a number of fields. By immersing myself in a community of artists as well as engineers at the University of Colorado at Boulder, I am gaining a new appreciation of how both view the world.
   One area of focus for me has been sound actuation in musical instrument and installation contexts. This paper details some of my recent research projects and current collaborations.
Inspiroscope: understanding participant experience BIBAFull-Text 353-354
  Jiann Hughes
The genre of biosensing interactive art relies on the bodily responses of participants to complete the work. Exploring the experience of participants during such encounters is central to fully appreciating the work of art. Yet few artists and researchers working in this area actively seek to understand participants' experience of these works. This research project has developed Inspiroscope, a biosensing interactive artwork that evokes an awareness of embodiment by focusing perception on the act of breathing. It pursues an understanding of the subjective bodily and aesthetic experiences of participants who encounter the work. This paper uses the prototype of Inspiroscope as a test bed to consider some of these experiences. The results acknowledge that an artwork can provide participants with a playful, self-reflexive, exploratory space, giving them a deeper understanding of their bodies and an enhanced lived experience of being in the world.
Touch at a distance for the remote communication of emotions BIBAFull-Text 355-356
  Gabrielle Le Bihan
Touching is a tangible way to interact and to communicate emotions that becomes unavailable when people are at a distance. In this paper I present Touch Through (TT), a smartphone application which provides an alternative way of experiencing interpersonal touch at a distance. With this system, I am interested in the issue of allowing people to share emotions and have a feeling of presence at a distance.
Do-it-yourself electronic products and the people who make them BIBAFull-Text 357-358
  David A. Mellis
This paper describes my doctoral research into the possibilities that digital fabrication offers for the individual design and production of electronic devices. The research questions concern the possibilities for the devices themselves and their relationship to people's skills and interests. The paper discusses four overall approaches: product design explorations, do-it-yourself activities, custom software tools, and studying communities. Three completed case studies -- a radio, speakers, and a mouse -- provide insight into the design possibilities and their implications for various groups. Future work will explore further aspects of this ecosystem, including more complex devices, more in-depth investigation of people, and community surveys.
Token-based interaction with embedded digital information BIBAFull-Text 359-360
  Simone Mora
Embedding digital information into places and objects can improve collaborative processes by allowing a piece of information to travel across different contexts of use. Yet tools are needed to support the processes of embedding, discovering and visualizing information. This PhD work aims at providing a conceptual framework that promotes the use of (in)tangible tokens to enable information embeddedness. The framework is used to drive the design of pervasive applications that support collaboration and reflection in crisis management.
SwimBrowser and beyond BIBAFull-Text 361-362
  David Stolarsky
In this paper we describe SwimBrowser, a gestural web browser, and discuss its implications for accessibility, personal fitness, performing arts, and gestural interaction.
Designing a long term study evaluating a physical interface for preschoolers BIBAFull-Text 363-364
  Cristina Sylla
This work presents an ongoing study of the design and development of a physical interface for storytelling. The current prototype is the result of several design iterations with four- to five-year-old preschoolers and six preschool teachers. The interaction model was motivated by findings from research on tangible user interfaces as well as embodied cognition. Although research in these areas has revealed potential benefits of physical interfaces, no extended, in-depth study of their prolonged use in the classroom has been carried out to date. This work proposes to carry out such an investigation, observing a group of preschoolers interacting with the interface over a period of six months.
Physical activity and motor coordination in musical freeplay BIBAFull-Text 365-366
  Juan Gabriel Tirado Sandino
Exploring new ways of effectively supporting social interaction among children through digital technology is a growing effort in recent human-computer interaction design. A better understanding of how to motivate children to use these artifacts, and how to sustain that motivation over time, is fundamental for triggering joint actions such as spontaneous collaboration and cooperation (important for education and psychosocial interventions). A playful musical activity was implemented using digital technology, with the goal of exploring the relationship between physical activity and interpersonal coordination in playing dyads. A group of children played under experimental conditions; results from a quantitative analysis and a qualitative technique for an ongoing investigation are described.
Augvox: an augmented voice instrument BIBAFull-Text 367-368
  Brian Tuohy
As a musical instrument, the human voice offers a significant range in terms of sonic results. Control of this particular instrument is aided by a natural familiarity, and it takes little concentration to quickly produce sounds of vastly differing musical characteristics. The advent of computer music has afforded manipulative processing that delivers auditory results that would never be attainable in the natural world. This project attempts to bridge the gap between natural generation of vocal sounds, and complex digital processing, by developing a simple, ergonomic instrument that places control of the sonic output in the hands of the artist.
LibreMote: exploring DIY, personal fabrication and community contribution in ad-hoc designed unconventional controllers BIBAFull-Text 369-370
  Fabio Varesano
This paper presents our ongoing work on LibreMote, an Open Hardware framework composed of an Arduino-based device, a set of software libraries and digital fabrication models. The framework is designed to support researchers, artists and hobbyists in the rapid prototyping of wireless controllers. Our goal is to study whether LibreMote can support innovation in HCI controllers by empowering non-technical users to apply DIY and Personal Fabrication approaches to prototype ad-hoc designed unconventional controllers.
Exploring the power of feedback loops in wearables computers BIBAFull-Text 371-372
  Katia Vega
Touch, sight, smell, hearing and taste -- our senses link us to the outside world. Our reflexes react to the many stimuli arriving simultaneously from our sensory environment. But there are lapses: from a lack of awareness of seemingly obvious stimuli, to temporary losses of attention, to reflexes we are not even aware of. The main motivation of this research is to plug these lapses with the power of feedback loops, in environments where humans and wearable computers are intertwined, and to explore their application as tools for self-modification and sustainable change. This work proposes a combination of body-worn objects and hidden technology to create compelling, aesthetic solutions that not only appeal to our senses, but fuse seamlessly with our everyday lives. To exemplify this exploration, we created Blinklifier, a wearable device that senses the blinking reflex through conductive makeup and metalized eyelashes, and amplifies it.
Switching sensory domains: exploring the possibilities of a flickerfon BIBAFull-Text 373-374
  Daniel Wessolek
This paper presents results from experimentation with, and subsequent possibilities of, an ambient light sensor connected to an audio amplifier, as well as a derived DIY kit: the Flickerfon. From the perspective of Interaction Design and Media Architecture, this paper focuses on three scenarios: a 'friendly neighborhood' or silent-party scenario, an exhibition scenario, and the use of the Flickerfon as an instrument for exploring visual phenomena in which the perception speed of the visual sense is slower than that of the auditory sense. By sending sound through visible light, it becomes possible to mix different sound sources and benefit from the directional properties of light.
Towards wearable aging in place devices BIBAFull-Text 375-376
  Ginger White
This paper summarizes the author's current doctoral work investigating how technologies can mitigate the stress of caregivers and help older adults successfully age in place. There has been much success and growth in the creation of aging-in-place technologies. However, few of these technologies focus on the needs and challenges of low socioeconomic status (SES) rural- and urban-dwelling older adults, and fewer, if any, explore the domain of wearable devices. Findings from in situ observations and interviews conducted with low-SES rural- and urban-dwelling older adults are presented. These findings are then used to provide insight into the potential development of wearable aging-in-place devices.
Ubiquitous grasp interfaces BIBAFull-Text 377-378
  Katrin Wolf
Sensory hand augmentation extends the spectrum of manual function from controlling analogue objects to controlling digital or smart objects; it can also add an interface to any graspable thing, and thereby a digital interface to everyday objects. We propose a finger-attached interface for controlling grasped objects, intended to explore design parameters for always-available interfaces. Our device detects finger motions and classifies them according to a set of five gestures. In user studies we found that our gesture classification performs stably across different organically shaped surfaces. Finally, we discuss the scalability of our approach towards generic object control and point out its potential.

Arts track

e-maestro: the electronic conductor BIBAFull-Text 381-382
  Rui Avelans Coelho
This project is an attempt to recreate the musical experience of standing in front of a symphony orchestra of more than 80 members. Interaction between the audience and the orchestra is achieved through an interactive control panel that enables users to manipulate groups of musicians, truly putting the viewer in the role of the symphony orchestra conductor.
Hypo Chrysos: mapping in interactive action art using bioacoustic sensing BIBAFull-Text 383-385
  Marco Donnarumma
Hypo Chrysos (HC) is a work of action art for vexed body and biophysical media. During this twenty-minute action I pull two concrete blocks in a circle. My motion is oppressively constant; I have to force myself to accept the pain until the action ends. The increasing strain on my corporeal tissues produces continuous bioacoustic signals. These comprise blood flow, muscle sound bursts, and bone crackles, which are amplified, distorted, and played back through eight loudspeakers. The same bioacoustic data stream excites an OpenGL-generated swarm of virtual entities, lights, and organic forms diffused by a video projector. The work brings together different media so as to creatively explore the processes wherein physicality, adaptive biotechnology, and musicianship (or rather, the lack of it) collide. This article elaborates on the methodology underpinning the piece. It describes, both conceptually and technically, how the performer's physical strain and the resulting music are integrated using bioacoustic sensing.
Transience: aesthetics of dynamic colors inside Japanese calligraphy BIBAFull-Text 387-388
  Kohei Tsuji; Akira Wakita
Transience is a Japanese calligraphy work with dynamic color changes. As the letter colors change from moment to moment, the work gives viewers a rich dynamism and a feeling of the vitality of calligraphy, while at the same time expressing the stream of time. Calligraphy is seamlessly integrated with technology and materials, and Transience is produced to show the ever-changing aesthetics fostered in Japan. To change letter colors on paper, we developed an original chromogenic mechanism based on functional inks and conductive materials. To make this chromogenic technology suitable for paper, we examined ink materials repeatedly, and as a result realized an expression in which calligraphy harmonizes with the computer.
Transparent sculpture: an embodied auditory interface for sound sculpture BIBAFull-Text 389-390
  Daichi Misawa
Toward ecologically distributed interactions of sound in the real world, this paper presents an embodied auditory interface for a sound sculpture. It is composed of a structure of sound orientations produced by directional speakers, together with a pedestal that captures a certain real space.
Fleischwolf: an interactive installation BIBAFull-Text 391-392
  Ivan Petkov
The title of the interactive sound installation "Fleischwolf" comes from the German word for meat grinder. Accordingly, the installation consists of a mechanical kitchen meat grinder mounted on a wooden table. Turning the crank of the grinder causes the machine to emit a sound that initially resembles a very deep bass voice. When the crank is turned faster and more vigorously, the character of the sound changes too. At a certain speed, a baby's scream becomes recognizable, but due to the construction of the meat grinder, this sound is hard to maintain. Whether the scream is audible at all therefore depends on how users actuate the installation.
Movement crafter BIBAFull-Text 393-394
  Larissa Pschetz; Richard Banks; Mike Molloy
The movement crafter attempts to reconcile the pace of new technologies with traditional crafting activities that are performed as pastimes. The project explores concepts of quiet communication and technology hybrids and attempts to support crafting without making the craftsperson overly self-conscious of their practice.
Buildasound BIBAFull-Text 395-396
  Mónica Rikic
Buildasound is a game of sound building blocks. It consists of creating shapes while generating new sounds: there is no single objective (winning or losing), but rather the entertainment of playing and the opportunity to discover new melodies and constructions, in constant creation based on the different positions of the blocks.
Cubes BIBAFull-Text 397-398
  Michal Rinott; Shachar Geiger; Eran Gal-Or; Luka Or
Cubes is a collection of 20 objects, each combining a single input with a single output. Cubes celebrates the potential of simple tangible interactions for engagement and pleasure. By fixing the form of these interactive objects into the simplest one, a cube, we can explore affordances and behavior in a "lab-like" environment. A new aesthetic language is created by making the cubes transparent, equally sized, and self-contained.
From wet lab bench to tangible virtual experiment: SynFlo BIBAFull-Text 399-400
  Wendy Xu; Kimberly Chang; Nicole Francisco; Consuelo Valdes; Robert Kincaid; Orit Shaer
SynFlo is an interactive installation that utilizes tangible interaction to make core concepts of synthetic biology accessible to the public. This playful installation allows users to create useful virtual life forms from standardized genetic components through the manipulation of augmented objects that mirror scientific instruments, in order to explore synthetic biology concepts and protocols. These virtual E. coli can serve as environmental biosensors that, when deployed into an environment represented by a tabletop computer, detect toxins and change their color as an indicator. The goal of this project is to explore ways of developing effective interactive activities for STEM outreach, communicating the excitement of cutting-edge research without the limitations of access to a lab bench.


D.I.Y. interactive painting techniques + electronics BIBAFull-Text 403-406
  Paola Guimerans
This studio will allow participants without a technical background to learn basic skills and electronics concepts by combining reactive paints with traditional art techniques. Participants will learn color theory by mixing reactive paints into traditional art techniques, such as watercolor or acrylic painting, as a way of introducing fundamental electronics concepts, and will create a basic animation on their paintings or illustrations. The goal of the studio is to familiarize participants with newly available materials, introduce them to simple circuitry concepts, and teach new practices of artwork creation using electronics.
Tangible embedded Linux BIBAFull-Text 407-410
  Edgar Berdahl; Quim Llimona
During this studio, participants will learn about tangible embedded Linux and how to harness it for building durable, living prototypes. By the end of the studio, each participant will complete a simple tangible prototype using the Satellite CCRMA kit, which is about twice the size of a deck of cards.
   Satellite CCRMA is currently based on the powerful Raspberry Pi embedded Linux board, which executes floating-point instructions natively at 700 MHz. Participants will be led through running Pure Data (Pd) on the board, but are welcome to explore other software available on the Satellite CCRMA memory image. Additional topics include Arduino, Firmata, pico projectors, open-source hardware, and more.
C4: creative coding for iOS BIBAFull-Text 411-413
  Travis Kirton
C4 is a new creative coding framework that focuses on interactivity, visualization, and the relationships between various media. Designed for iOS, C4 makes it extremely easy to create apps for iPad, iPhone, and iPod devices. Initially developed as a platform for quickly creating interactive artistic works, C4 is developing into a more broadly based framework for other areas such as music and data visualization.
   In this workshop, participants will rapidly prototype interactive animated interfaces on iOS devices. Participants will have the opportunity to learn how to easily create dynamic animations using all kinds of media, including audio, video, shapes, OpenGL objects, and more. In addition, participants will learn how to easily add the full suite of iOS gestural interaction to their applications and objects. Furthermore, C4 provides easy access to the camera, as well as to the compass, motion, acceleration, proximity, and light sensors. Along the way, participants will be introduced to the larger C4 community through participation on various social networks and the community-moderated Q&A forum on Stack Overflow, and will also be shown how to access and share code on GitHub.
Prototyping orientation and motion sensing objects with open hardware BIBAFull-Text 415-418
  Fabio Varesano
This studio aims to introduce participants to orientation and motion sensing based on inertial, magnetic, and atmospheric-pressure MEMS sensors. Using two Open Hardware projects we developed, FreeIMU and LibreMote, participants will discover through hands-on experimentation the capabilities and limits of current MEMS accelerometers, gyroscopes, magnetometers, and barometers; the basics and state of the art of sensor fusion algorithms; and the challenges of calibration.
   In the second part of the studio, participants will experiment with the possibilities of such technologies in HCI devices or as parts of TUIs by developing simple motion-sensing objects, which will then be used in simple motion-based visual, 3D, or musical application prototypes.
HoneyComb: a platform for computational robotic materials BIBAFull-Text 419-422
  Nikolaus Correll; Nicholas Farrow; Shang Ma
We present "HoneyComb", a microcontroller platform that can easily be networked into hexagonal lattices of hundreds of nodes to create novel materials that tightly integrate sensing, actuation, computation, and communication. The tool-chain consists of the platforms themselves, a viral boot-loader that disseminates programs into the network, a software library that facilitates sensing, control, and communication, and software tools for interacting with the network from a host computer. After a brief tutorial, participants will have an opportunity to experiment with the HoneyComb hardware, which will be made available during the studio, write code for distributed processing of sensor information, and drive various actuators ranging from multi-color lights to servo motors, with the goal of constructing an interactive installation to be displayed at the conference. All materials, including open-source hardware and software, will be made available on the web prior to the studio.
Designing and making a tangible tabletop game with ToyVision BIBAFull-Text 423-426
  Javier Marco; Ian Oakley; Eva Cerezo; Sandra Baldassarri
Studio participants will design and prototype tangible board-games for NIKVision, a tabletop computer for young children. The goal of this studio is to give designers and developers hands-on experience of developing a functional prototype of a tangible tabletop application without the intrinsic difficulties of managing electronic sensors, actuators, and machine vision algorithms. During the studio day, attendees will complete a simple but conceptually complete tangible board-game prototype, abstracting away the technologies and keeping the focus firmly on application behaviours and the interactions between users, objects, and the system. This is achieved using the ToyVision toolkit, a set of software tools that lowers the threshold for prototyping both the "bits" and "atoms" of interactive tabletop games.
Natural interface exploration BIBAFull-Text 427-430
  Marius Brade; Mandy Keck; Thomas Gründer; Mathias Müller; Rainer Groh
Finding new and compelling approaches to interaction design for natural user interfaces is challenging. The Natural Interface Exploration studio will offer participants the opportunity to explore interaction design for natural user interfaces based on physical substances that are used in everyday life. Studio organizers will present an overview of their methodology, providing examples from their experience [1, 2] and comparing it to other approaches. They will demonstrate how they analyze natural substances with regard to visualization and interaction, and what kinds of interfaces resulted [3, 4, 5] from these findings in initial workshops (see Figures 3 and 4).
   Following the demonstration, participants will form teams and collaboratively decide which substances or materials they would like to analyze. After examining and charting relevant aspects, the teams will choose a particular task to be solved with a new kind of interface; example tasks will be provided by the studio organizers. The next step is to decompose the tasks into the required interactions and information needs. Participants will then develop their own interface mock-ups using stop motion or paper prototyping. Finally, studio organizers will facilitate a group critique session and offer closing thoughts on employing this methodology in one's own creative TEI practice.
TEI 2013 studio: motors and music BIBAFull-Text 431-434
  Bill Verplank; David Gauthier; Jakob Bak
In this studio we present teaching material focused on dynamic force feedback and sound synthesis for learning haptics. Using simple, low-cost tool sets that we have developed specifically for engaging interaction designers in the study of haptics, we aim to advance the quality of haptics research and experimentation in the classroom. By using this studio to present what we have developed for design students, we also aim to advance the platform itself through effective learning.
TH_mod_Te studio: making together for sharing whenever BIBAFull-Text 435-438
  Jim Wood
TH_models is a design-making toolkit for exploring peer-to-peer objects using open hardware and lo-fi materials. A range of electronic objects has been designed to demonstrate the fabrication of prototype devices from available modules and tools, at an affordable price and with an accessible entry point. The designs are published online and made available for self-fabrication, as well as being demonstrated in workshops such as this one.
   This studio will give participants from the TEI community a hands-on chance to construct an object of their own from a design template created especially for the studio, and then to build on it by designing their own variations or customizing the given model. Lastly, and most importantly, they will be able to experience how these electronic objects can work together and form expressions of networks.
Make your own piccolo BIBAFull-Text 439-442
  Greg Saul; Tiago Rorke; Huaishu Peng; Cheng Xu
Piccolo is a miniature Arduino-compatible open source kit for tinkering with basic CNC output. In this studio, we introduce CNC as a concept, guide users to make their own Piccolo, and create artifacts with it. The participants' creations and suggestions will contribute to the development of Piccolo.
Using MTCF for live prototyping on tablet and tangible tabletop devices BIBAFull-Text 443-446
  Daniel Gallardo; Carles F. Julià; Sergi Jordà
Multi-touch technologies have grown in popularity over the last decade. Nowadays we can find a plethora of devices that include this technology: tablets, smartphones, and even desktop computers. In parallel, tangible tabletop surfaces have started to appear on the global market with devices such as the Reactable. What all these devices have in common is a programming cycle that is often time-consuming and makes prototyping hard.
   We propose another approach to prototyping new applications, using the graphical programming language Pure Data and a live-coding framework developed specifically for tangible and multi-touch surfaces and for Android devices.
The fab and the smart city: the use of machines and technology for the city production by its citizens BIBAFull-Text 447-454
  Tomas Diez; Alex Posada
Over the last decades, the relationship between technology and people has been continuously changing: the first computers and CNC machines appeared in the mid-1950s, personal computers in the 70s, and the popularization of the Internet in the 90s; more recently, smartphones (which combine both) have transformed how, and for what, we use these extended capabilities to relate to reality. These new tools are now part of our everyday activities and lives, giving us vast access to the production (and consumption) of knowledge as never before, along with the opportunity to share it from and to anywhere, at any time, by and for anyone. A simple microcontroller in a city control room can affect the lives of thousands of people, a tweeted message can modify our mobility patterns, and a broken stoplight can change the time we arrive at our daily activities. It seems that we are more and more dependent on technology, but we might change that. Providing citizens with tools to reinvent their cities could change this dependency on technology and, furthermore, develop a closer relationship between humans and machines, working together for a common purpose.