
Proceedings of the 13th Conference on Human-Computer Interaction with Mobile Devices and Services

Fullname: Proceedings of the 13th Conference on Human-Computer Interaction with Mobile Devices and Services
Editors: Markus Bylund; Oskar Juhlin; Ylva Fernaeus
Location: Stockholm, Sweden
Dates: 2011-Aug-30 to 2011-Sep-02
Standard No: ISBN 1-4503-0541-5, 978-1-4503-0541-9; hcibib: MOBILEHCI11
Links: Conference Home Page
  1. Vibrotactile and beyond
  2. Understanding mobile phone use
  3. Learning
  4. Context awareness
  5. Indoor navigation
  6. Hedonic life
  7. Outdoor navigation
  8. Text and keyboards
  9. Projection and visualizations
  10. Methods and prototyping
  11. Video interaction
  12. Work and security
  13. Interacting off-screen, on-site and remote
  14. Industrial case studies
  15. Posters
  16. Demo
  17. Design competition
  18. Panels
  19. Workshops

Vibrotactile and beyond

Exploring the effects of cumulative contextual cues on interpreting vibrotactile messages (pp. 1-10)
  Jani Heikkinen; Jussi Rantala; Thomas Olsson; Roope Raisamo; Veikko Surakka
The sense of touch has been shown to convey emotive information and nuances in face-to-face interpersonal communication, but its applications in mobile communication technologies are still limited. One of the challenges for such a new communication medium is the interpretation of tactile messages. This paper presents a study with an early prototype of a mobile tactile device. Twenty novice participants interpreted four messages consisting of a four-channel vibrotactile stimulus, complemented with three cumulative textual cues regarding 1) the communication setting, 2) the sender, and 3) the situation. The subjective interpretations were assessed with four semantic differential scales, and the reasoning behind the interpretations was probed through interviews. The findings show that the intensity and, to some degree, the friendliness of the message could be identified from the tactile-only message. However, contextual cues were also needed to correctly interpret the degree of formality or emotionality in the message.
Tactile effect design and evaluation for virtual buttons on a mobile device touchscreen (pp. 11-20)
  Gunhyuk Park; Seungmoon Choi; Kyunghun Hwang; Sunwook Kim; Jaecheon Sa; Moonchae Joung
In this paper, we present the design of a large number (72) of tactile stimuli for the confirmation feedback of virtual button presses on the mobile device touchscreen. Two industry standard variable-reluctance actuators (enhanced voice-coil actuators with added mass for stronger output) were used in the study. The design parameters of an input were amplitude, duration, carrier signal, envelope function, and actuator. Eighteen participants evaluated the modeled patterns with the criteria of similarity to physical buttons and user preference. An adjective rating task was also accompanied to assess the subjective quality of the designed tactile effects. Experimental results unveiled several important guidelines for designing realistic and favorable button-click tactile feedback. The findings of this paper have implications for improving the usability of user interface components displayed on a touchscreen by means of haptic feedback.
Intimate mobiles: grasping, kissing and whispering as a means of telecommunication in mobile phones (pp. 21-24)
  Fabian Hemmert; Ulrike Gollner; Matthias Löwe; Anne Wohlauf; Gesche Joost
In this paper, we explore how direct physical cues of interpersonal nearness can be achieved in mobile phones. Exemplarily, we present three novel means of communication for mobile phones: grasping, kissing and whispering. Reviewing the related work, we point to a research gap in direct physical near-body actuation in mobile telecommunication. To assess this gap, we present three prototypes that implement the proposed novel means of communication. We present initial user comments on the prototypes, which point to acceptance issues. We conclude in a set of research questions for future explorations in this field.
Virtual sensors: rapid prototyping of ubiquitous interaction with a mobile phone and a Kinect (pp. 25-28)
  Lauren Norrie; Roderick Murray-Smith
The Microsoft Kinect sensor can be combined with a modern mobile phone to rapidly create digitally augmented environments. This can be used either directly as a form of ubiquitous computing environment or indirectly as a framework for rapidly prototyping ubicomp environments that are otherwise implemented using conventional sensors. We describe an Android mobile application that supports rapid prototyping of spatial interaction by using 3D position data from the Kinect to simulate a proximity sensor. This allows a developer, or end user, to easily associate content or services on the device with surfaces or regions of a room. The accuracy of the hotspot marking was tested in an experiment where users selected points marked on a whiteboard using a mobile phone. The distribution of the sample points was analysed, showing that the bulk of the selections fell within about 13 cm of the target and that the distributions were characteristically skewed depending on whether the user approached the target from the left or right. This range is sufficient for prototyping many common ubicomp scenarios based on proximity in a room. To illustrate this approach, we describe the design of a novel mobile application that associates a virtual book library with a region of a room, integrating the additional sensors and actuators of a smartphone with the position sensing of the Kinect. We highlight limitations of this approach and suggest areas for future work.
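As an illustration only (the class and method names below are our own, not from the paper's implementation), a virtual proximity sensor of this kind reduces to a distance test between a Kinect-tracked 3D position and a marked hotspot; the 13 cm radius echoes the selection accuracy reported in the abstract:

```python
import math

class VirtualProximitySensor:
    """Hypothetical sketch: fires when a tracked 3D position enters a
    sphere around a marked hotspot, mimicking a hardware proximity sensor."""

    def __init__(self, hotspot, radius_m=0.13):
        # 0.13 m matches the ~13 cm selection accuracy reported above
        self.hotspot = hotspot          # (x, y, z) in metres
        self.radius_m = radius_m

    def is_near(self, position):
        dx, dy, dz = (p - h for p, h in zip(position, self.hotspot))
        return math.sqrt(dx * dx + dy * dy + dz * dz) <= self.radius_m

sensor = VirtualProximitySensor(hotspot=(1.0, 1.2, 2.5))
print(sensor.is_near((1.05, 1.2, 2.5)))  # True: 5 cm away, inside the hotspot
print(sensor.is_near((2.0, 1.2, 2.5)))   # False: 1 m away
```

A prototyping framework could then bind device content or services to each hotspot and trigger them whenever `is_near` fires for the tracked user.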
Kick: investigating the use of kick gestures for mobile interactions (pp. 29-32)
  Teng Han; Jason Alexander; Abhijit Karnik; Pourang Irani; Sriram Subramanian
In this paper we describe the use of kick gestures for interaction with mobile devices. Kicking is a well-studied leg action that can be harnessed in mobile contexts where the hands are busy or too dirty to interact with the phone. In this paper we examine the design space of kicking as an interaction technique through two user studies. The first study investigated how well users were able to control the direction of their kicks. Users were able to aim their kicks best when the movement range was divided into segments of at least 24°. In the second study we looked at the velocity of a kick. We found that users are able to kick with at least two distinguishable velocities. However, they also often undershoot the target velocity. Finally, we propose some specific applications in which kicks can prove beneficial.
A prototype of gesture-based interface (pp. 33-36)
  Zhiyuan Lu; Xiang Chen; Zhangyan Zhao; Kongqiao Wang
This paper introduces a novel gesture-based human-machine interface prototype, which consists of a wearable belt embedded with four surface electromyography (SEMG) sensors and a tri-axis accelerometer, and an application program running on a NOKIA 5800XM. The sensor belt captures hand gestures by acquiring SEMG and acceleration (ACC) signals from the forearm and sends them out via Bluetooth. The application program receives the data and translates them into control commands for a given interaction application. Experimental results of two test schemes conducted on hand gesture recognition and media player operation demonstrate the validity of the proposed gesture-based interface prototype.

Understanding mobile phone use

Ethnography of the telephone: changing uses of communication technology in village life (pp. 37-46)
  Tricia Wang; Barry Brown
While mobile HCI has encompassed a range of devices and systems, telephone calls on cellphones remain the most prevalent contemporary form of mobile technology use. In this paper we document ethnographic work studying a remote Mexican village's use of cellphones alongside conventional phones, shared phones and the Internet. While few homes in the village we studied have running water, many children have iPods, and the Internet cafe in the closest town is heavily used to access YouTube, Wikipedia, and MSN messenger. Alongside cost, the Internet fits into the communication patterns and daily routines in a way that cellphones do not. We document the variety of communication strategies that balance cost, availability and complexity. Instead of finding that new technologies replace old, we find that different technologies co-exist, with fixed telephones co-existing with instant messaging, cellphones and shared community phones. The paper concludes by discussing how we can study mobile technology and design for settings defined by cost and infrastructure availability.
Falling asleep with Angry Birds, Facebook and Kindle: a large scale study on mobile application usage (pp. 47-56)
  Matthias Böhmer; Brent Hecht; Johannes Schöning; Antonio Krüger; Gernot Bauer
While applications for mobile devices have become extremely important in the last few years, little public information exists on mobile application usage behavior. We describe a large-scale deployment-based research study that logged detailed application usage information from over 4,100 users of Android-powered mobile devices. We present two types of results from analyzing this data: basic descriptive statistics and contextual descriptive statistics. In the case of the former, we find that the average session with an application lasts less than a minute, even though users spend almost an hour a day using their phones. Our contextual findings include those related to time of day and location. For instance, we show that news applications are most popular in the morning and games at night, but communication applications dominate through most of the day. We also find that despite the variety of apps available, communication applications are almost always the first used upon a device's waking from sleep. In addition, we discuss the notion of a virtual application sensor, which we used to collect the data.
Performing a check-in: emerging practices, norms and 'conflicts' in location-sharing using foursquare (pp. 57-66)
  Henriette Cramer; Mattias Rost; Lars Erik Holmquist
Location-sharing services have a long history in research, but have only recently become available for consumers. Most popular commercial location-sharing services differ from previous research efforts in important ways: they use manual 'check-ins' to pair user location with semantically named venues rather than tracking; venues are visible to all users; location is shared with a potentially very large audience; and they employ incentives. By analysis of 20 in-depth interviews with foursquare users and 47 survey responses, we gained insight into emerging social practices surrounding location-sharing. We see a shift from privacy issues and data deluge, to more performative considerations in sharing one's location. We discuss performance aspects enabled by check-ins to public venues, and show emergent, but sometimes conflicting norms (not) to check-in.
Understanding mobile web and mobile search use in today's dynamic mobile landscape (pp. 67-76)
  Karen Church; Nuria Oliver
The meaning of the term mobile Web is changing. Mobile is traditionally associated with on-the-move, portable and dynamic. However, with the advent of smartphones, an increasing number of users are accessing the mobile Internet via their phone while in more stationary and familiar settings, like at home or at work. This shift in the meaning of mobile is having a significant effect on mobile Web behavior. Designing great mobile Web experiences requires a deeper understanding of the information needs, behaviors and underlying motivations of mobile users. As such, the goal of this work is to study this shift and its impact on mobile Internet access, with a view to determining what this means for the future of the mobile Web and in particular mobile search. In this paper we present the results of an online diary and interview study of 18 active mobile Web users over a 4-week period focusing on how, why, where and in what situations people use the mobile Internet and mobile search. Our findings raise a new set of open research questions and point to a number of implications for enriching the experiences of mobile Web users.
Understanding the importance of location, time, and people in mobile local search behavior (pp. 77-80)
  Jaime Teevan; Amy Karlson; Shahriyar Amini; A. J. Bernheim Brush; John Krumm
People often search for local information (e.g., a restaurant, store, gas station, or attraction) from their mobile device. We show, via a survey of 929 mobile searchers at a large software company, that local searches tend to be highly contextual, influenced by geographic features, temporal aspects, and the searcher's social context. While location was reported to be very important, respondents looked for information about places close to their current location only 40% of the time. Instead, they were often in transit (68% of our searchers) and wanted information related to their destination (27% of searchers), en route to their destination (12%), or near their destination (12%). Additionally, 63% of our participants' mobile local searches took place within a social context and were discussed with someone else. We discuss these findings to present a picture of how location, time, and social context impact mobile local searches.

Learning

Exploring display techniques for mobile collaborative learning in developing regions (pp. 81-90)
  Mohit Jain; Jeremy Birnholtz; Edward Cutrell; Ravin Balakrishnan
The developing world faces infrastructural challenges in providing Western-style educational computing technologies, but at the same time has very high cell phone penetration. However, the use of mobile technology has not been extensively explored in the context of collaborative learning. New projection and display technologies for mobile devices raise the important question of whether to use single or multiple displays in these environments. In this paper, we explore two mobile-based techniques for using co-located collaborative game-play to supplement ESL (English as a Second Language) education in a developing region: (1) Mobile Single Display Groupware: a pico-projector connected to a cell phone, with a handheld controller for each child to interact, and (2) Mobile Multiple Display Groupware: a phone for each child. We explore the types of interaction that occur in both of these conditions and the impact on learning outcomes.
EcoChallenge: a race for efficiency (pp. 91-94)
  Ronald Ecker; Philipp Holzer; Verena Broy; Andreas Butz
Careful use of the limited remaining fossil energy resources is important for both ecological and economic reasons. In addition to technical improvements, the fuel consumption of a vehicle is influenced significantly by driving behavior. Currently, only a few in-car user interfaces try to promote more fuel-efficient driving behavior. We propose EcoChallenge, a community- and location-based in-car persuasive game with the goal of motivating and supporting a behavioral change towards a fuel-saving driving style. We implemented and integrated EcoChallenge in an experimental vehicle and evaluated it in a field study. The results regarding acceleration, deceleration, braking and coasting show the effectiveness of our approach. In addition, users confirmed a very positive experience with our system.
"Showing off" your mobile device: adult literacy learning in the classroom and beyond BIBAFull-Text 95-104
  Cosmin Munteanu; Heather Molyneaux; Daniel McDonald; Joanna Lumsden; Rock Leung; Hélène Fournier; Julie Maitland
For a very large number of adults, tasks such as reading, understanding, and using everyday items are a challenge. Although many community-based organizations offer resources and support for adults with limited literacy skills, current programs have difficulty reaching and retaining those who would benefit most. In this paper we present the findings of an exploratory study aimed at investigating how a technological solution that addresses these challenges is received and adopted by adult learners. To this end, we have developed a mobile application to support literacy programs and to assist low-literacy adults in today's information-centric society. ALEX© (Adult Literacy support application for Experiential learning) is a mobile language assistant that is designed to be used both in the classroom and in daily life in order to help low-literacy adults become increasingly literate and independent. Through a long-term study with adult learners we show that such a solution complements literacy programs by increasing users' motivation and interest in learning, and raising their confidence levels both in their education pursuits and in facing the challenges of their daily lives.
Locast H2Flow: contextual learning through mobile video and guided documentary production (pp. 105-108)
  Liselott Brunnberg; Pelin Arslan; Amar Boghani; Federico Casalegno; Steve Pomeroy; Zoe Schladow
In this paper we present the design considerations of Locast H2Flow, an educational tool created to guide learning in a local urban space. In particular, we explore the prospect of utilizing mobile devices to scaffold the learning process. Challenged by missions and guided by templates on a mobile phone, the students construct geo-referenced video reportages and documentaries about sustainable water issues within their community. A deployment with a class of high school students in Italy provides initial user feedback on the learning experience and the overall scaffolding of the learning process.
MobiDev: a tool for creating apps on mobile phones (pp. 109-112)
  Julian Seifert; Bastian Pfleging; Elba del Carmen Valderrama Bahamóndez; Martin Hermes; Enrico Rukzio; Albrecht Schmidt
Currently, the development of mobile applications relies heavily on conventional computers as the development platform. MobiDev enables people in emerging countries who have access to a cell phone but not to a computer to develop their own locally relevant applications. The goal of the MobiDev project is to simplify the development and deployment of applications directly on mobile phones. As a first step, we focus on the design of applications and try to support the computer science curriculum in developing countries to bootstrap the mobile developer culture and community. MobiDev allows the creation of graphical user interfaces (GUI) using various concepts. We present the results of a first system evaluation that show how people perceive MobiDev's concepts for UI creation.
ForceTap: extending the input vocabulary of mobile touch screens by adding tap gestures (pp. 113-122)
  Seongkook Heo; Geehyuk Lee
We introduce an interaction technique that increases the touch screen input vocabulary by distinguishing a strong tap from a gentle tap without the use of additional hardware. We have designed and validated an algorithm that detects different types of screen touches by combining data from the built-in accelerometer with position data from the touch screen. The proposed technique allows a touch screen input to contain not only the position of a finger contact, but also its type, i.e., whether the contact is a 'Tap' or a 'ForceTap.' To verify the feasibility of the proposed technique we have implemented our detection algorithm in experiments that test cases of single-handed, two-handed, immersive, and on-the-move usage. Based on the experimental results, we investigate the advantages of using two types of touch inputs and discuss emerging issues. Finally, we suggest a design guideline for applying the proposed technique to touch screen applications, and present possible application scenarios.
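A minimal sketch of the kind of detector the abstract describes, assuming a fixed threshold on the accelerometer peak around touch-down; the threshold value and function name are illustrative assumptions, not the authors' empirically derived detector:

```python
def classify_tap(accel_window, threshold=2.5):
    """Hypothetical sketch: classify a touch as 'Tap' vs 'ForceTap' by the
    accelerometer magnitude spike around the moment of contact.

    accel_window: accelerometer magnitudes (m/s^2, gravity removed) sampled
    in a short window around the touch-down event reported by the screen.
    """
    peak = max(abs(a) for a in accel_window)
    return "ForceTap" if peak >= threshold else "Tap"

print(classify_tap([0.1, 0.3, 0.6, 0.4]))   # gentle contact -> 'Tap'
print(classify_tap([0.2, 1.1, 4.8, 2.0]))   # sharp spike -> 'ForceTap'
```

The touch screen still supplies the contact position; the classifier only enriches each event with a type, which is what lets one screen position carry two distinct commands.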
Evaluation of mapping functions for one-handed flick operations on a mobile device (pp. 123-131)
  Jeehea Lee; Donghun Lee; Min K. Chung
With the increasing use of mobile devices with full-touch screens, screen-movement methods, especially those for scrolling, have become critical. Flick is one of the most preferred screen movement methods for scrolling. To improve the usability of flick, this study evaluated various mapping functions for one-handed flick. Mapping functions are models that show the relationship between the velocity of input and the movement of the screen. They were established by combining four types of functions, linear, quadratic, log, and logistic, and three initial screen movement velocities at the maximum control velocity of the flick, 222, 444, and 666 mm/s. After being used to find targets on a contact list, the mapping functions were evaluated by measuring task completion time and ease of use. We recommend the linear function with 444 mm/s and the quadratic function with 444 mm/s or 666 mm/s as the mapping function for flicking.
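The four mapping-function families compared in the study can be sketched as follows; the exact parameterisations are our illustrative assumptions, with only the family names (linear, quadratic, log, logistic) and the 222/444/666 mm/s initial velocities taken from the abstract:

```python
import math

# v is the flick (control) velocity, v_max the maximum control velocity,
# and v0 the screen velocity produced at v_max (222, 444, or 666 mm/s).

def linear(v, v_max, v0):
    return v0 * (v / v_max)

def quadratic(v, v_max, v0):
    return v0 * (v / v_max) ** 2

def log_map(v, v_max, v0):
    return v0 * math.log1p(v) / math.log1p(v_max)

def logistic(v, v_max, v0, k=10.0):
    # S-curve centred at half the maximum control velocity (k is illustrative)
    return v0 / (1.0 + math.exp(-k * (v / v_max - 0.5)))

# The study recommends, e.g., the linear mapping with v0 = 444 mm/s:
print(round(linear(111, 222, 444)))  # half-speed flick -> 222 mm/s
```

Each function maps the same control range onto the same output range; they differ only in how scroll speed grows with flick speed, which is what the task-completion-time and ease-of-use measures compared.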
100,000,000 taps: analysis and improvement of touch performance in the large (pp. 133-142)
  Niels Henze; Enrico Rukzio; Susanne Boll
Touchscreens have become the dominant input device for smartphones. Users' touch behaviour has been widely studied in lab studies with a relatively low number of participants. In contrast, we published a game in the Android Market that records touch behaviour during a controlled task in order to collect large amounts of touch events. The players' task is simply to touch circles appearing on the screen. Data from 91,731 installations has been collected, and players produced 120,626,225 touch events. We determined the error rates for different target sizes and screen locations. The amount of data enabled us to show that touch positions are systematically skewed. A compensation function that shifts the users' touches to reduce the amount of errors is derived from the data and evaluated by publishing an update of the game. The independent-measures experiment with data from 12,201 installations and 15,326,444 touch events shows that the function reduces the error rate by 7.79%. We argue that such a compensation function could improve the touch performance of virtually every smartphone user.
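A hedged sketch of a compensation function in this spirit: the paper derives its function from over a hundred million logged taps, whereas this toy version fits a simple per-axis linear offset model to a handful of (touch, target) pairs; all names and the model form are our illustrative assumptions:

```python
def fit_offset_model(touches, targets):
    """Least-squares fit of offset = a * position + b, separately per axis,
    from logged (touch, target) screen-coordinate pairs."""
    n = len(touches)
    models = []
    for axis in (0, 1):
        xs = [t[axis] for t in touches]
        ys = [g[axis] - t[axis] for t, g in zip(touches, targets)]
        mx, my = sum(xs) / n, sum(ys) / n
        var = sum((x - mx) ** 2 for x in xs)
        a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / var if var else 0.0
        models.append((a, my - a * mx))
    return models

def compensate(touch, models):
    # Shift the reported touch by the predicted systematic offset
    return tuple(p + a * p + b for p, (a, b) in zip(touch, models))

# Toy data: users systematically touch 5 px below the target
touches = [(100, 205), (200, 305), (300, 405)]
targets = [(100, 200), (200, 300), (300, 400)]
models = fit_offset_model(touches, targets)
print(compensate((150, 255), models))  # -> (150.0, 250.0): bias removed
```

In deployment, such a function would sit between the touch driver and the application, shifting every reported touch before hit-testing, which is how an update of the game could evaluate it in the wild.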
The effects of walking speed on target acquisition on a touchscreen interface (pp. 143-146)
  Joanna Bergstrom-Lehtovirta; Antti Oulasvirta; Stephen Brewster
Studies have reported negative effects of walking on mobile human-computer interaction when compared to standing or sitting. However, the quantitative relationship between walking speed and user performance is unknown. In the study described here, we varied walking speed on a treadmill and measured effects on discrete aiming movements on a touchscreen interface. Their relationship was found to be non-linear with a local optimum: when walking at 40-80% of one's preferred walking speed (PWS), target acquisition performance plateaus, indicating an optimal trade-off between speed and interaction. Accelerometer data showed that, despite increasing hand oscillation, users were able to maintain stable interaction performance at 74% of PWS. Interestingly, this speed coincides with the speed at which users spontaneously walk when interacting with a mobile device.
The effects of walking, feedback and control method on pressure-based interaction (pp. 147-156)
  Graham Wilson; Stephen A. Brewster; Martin Halvey; Andrew Crossan; Craig Stewart
This paper presents a study looking into the effects of walking and the use of visual and audio feedback on the application of pressure for linear targeting. Positional and Rate-based control methods are compared in order to determine which allows for more stable and accurate selections, both while sitting and mobile. Results suggest that Rate-based control is superior for both mobile (walking) and static (sitting) linear targeting, and that mobility significantly increases errors, selection time and subjective workload. The use of only audio feedback significantly increased errors and task time for Positional control and static Rate-based control, but not mobile Rate-based control. Despite this, the results still suggest that audio control of pressure interaction while walking is highly accurate and usable.

Context awareness

Design of an intelligible mobile context-aware application (pp. 157-166)
  Brian Y. Lim; Anind K. Dey
Context-aware applications are increasingly complex and autonomous, and research has indicated that explanations can help users better understand and ultimately trust their autonomous behavior. However, it is still unclear how to effectively present and provide these explanations. This work builds on previous work to make context-aware applications intelligible by supporting a suite of explanations using eight question types (e.g., Why, Why Not, What If). We present a formative study on design and usability issues for making an intelligible real-world, mobile context-aware application, focusing on the use of intelligibility for the mobile contexts of availability, place, motion, and sound activity. We discuss design strategies that we considered, findings of explanation use, and design recommendations to make intelligibility more usable.
Barriers and bridges in the adoption of today's mobile phone contextual services (pp. 167-176)
  Mauro Cherubini; Rodrigo de Oliveira; Anna Hiltunen; Nuria Oliver
This paper presents ethnographic observations, a diary study and a large-scale quantitative questionnaire (n=395) designed to study the reasons for adoption and refusal of context-aware mobile applications. Through a qualitative study we identify 24 user needs that these applications fulfill and 9 barriers to adoption. We found that for many of the identified needs the end-goal is not that of receiving information, thus complementing work on mobile information needs. This work also offers an actionable list of obstacles that prevent contextual services from reaching a larger audience. Finally, our findings suggest the opportunity to develop novel mobile applications that fulfill needs in the activity and personal contextual dimensions, and to develop an application store for feature phones.
Field study of a waiting-time filler delivery system (pp. 177-180)
  Sumaru Niida; Satoshi Uemura; Hajime Nakamura; Etsuko Harada
As the variety of mobile services increases, the uses of mobile communications become more diverse and the control of service quality from a user experience perspective becomes increasingly important in mobile service design. The quality of the network is one of the critical factors determining mobile service quality. However, quality has mainly been evaluated in objective physical terms, such as delay reduction and increased bandwidth. It is less common to use a human-centered design viewpoint to improve network performance. In this paper, we discuss a waiting-time filler delivery system that actively addresses the human factor to improve the subjective quality of a mobile network. The main contribution of this paper is to quantitatively show the effect of time fillers on waiting. Field experiments show that time fillers can significantly decrease user dissatisfaction with waiting, but that this effect is strongly influenced by user preferences concerning content.
Investigating episodes of mobile phone activity as indicators of opportune moments to deliver notifications (pp. 181-190)
  Joel E. Fischer; Chris Greenhalgh; Steve Benford
We investigate whether opportune moments to deliver notifications surface at the endings of episodes of mobile interaction (making voice calls or receiving SMS), based on the assumption that these endings coincide with naturally occurring breakpoints in the user's primary task. Testing this with a naturalistic experiment, we find that interruptions (notifications) are attended to and dealt with significantly more quickly after a user has finished an episode of mobile interaction compared to a random baseline condition, supporting the potential utility of this notification strategy. We also find that the workload and situational appropriateness of the secondary interruption task significantly affect subsequent delay and completion rate of the tasks. In situ self-reports and interviews reveal complexities in the subjective experience of the interruption, which suggest that a more nuanced classification of the particular call or SMS and its relationship to the primary task(s) would be desirable.
The influence of the spatial separation of control elements on the workload for mobile information systems (pp. 191-200)
  Jens Ziegler; Markus Graube; Alexander Suhrbier; Niels Wessel; Hagen Malberg; Leon Urbas
Mobile information systems (MIS) are finding their way into everyday private and business activities. There are also increased attempts to establish MIS for on-site activities in industrial facilities. Industrial environments, however, place significantly higher demands on mobile user interfaces than office or home environments. Common interaction styles are often unsuitable for this domain. MIS comprising specialized configurations like wearable systems might overcome current limitations. Wearable systems make it possible to arrange system components in the immediate environment of the user's body in order to create an ergonomic and intuitive user interface. However, the use of distributed, body-worn user interfaces, and in particular the separation of input and output devices, might increase the workload for the user. This study examines the extent to which the separation of input and output devices affects the workload for wearable MIS. Three interaction styles in two different configurations are investigated with four different measures to determine the workload, covering both objective and subjective indicators. This investigation shows that there is no significant increase in workload in general. However, the measurement of heart rate variability revealed subtle but significant differences between the two configurations, particularly for one interaction style. These findings indicate that physiological measures can provide more detailed and subtle information about additional workload and its source than other measures.

Indoor navigation

Exploring user preferences for indoor navigation support through a combination of mobile and fixed displays (pp. 201-210)
  Faisal Taher; Keith Cheverst
In this paper we explore, through a formative study, user preferences for indoor navigation support using a combination of mobile and fixed displays along with a range of navigation content such as digital 2D maps, 3D route visualizations (presented as continuous media from a first person perspective) and graphical directional arrows. It is well-established that visitors within complex building architectures (e.g. hospitals) often face challenges in finding their way and are limited to using traditional static signage or asking others for directions. Recent developments in mobile and pervasive technology however, are enabling a range of possibilities and augmenting the way in which users receive digital navigation support. Here, we discuss a formative study involving 16 participants using the prototype Hermes2 Navigation System in order to inform the development of a useful and usable interactive indoor navigation system.
Handheld augmented reality indoor navigation with activity-based instructions (pp. 211-220)
  Alessandro Mulloni; Hartmut Seichter; Dieter Schmalstieg
We present a novel design of an augmented reality interface to support indoor navigation. We combine activity-based instructions with sparse 3D localisation at selected info points in the building. Based on localisation accuracy and the users' activities, such as walking or standing still, the interface adapts the visualisation by changing the density and quality of information shown. We refine and validate our design through user involvement in pilot studies. We finally present the results of a comparative study conducted to validate the effectiveness of our design and to explore how the presence of info points affects users' performance on indoor navigation tasks. The results of this study validate our design and show an improvement in task performance when info points are present, which act as confirmation points and provide an overview of the task.
Pedestrian navigation with degraded GPS signal: investigating the effects of visualizing position uncertainty BIBAFull-Text 221-230
  Stefano Burigat; Luca Chittaro
GPS-based pedestrian navigation can be difficult when GPS position readings are inaccurate or unavailable. In this paper, we report on a user study we carried out to investigate whether different visualizations of the uncertainty associated with the user's position can help users navigate outdoors when the GPS signal is degraded. In the study, we compared a basic visualization that displays only the last accurate position of the user during GPS signal degradation, and two visualizations that dynamically estimate the area where the user might be, displaying it respectively as a circle and as colored street segments. While we did not find any difference among the three visualizations in terms of the accuracy with which users assessed their position, we found that the "streets coloring" visualization required a significantly lower workload compared to the basic visualization and was perceived to be more beneficial by users.

Hedonic life

The secret life of my dog: design and evaluation of paw tracker concept BIBAFull-Text 231-240
  Susanna Paasovaara; Mikko Paldanius; Petri Saarinen; Jonna Häkkilä; Kaisa Väänänen-Vainio-Mattila
What do dogs do while their owners are away? And what do the owners want to know about it? Based on the findings from our earlier research, we created a concept called Paw Tracker for pet dog owners. Paw Tracker utilizes mobile and internet technologies and combines sensor-based dog-created content with a social media approach. The concept was evaluated by potential users in focus groups and in a Wizard-of-Oz based field study. The results of the study revealed the dog owners' detailed interest in tracking their dogs' activities and following their condition in real time, both in special situations and in everyday life. Dog owners found the possibility of discovering reasons for barking and misbehavior especially interesting. Sensor-based and video-feed-based information were both considered valuable, and we argue that an ideal design should combine the two. This study and the designed concept strengthen the understanding of the emerging domain of technology-mediated human-dog interaction.
Unpacking social interaction that make us adore: on the aesthetics of mobile phones as fashion items BIBAFull-Text 241-250
  Oskar Juhlin; Yanqing Zhang
We report on a study of fashionable people's expressions of opinions on mobile phones in online fashion media, such as blogs and magazines. First, the study contributes to our understanding of the role of pragmatic philosophy, which now dominates HCI both as a guide for design and as a guide when looking at social practices, in outlining the role of aesthetics in experience design. Fashion practices diverge from this theory, since here aesthetic appearances can be visual, ambiguous and incomplete while still providing rich meaning for people. We argue that our findings should influence the discussion in HCI toward a less theoretically oriented aesthetic approach, where empirical studies instead come to the forefront. Second, the study provides valuable insight into how we should design mobile experiences to attract more attention from people interested in fashion. Mobile phones, and their services, can for example be designed to relate to the visual appearance of the dressed outfit, or ensemble, of a person.
The hybrid shopping list: bridging the gap between physical and digital shopping lists BIBAFull-Text 251-254
  Felix Heinrichs; Daniel Schreiber; Johannes Schöning
Shopping is one of the most frequently occurring tasks in our daily lives, and the creation and management of shopping lists is an important aspect of this task. Given the recent adoption of mobile devices, the process of writing lists is no longer limited to pen and paper, as a good number of digital tools and applications are available. The goal of this paper is to study and understand the transition between paper-based and digital shopping lists. We analyze how people interact with paper-based shopping lists and derive design implications for our own hybrid shopping support application, which combines paper-based lists with a mobile application. We contribute the study and the design and implementation of a hybrid (pen-and-paper-based UI and mobile GUI) application for the creation of shopping lists.

Outdoor navigation

MirrorMap: augmenting 2d mobile maps with virtual mirrors BIBAFull-Text 255-264
  Carmen E. Au; Victor Ng; James J. Clark
In this paper, we describe our MirrorMap system, a 2D mobile map system that is augmented with live videos. In most major modern cities, traffic cameras and publicly accessible webcams are in abundance. These cameras provide live coverage of a given city and are often positioned such that they have superior views of the scene. As such, it would be beneficial if a person who is wayfinding could have access to these video feeds, to acquire greater information about the area as they are route planning. Moreover, it would be beneficial if the system could provide the video feeds in a natural and familiar way that maintains the spatial relationships between the position of the corresponding camera and the person. We adopt a method called Virtual Mirroring. For each camera source, we place a virtual mirror in its stead. The mirror reflects the feed from the camera, and the result is the appearance of mirrors located in the positions of the cameras. Akin to the well-placed mirrors in convenience stores that provide shopkeepers with views of aisles they would not normally be able to see from the cash register, the user can point her (or his) mobile device in the direction of the source camera and see the virtual mirrors displaying the video feed. By so doing, the user can see additional views of the environment she would not normally be able to see from where she is standing. This additional information can inform her route planning.
Augmented reality vs. street views: a driving simulator study comparing two emerging navigation aids BIBAFull-Text 265-274
  Zeljko Medenica; Andrew L. Kun; Tim Paek; Oskar Palinko
Prior research has shown that when drivers look away from the road to view a personal navigation device (PND), driving performance is affected. To keep visual attention on the road, an augmented reality (AR) PND using a heads-up display could overlay a navigation route. In this paper, we compare the AR PND, a technology that does not currently exist but can be simulated, with two PND technologies that are popular today: an egocentric street view PND and the standard map-based PND. Using a high-fidelity driving simulator, we examine the effect of all three PNDs on driving performance in a city traffic environment where constant, alert attention is required. Based on both objective and subjective measures, experimental results show that the AR PND exhibits the least negative impact on driving. We discuss the implications of these findings on PND design as well as methods for potential improvement.
ZebraLocalizer: identification and localization of pedestrian crossings BIBAFull-Text 275-284
  Dragan Ahmetovic; Cristian Bernareggi; Sergio Mascetti
Independent mobility in unfamiliar environments is a significant challenge for people with severe vision impairment. Among other problems, one specific issue concerns the identification of those road signs that can be recognized by sight only. In this paper we present ZebraLocalizer, an application for mobile devices that identifies zebra crossings and guides the user towards them. Two main problems are discussed in this contribution: the identification and localization of the crosswalks, performed by processing data acquired both from the camera and the accelerometers, and the design of an interaction paradigm specifically addressed to blind users. Experimental results, conducted both on a dataset of images and with blind users, validate the applicability of the proposed solution.
Navigating the world and learning to like it: mobility training through a pervasive game BIBAFull-Text 285-294
  Charlotte Magnusson; Annika Waern; Kirsten Rassmus Gröhn; Åse Bjernryd; Helen Bernhardsson; Ann Jakobsson; Johan Salo; Magnus Wallon; Per-Olof Hedvall
This paper introduces the idea that location-based pervasive games can be used to make mobility training for visually impaired children more fun. The user-centred development process, carried out in collaboration with both visually impaired children and rehabilitation staff, is described, and we present a novel game concept which combines locative play, sound traces and a physical catch movement. We report and discuss results of user tests and summarize our experience in a set of tentative development and design guidelines for this type of game.

Text and keyboards

A versatile dataset for text entry evaluations based on genuine mobile emails BIBAFull-Text 295-298
  Keith Vertanen; Per Ola Kristensson
Mobile text entry methods are typically evaluated by having study participants copy phrases. However, currently there is no available phrase set that has been composed by mobile users. Instead, researchers have resorted to using invented phrases that probably suffer from low external validity. Further, there is no available phrase set whose phrases have been verified to be memorable. In this paper we present a collection of mobile email sentences written by actual users on actual mobile devices. We obtained our sentences from emails written by Enron employees on their BlackBerry mobile devices. We provide empirical data on how easy the sentences were to remember and how quickly and accurately users could type these sentences on a full-sized keyboard. Using this empirical data, we construct a series of phrase sets we suggest for use in text entry evaluations.
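Phrase-set evaluations like the one above conventionally report entry rate in words per minute and a character-level error rate based on minimum string distance. As a minimal illustrative sketch of these standard metrics (not code from the paper; the example strings are invented):

```python
def wpm(transcribed: str, seconds: float) -> float:
    # Standard text-entry convention: one "word" = 5 characters, spaces included.
    return (len(transcribed) / 5.0) / (seconds / 60.0)

def levenshtein(a: str, b: str) -> int:
    # Dynamic-programming edit distance (insertions, deletions, substitutions).
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def error_rate(presented: str, transcribed: str) -> float:
    # Minimum-string-distance error rate: edits divided by the longer string.
    return levenshtein(presented, transcribed) / max(len(presented), len(transcribed))

print(wpm("the quick brown fox", 10.0))          # 19 chars in 10 s -> 22.8 WPM
print(error_rate("hello world", "helo world"))   # 1 edit / 11 chars
```

Memorability-sensitive phrase sets matter precisely because forgetting mid-phrase inflates the error term of such metrics.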
Script-agnostic reflow of text in document images BIBAFull-Text 299-302
  Saurabh Panjwani; Abhinav Uppal; Edward Cutrell
Reading text from document images can be difficult on mobile devices due to the limited screen width available on them. While there exist solutions for reflowing Latin-script texts on such devices, these solutions do not work well for images of other scripts or combinations of scripts, since they rely on script-specific characteristics or OCR. We present a technique that reflows text in document images in a manner that is agnostic to the script used to compose them. Our technique achieved over 95% segmentation accuracy for a corpus of 139 images containing text in 4 genetically distant languages: English, Hindi, Kannada and Arabic. A preliminary user study with a prototype implementation of the technique provided evidence of some of its usability benefits.
Understanding information preview in mobile email processing BIBAFull-Text 303-312
  Kimberly A. Weaver; Huahai Yang; Shumin Zhai; Jeff Pierce
Browsing a collection of information on a mobile device is a common task, yet it can be difficult due to the small size of mobile displays. A common trade-off offered by many current mobile interfaces is to allow users to switch between an overview and detailed views of particular items. An open question is how much preview of each item to include in the overview. Using a mobile email processing task, we attempted to answer that question. We investigated participants' email processing behaviors under differing preview conditions in a semi-controlled, naturalistic study. We collected log data of participants' actual behaviors as well as their subjective impressions of different conditions. Our results suggest that a moderate level of two to three lines of preview should be the default. The overall benefit of a moderate amount of preview was supported by both positive subjective ratings and fewer transitions between the overview and individual items.
Solving the great Indian text input puzzle: touch screen-based mobile text input design BIBAFull-Text 313-322
  Younghee Jung; Dhaval Joshi; Vijay Narayanan-Saroja; Deepak Prabhu Desai
This paper shares the background, design goals, prototype, and results of the very first acceptance study of a new Indic text input system based on a touch screen mobile phone, primarily targeted at the majority of the population who are not literate in English or have never used Indic text input on mobile phones. The proposed design resolves the complexity associated with consonant modification, applicable to major Indic scripts, by providing a visual array of modified consonants upon the user's selection of a consonant. 60 participants went through the prepared performance test on the proposed input system, and all showed improved performance after typing 16 sentences. The familiarity of the consonant layout and the ease of figuring out how the input worked played a key role in gaining user acceptance. Future work is required to refine the design, further investigate acceptance among those with lower levels of literacy, and compare the design with other Indic input methods.
Design and evaluation of Devanagari virtual keyboards for touch screen mobile phones BIBAFull-Text 323-332
  Anirudha Joshi; Girish Dalvi; Manjiri Joshi; Prasad Rashinkar; Aniket Sarangdhar
Lack of an easy and efficient text input mechanism in Indic scripts has been a barrier to large-scale adoption of ICTs in India. We present findings from a usability evaluation of three keyboard designs for Indic scripts for touch screen phones. The design of one of the keyboards is based on the frequency of characters, while the designs of the other two are based on the logical structure of the script. We evaluated the keyboards with participants with low levels of education through a first-time usability test and a longitudinal usability test. One of the logically structured keyboards started out with a significantly higher success rate, faster typing speed, and fewer errors than the other two. The longitudinal test involving text input of 500 words did not conclusively prove that either design was better. Our study establishes benchmarks for text input speeds, errors, and ratings for the initial learning phase of text input in Marathi among less educated users.

Projection and visualizations

PoCoMo: projected collaboration using mobile devices BIBAFull-Text 333-336
  Roy Shilkrot; Seth Hunter; Patricia Maes
As personal projection devices become more common they will be able to support a range of exciting and unexplored social applications. We present a novel system and method that enables playful social interactions between multiple projected characters. The prototype consists of two mobile projector-camera systems, with lightly modified existing hardware, and computer vision algorithms to support a selection of applications and example scenarios. Our system allows participants to discover the characteristics and behaviors of other characters projected in the environment. The characters are guided by hand movements, and can respond to objects and other characters, to simulate a mixed reality of life-like entities.
Prochinima: using pico projector to tell situated stories BIBAFull-Text 337-346
  Panu Åkerman; Arto Puikkonen
In this paper we explore the use of Prochinima, a mobile and ubiquitous storytelling tool concept utilizing small projectors and ready-made animations. We tested Prochinima in a field trial, where 6-10 year old children used the system to create stories, as well as capturing the stories on video. In addition to analyzing Prochinima's ability to support children's storytelling and collaboration, we explore the effect of mobility and context on storytelling. Moreover, we discuss the importance of creativity and of supporting users' own content creation. We also argue that fun and playfulness are connected to the freedom of creativity.
An exploratory study on the use of camera phones and pico projectors in rural India BIBAFull-Text 347-356
  Akhil Mathur; Divya Ramachandran; Edward Cutrell; Ravin Balakrishnan
We explore the potential of using camera phones and pico projectors in rapid creation and presentation of digital content in a development context. A camera phone based content authoring application was designed and deployed with three different user populations in the domains of classroom education and health care. Our findings show that despite the variations in education levels, cultural background, and technology exposure, users successfully created and presented different forms of digital content using the camera phone and pico projector.
Evaluating depth illusion as method of adding emphasis in autostereoscopic mobile displays BIBAFull-Text 357-360
  Jussi Huhtala; Minna Karukka; Marja Salmimaa; Jonna Häkkilä
In this paper we evaluate the possibilities of autostereoscopic three-dimensional displays in aiding the user in a selection task on a mobile touch-screen user interface. We describe a user study which measures the effectiveness of realistic depth illusion as an emphasizing method in a conventional mobile user interface concept. In our experiment, 35 people completed simple thumbnail visual search tasks, where horizontal disparity, color shading, and their combination were used as the emphasizing method. The results indicate that using disparity alone as a visual indicator does not provide enough support in find-and-select tasks to improve user performance or perceived comfort with the task. However, when combined with other visual cues, disparity can significantly improve performance and satisfaction.
Content splitting & space sharing: collaboratively reading & sharing children's stories on mobile devices BIBAFull-Text 361-370
  Jerry Alan Fails; Allison Druin; Mona Leigh Guha
This paper addresses how children can collaborate by leveraging the ubiquity of mobile devices. Specifically we investigate how children (ages 8-9) read and share children's stories using two collaborative configurations: content splitting and space sharing. Content splitting is where interface pieces (e.g. words, pictures) are split between two or more devices. Space sharing is where the same content (e.g. a document) is spread or shared across devices. The results point to an overall preference for the content splitting configuration. Supporting collaborative configurations on mobile devices can help overcome one of the most significant usability issues these devices face -- their limited screen space.

Methods and prototyping

Naturalistic enactment to stimulate user experience for the evaluation of a mobile elderly care application BIBAFull-Text 371-380
  Luis A. Castro; Jesus Favela; Carmen García-Peña
We describe the use of a technique aimed at allowing users to realistically experience novel mobile applications in a naturalistic environment as part of the formative evaluation of the system. This technique, which we have named naturalistic enactment, is appropriate for application scenarios that are not easily replicated in a laboratory setting and might cause risks to users or other stakeholders if used at a prototyping stage, as is often the case in the healthcare domain. We present and discuss the use of this technique after the recent introduction of a mobile application aimed at assisting geriatric nurses in attending emergency calls. The technique provided nurses with the experience of operating the mobile application in a naturalistic environment (i.e., with high ecological validity), thus allowing them to identify adoption issues that they would have missed otherwise.
People-centric mobile sensing with a pragmatic twist: from behavioral data points to active user involvement BIBAFull-Text 381-384
  Jan Blom; Daniel Gatica-Perez; Niko Kiukkonen
Mobile phones have recently been used to collect large-scale continuous data about human behavior. This people-centric sensing paradigm is useful not only from a scientific point of view: contextual user data has pragmatic value, too. Individuals whose data is collected in such long-term people-centric sensing projects can be engaged in user-centric design activities aiming to generate data-driven services that benefit the end user. This paper demonstrates the value of such a user-centric approach. In a two-stage approach, we analyse mobile phone data to extract mobile phone usage categories. We then go on to interview the participants concerning their perceptions toward context-aware services. The two stages, combined as we present here, offer clear value in terms of providing complementary insights, both to researchers and users, about the feasibility of and the expectations about personalized mobile services.
Experience characters: a design tool for communicating mobile phone experiences to designers BIBAFull-Text 385-394
  Marianna Obrist; Elke Beck; Daniela Wurhofer; Manfred Tscheligi
Different methods, techniques and tools exist for supporting design activities in a user-centered design process. Due to the increasing relevance of experience-centered design, the need to advance existing design methods and tools becomes evident. Within this paper we present "experience characters" as a design tool for communicating a richer understanding of mobile phone experiences to designers. We conducted a qualitative text analysis study of written experience reports from an online forum in the tradition of grounded theory. Major experience types were grouped into categories and further transformed into five fictive characters that are easy to remember and likely to invoke empathy in design. Thus, designers' communication about and understanding of mobile phone experiences are supported. We describe in detail the steps taken for analysing experience reports and developing the experience characters, including relevant properties for informing and supporting an experience-centered design process.
Ubiquitous sketching for social media BIBAFull-Text 395-404
  Lisa G. Cowan; Nadir Weibel; Laura R. Pina; James D. Hollan; William G. Griswold
Digital social media have transformed how we communicate and manage our relationships. Despite its portability, sketching as a social medium has been largely left behind. Given sketching's unique affordances for visual communication this absence is a real loss. Sketches convey visuo-spatial ideas directly, require minimal detail to render concepts, and show the peculiarities of handwriting. Sketching holds the promise to enrich how we communicate, and its ubiquity is critical for sharing information at opportune moments. We present the results of an exploratory field study of ubiquitous sketching for social media, documenting users' experiences with UbiSketch. This system integrates digital pens, paper, and mobile phones to support the transmission of paper sketches to online services. We learned that UbiSketch enabled participants to leverage sketching's unique affordances, that ubiquitous sketching creates a synergy with the practice of posting context-dependent information, and that it broadens and deepens social interaction.
Sketching in software and hardware: Bluetooth as a design material BIBAFull-Text 405-414
  Petra Sundström; Alex S. Taylor; Kenton O'Hara
In any design process, a medium's properties need to be considered. This is generally well established, yet within interactive systems design the properties of a technological medium are still often glossed over. That is, technologies are often black-boxed without much thought given to how their distinctive material properties open up the design space. In this paper, we experiment with a technology to see what might be gained from intentionally and systematically investigating its properties. Specifically, we look at Bluetooth as a design material and examine how its properties, seen from that perspective, can be used to shape design thinking. Using four example cases or "sketches", we show that Bluetooth's properties, often seen as constraints, can provide useful building blocks for designing interactive systems.

Video interaction

Visual reporting in time-critical work: exploring video use in emergency response BIBAFull-Text 415-424
  Fredrik Bergstrand; Jonas Landgren
This paper reports on an explorative project aimed at studying the use of live video technology in emergency response work. The initial stage of the project aimed at equipping an emergency response organization with live video capabilities. The study covered the steps of design, development and deployment of an application for live video broadcasting. Over a 10-month period, professional responders have used the application in over 200 incidents. The study shows how short video sequences are produced as an embedded activity in order to capture small fragments of work rather than creating a complete coverage of an incident. Further, this study also shows how broadcast video is incorporated into the work at the command center as visual reports, which open up collective negotiation of the broader meaning of a situation.
Live-streaming mobile video: production as civic engagement BIBAFull-Text 425-434
  Audubon Dougherty
Live-streaming mobile video is an emerging medium. Few have measured how this new form of production is contributing to civic engagement or broadening the public sphere by circulating visual footage of community interest. Here I explore the overall trends in production of live-streaming mobile video from producers around the world, and focus more narrowly on the motivations and practices surrounding the production of civic content. Informing my study are mobile videos on Qik.com, a popular web service for live-streaming mobile video. I offer a quantitative content analysis of 1,000 videos, summarizing general trends in content production as well as analyzing the motivations behind the production of civic content, based on qualitative interviews with frequent producers. Results indicate that production is higher among those who self-identify as activists, journalists, community leaders or educators, suggesting this new medium can be best appropriated by those who are already civically engaged.
Crowdsourced news reporting: supporting news content creation with mobile phones BIBAFull-Text 435-444
  Heli Väätäjä; Teija Vainio; Esa Sirkkunen; Kari Salo
As news organizations move towards systematically using the power of crowds in news reporting, mobile phones are potential mobile tools for reader reporters. We conducted two user studies to support the development of future mobile crowdsourcing processes and mobile tools for news reporting. In a quasi-experiment on a future mobile crowdsourcing process with location-based assignments, SMS messages were experienced as an easy and handy means for news assignments. A customized mobile client prototype was preferred for the submission of multimedia content (photo and video), since submission was experienced as simple and reliable, especially for videos. Based on our findings and earlier research, we discuss implications for the development of mobile crowdsourcing processes with mobile news reporting assignments.

Work and security

Smart phone use by non-mobile business users BIBAFull-Text 445-454
  Patti Bao; Jeffrey Pierce; Stephen Whittaker; Shumin Zhai
The rapid increase in smart phone capabilities has introduced new opportunities for mobile information access and computing. However, smart phone use may still be constrained by both device affordances and work environments. To understand how current business users employ smart phones and to identify opportunities for improving business smart phone use, we conducted two studies of actual and perceived performance of standard work tasks. Our studies involved 243 smart phone users from a large corporation. We intentionally chose users who primarily work with desktops and laptops, as these "non-mobile" users represent the largest population of business users. Our results go beyond the general intuition that smart phones are better for consuming than producing information: we provide concrete measurements that show how fast reading is on phones and how much slower and more effortful text entry is on phones than on computers. We also demonstrate that security mechanisms are a significant barrier to wider business smart phone use. We offer design suggestions to overcome these barriers.
Beyond 'yesterday's tomorrow': towards the design of awareness technologies for the contemporary worker BIBAFull-Text 455-464
  Jason Wiese; Jacob T. Biehl; Thea Turner; William van Melle; Andreas Girgensohn
Modern office work practices increasingly breach traditional boundaries of time and place, increasing the breakdowns workers encounter when coordinating interactions with colleagues. We conducted interviews with 12 workers and identified key problems introduced by these practices. To address these problems we developed myUnity, a fully functional platform enabling rich workplace awareness and coordination. myUnity is one of the first integrated platforms to span mobile and desktop environments, both in terms of access and sensing. It uses multiple sources to report user location, availability, tasks, and communication channels. A pilot field study of myUnity demonstrated the significant value of pervasive access to workplace awareness and communication facilities, as well as positive behavioral change in day-to-day communication practices for most users. We present resulting insights about the utility of awareness technology in flexible work environments.
On the need for different security methods on mobile phones BIBAFull-Text 465-473
  Noam Ben-Asher; Niklas Kirschnick; Hanul Sieger; Joachim Meyer; Asaf Ben-Oved; Sebastian Möller
Mobile phones are rapidly becoming small-size general purpose computers, so-called smartphones. However, applications and data stored on mobile phones are less protected from unauthorized access than on most desktop and mobile computers. This paper presents a survey on users' security needs, awareness and concerns in the context of mobile phones. It also evaluates acceptance and perceived protection of existing and novel authentication methods. The responses from 465 participants reveal that users are interested in increased security and data protection. The current protection by PIN (Personal Identification Number) is perceived as neither adequate nor convenient in all cases. The sensitivity of data stored on the devices varies depending on the data type and the context of use, pointing to the need for another level of protection. Based on these findings, a two-level security model for mobile phones is proposed. The model provides differential data and service protection by utilizing existing capabilities of a mobile phone for authenticating users.

Interacting off-screen, on-site and remote

MyState: sharing social and contextual information through touch interactions with tagged objects BIBAFull-Text 475-484
  Robert Hardy; Enrico Rukzio; Paul Holleis; Matthias Wagner
Sharing social and contextual information via services like Facebook, Twitter or Foursquare has become extremely popular in recent years. This paper introduces the novel MyState concept in which users can augment any kind of object with Near Field Communication (NFC) tags, can write any social or contextual information on those tags using their mobile phones, and can publish this information on a social networking site just by touching such a tag with their phone. The distinct features of MyState are A) the possibility of augmenting any personal or public object with any contextual or social information, B) the possibility for anybody to touch those tags in order to post the related information to a social networking site, C) the speed and convenience of publishing information with a simple touch, as users don't have to look at the mobile phone screen, interact with mobile phone menus, or write any text when touching an already deployed tag. The paper reports on two field studies which provide insights on where the participants placed the tags, how they used MyState and what type of information was shared. Here we observed that users typically shared identity, location, activity and time, but also feelings, social meanings and experiences. Furthermore, we identified several distinct social usage patterns such as synchronizing activities, expressing moods, games and tracking shared items.
Characterizing user performance with assisted direct off-screen pointing BIBAFull-Text 485-494
  Barrett Ens; David Ahlström; Andy Cockburn; Pourang Irani
The limited viewport size of mobile devices requires that users continuously acquire information that lies beyond the edge of the screen. Recent hardware solutions are capable of continually tracking a user's finger around the device. This has created new opportunities for interactive solutions, such as direct off-screen pointing: the ability to directly point at objects that are outside the viewport. We empirically characterize user performance with direct off-screen pointing when assisted by target cues. We predict time and accuracy outcomes for direct off-screen pointing with existing and derived models. We validate the models with good results (R² ≥ 0.9) and reveal that direct off-screen pointing takes up to four times longer than pointing at visible targets, depending on the desired accuracy tradeoff. Pointing accuracy degrades logarithmically with target distance. We discuss design implications in the context of several real-world applications.
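The abstract above reports that pointing time degrades logarithmically with target distance, in the spirit of Fitts' law. As a rough illustrative sketch only (the function, model form MT = a + b·log2(D/W + 1), and data below are assumptions, not taken from the paper), such a model can be fitted by ordinary least squares and checked against the reported R² threshold:

```python
import math

def fit_pointing_model(distances, widths, times):
    """Fit MT = a + b * log2(D/W + 1) by ordinary least squares.
    Returns (a, b, r_squared)."""
    ids = [math.log2(d / w + 1) for d, w in zip(distances, widths)]
    n = len(ids)
    mean_x = sum(ids) / n
    mean_y = sum(times) / n
    sxx = sum((x - mean_x) ** 2 for x in ids)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(ids, times))
    b = sxy / sxx                     # slope: seconds per bit of difficulty
    a = mean_y - b * mean_x           # intercept: non-movement overhead
    ss_res = sum((y - (a + b * x)) ** 2 for x, y in zip(ids, times))
    ss_tot = sum((y - mean_y) ** 2 for y in times)
    return a, b, 1 - ss_res / ss_tot
```

A fit with R² ≥ 0.9, as in the paper, indicates that the logarithmic model explains most of the variance in the measured movement times.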
Proximal and distal selection of widgets: designing distributed UI for mobile interaction with large display BIBAFull-Text 495-498
  Umar Rashid; Jarmo Kauko; Jonna Häkkilä; Aaron Quigley
A smartphone with a touchscreen and short-range networking makes an efficient remote control for a large display. In this paper, we report the results of a case study examining user performance with Proximal Selection (PS) and Distal Selection (DS) of remote-control widgets. DS uses a mobile pointer to zoom into a region of interest and select widgets on the large display. PS involves pointing at the large display to transfer a zoomed-in view of the pointed region onto the mobile touchscreen and making selections there. The experimental results indicate that PS outperforms DS in terms of speed and user satisfaction with the physical effort involved, especially in complex tasks requiring multiple widget selections. DS was found to be favorable for simple tasks, as it has a lower error rate and does not require attention switches between the mobile and the large display.
"Can we work this out?": an evaluation of remote collaborative interaction in a mobile shared environment BIBAFull-Text 499-502
  Dari Trendafilov; Yolanda Vazquez-Alvarez; Saija Lemmelä; Roderick Murray-Smith
We describe a novel dynamic method for collaborative virtual environments designed for mobile devices and evaluated in a mobile context. Participants interacted in pairs, remotely and through touch, while walking under three different feedback conditions: 1) visual, 2) audio-tactile, and 3) spatial audio-tactile. Results showed that the visual baseline system provided higher shared awareness, greater efficiency and a strong learning effect. However, although very challenging, the eyes-free systems still offered the ability to build joint awareness in remote collaborative environments, particularly the spatial audio one. These results help us better understand the potential of different feedback mechanisms in the design of future mobile collaborative environments.
The phone rings but the user doesn't answer: unavailability in mobile communication BIBAFull-Text 503-512
  Antti Salovaara; Antti Lindqvist; Tero Hasu; Jonna Häkkilä
We know that phone calls and mobile text messages are not always promptly answered and responded to, yet we know little about the reasons for unavailability, its effects on a user's image, the ways in which users explain the reasons for it, and actions when users cannot reach someone. Usage logs (2,983 phone use events), Web-based diaries, and interviews (N = 20) were used to investigate occasional unavailability in a mobile communication context. We identified four categories of unavailability and found that 31.1% of the phone calls consisted of unsuccessful communication attempts and reciprocal calls back from people who were unavailable earlier. Interestingly, while participants paid attention to the need to give reasons for unavailability, they did not require the explanations to be truthful. These findings have implications for design of systems that better support the needs to manage and explain unavailability and manage pending communication requests.
Exploring serendipitous social networks: sharing immediate situations among unacquainted individuals BIBAFull-Text 513-516
  Hyukjae Jang; Sungwon Peter Choe; Junehwa Song
In this article, we present a serendipitous social network that makes use of individuals' contextual information to connect people in shared immediate situations. We build a prototype microblogging service that uses a serendipitous social network as a substrate. We performed an initial user study and found that users engaged in shared immediate situations are likely to exchange posts relevant to their situation that provide useful information or reflect emotional understanding.

Industrial case studies

User experience in speech recognition of navigation devices: an assessment BIBAFull-Text 517-520
  Areti Goulati; Dalila Szostak
The complex acoustic environment of car interiors lowers the performance of speech recognition of navigation systems. Interaction designers are challenged with the difficult task of creating clever ways to recover from errors. How do these affect the overall user experience?
   To answer the question, a benchmark study was conducted on 3 commercially available navigation devices. The result is a set of recommendations for speech interface design of navigation devices.
Mobile application for utility domains BIBAFull-Text 521-524
  Jacqueline Tappan; Mary L. Cummings; Christine Mikkelsen; Ken Driediger
This research, a collaboration between MIT and ABB/Ventyx, is focused on the development of a mobile interface for field workers in power repair settings and field service delivery. A Human Systems Engineering (HSE) approach of Plan, Analyze, and Design was utilized to develop the interface, which included a Hybrid Cognitive Task Analysis (hCTA) that identified requirements for the envisioned interface. This paper overviews the results of the HSE process and presents a preliminary design for the mobile interface that emerged during initial display prototyping.
Feedback on the definition and design of innovative mobile services BIBAFull-Text 525-528
  Lou Schwartz; Alain Vagner; Sylvain Kubicki; Thomas Altenburger
How should users' needs for a new prototype be defined? That is the question addressed in this paper through the design of a prospective mobile prototype to support architects during construction site visits. We propose involving business experts and using an agile-UX method to support our approach.
Using physical objects to enable enriched video communication BIBAFull-Text 529-532
  Marcus Nyberg; Cristian Norlin; Peter Gomez
This paper describes an exploratory concept for how video communication can address the potential collaboration opportunities (and challenges) that arise in an emerging networked society in which the "material" used in collaboration is no longer restricted to simple presentations, but can include services, Internet-enabled objects, and many other types of systems and features. The concept illustrates how tangible objects can be utilized as props for interaction and collaboration, and as access points to services, functionality and information. The findings from a qualitative user study suggest that this contributes to a form of collaboration in which technology is less visible and the actual meeting between humans becomes more significant. The user study also showed the importance of security and trust for such a system to work.

Posters

Oh music, where art thou? BIBAFull-Text 533-538
  Matthijs Zwinderman; Tanya Zavialova; Daniel Tetteroo; Paul Lehouck
We propose a novel concept for navigation for bicyclists based on 3D-audio. The concept has been implemented in a prototype and evaluated in a small user test. The results indicate a hopeful future for audio-based navigation for bicyclists.
Markerless visual fingertip detection for natural mobile device interaction BIBAFull-Text 539-544
  Matthias Baldauf; Sebastian Zambanini; Peter Fröhlich; Peter Reichl
The vision-based detection of hand gestures is one technological enabler for Natural User Interfaces which try to provide a natural and intuitive interaction with computers. In particular, mobile devices might benefit from such a less device-centric but more natural input possibility. In this paper, we introduce our ongoing work on the visual markerless detection of fingertips on mobile devices. Further, we shed light on the potential of mobile hand gesture detection and present several promising use cases and respective demo applications based on the presented engine.
TouchOver map: audio-tactile exploration of interactive maps BIBAFull-Text 545-550
  Benjamin Poppinga; Charlotte Magnusson; Martin Pielot; Kirsten Rassmus-Gröhn
This article reports on a preliminary study, which investigates if vibration and speech feedback can be used in order to make a digital map on a touch screen device more accessible. We test if vibration feedback combined with speech, triggered as the finger moves over relevant map objects, works to make sense of the map content. The study results indicate that it is indeed possible to get a basic overview of the map layout even if a person does not have access to the visual presentation. In the conclusions the interaction problems are identified and suggestions for future improvements are given.
Legible thumbnail: summarizing on-line handwritten documents based on emphasized expressions BIBAFull-Text 551-556
  Hiroki Asai; Takanori Ueda; Hayato Yamana
In recent years, digital notebooks have been replacing traditional paper-based notebooks with the development of handwriting input devices. Currently, we can access digital notebooks on various devices, including mobile devices. On such mobile devices, however, the limited screen size makes it difficult to grasp the summary of handwritten documents without using a zoom feature. In this paper, we therefore propose the "Legible Thumbnail", which helps us understand the summary without zooming. Our method detects important words based on emphasis, such as underlining, and outputs the emphasized words to the thumbnail. Experiments show our thumbnail reduces search time by 21%.
A mobile guide for serendipitous exploration of cities BIBAFull-Text 557-562
  Eva Hornecker; Stuart Swindells; Mark Dunlop
In this paper we describe the design concept, prototype development, and initial findings for a mobile guide supporting serendipitous exploration of a city. The system will allow a tourist to freely explore a new city, while providing peace of mind that they will not accidentally walk past attractions they wish to see. We have developed a proximity model and vibration patterns for alerts, and devised ways of mitigating the tradeoff between battery use and location accuracy.
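A proximity model with distance-banded vibration alerts, as described above, can be sketched as follows. This is a minimal illustration under assumed thresholds; the actual proximity model, band distances and vibration patterns of the paper's prototype are not specified here:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def alert_level(distance_m, near=50.0, mid=150.0):
    """Map distance to an attraction onto a vibration intensity:
    2 = strong (very close), 1 = gentle (approaching), 0 = none.
    The 50 m / 150 m band thresholds are hypothetical."""
    if distance_m <= near:
        return 2
    if distance_m <= mid:
        return 1
    return 0
```

A real implementation would also lower the GPS polling rate when no attraction is within the outer band, which is one way to trade location accuracy against battery use.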
Walky for embodied microblogging: sharing mundane activities through augmented everyday objects BIBAFull-Text 563-568
  Elena Nazzi; Tomas Sokoler
In this paper we present our ongoing exploration of a theoretical concept: Embodied Microblogging (EM). Looking for a more situated way to communicate mundane activities in local communities, EM informs the design of digital technology that helps senior citizens make their everyday activities noticeable and create more openings for social interaction in their local communities. We use Walky, a design sketch based on walking and walking objects, to exemplify the design space emerging from EM. By investigating EM and putting a concrete design example on display, we contribute to interaction design research on social well-being in aging in place, suggesting EM as an informing theoretical concept for designing digital technology for social interaction.
Pathlight: supporting navigation of small groups in the museum context BIBAFull-Text 569-574
  Alan J. Wecker; Joel Lanir; Tsvi Kuflik; Oliviero Stock
In this paper we describe the in-progress work of the Pathlight navigation system for groups and individuals. Pathlight provides indoor navigation support in the museum using a handheld projector. We describe some of the advantages the system provides, look at some background, briefly describe some system features, and posit some open questions for further investigation.
3D interaction using mobile device on 3D environments with large screen BIBAFull-Text 575-580
  Dongwoo Lee; Jae-In Hwang; Gerard J. Kim; Sang Chul Ahn
The need for 3D environments is growing. 3D devices, such as 3D televisions, are already being sold to the general public, and techniques for 3D manipulation are becoming correspondingly important. Many existing approaches to 3D manipulation on large screens, such as 6DOF tracking devices, are expensive and require a specially prepared environment. In this paper, we introduce an easier and cheaper method for 3D manipulation that uses a mobile device instead of expensive trackers. We expect this method to be familiar to everybody and easy to use.
Release your app on Sunday eve: finding the best time to deploy apps BIBAFull-Text 581-586
  Niels Henze; Susanne Boll
Mobile application stores such as Apple's App Store and Google's Android Market enable researchers to use app stores as a platform for user studies. Because successful studies can require a large number of users, researchers might need to attract a large audience. The right timing when releasing or updating apps can considerably increase the number of installations. Using a game published in the Android Market, we analyze when people install games. Furthermore, we determine when developers deploy games in the Android Market. We combine data from 157,438 installations of the game with observations of 24,647 published apps. Our results suggest that the best time to deploy a game in the Android Market is on Sunday evening GMT.
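The kind of analysis described above amounts to bucketing install timestamps by weekday and hour. A minimal sketch, assuming install events are available as datetime objects in GMT (the function name and data shape are illustrative, not from the paper):

```python
from collections import Counter
from datetime import datetime

def busiest_slots(install_times, top=3):
    """Count installs per (weekday, hour) slot and return the `top`
    most frequent slots, e.g. [('Sun', 20), ...]."""
    days = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun']
    slots = Counter((days[t.weekday()], t.hour) for t in install_times)
    return [slot for slot, _ in slots.most_common(top)]
```

Releasing just before the busiest installation slots maximizes the chance that an app is still near the top of the "recently added" listings when most users browse the store.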
Bottom billion architecture: a generic software architecture for ICTD use case scenarios BIBAFull-Text 587-592
  Joerg Doerflinger; Tom Gross
The ICT for Development (ICTD) research field still lacks a generic architecture approach unifying software development tasks in ICTD research. With a common approach, technical ICTD research could evolve from single-point solutions narrowed to one specific use case or technology towards a shared approach pursuing the common goal of providing ICT access in ICTD use case scenarios. In this paper we present the replication of the Bottom Billion Architecture (BBA) in a second use case scenario. The BBA was developed and evaluated in a procurement use case in South Africa. The replication takes place in a cashew supply chain in Ghana, whose current inefficient paper-based organization hampers collaborative business with the established economy. The BBA prototype has been deployed for a five-month pilot phase with about 400 participating cashew farmers. With this successful replication of the same architecture in two different use case scenarios, we are now able to evaluate its capability to serve as a generic architecture for various technical ICTD use case scenarios.
Mobile technology keeping people with dementia independent and socially active BIBAFull-Text 593-598
  Stefan Göllner; Jörn Hurtienne; Ulrike Gollner; Anja B. Naumann
A main issue for people with dementia, and for older people in general, is the feeling of being dependent on others while being lonely at the same time. Based on a preventive approach to supporting these people in independent living and social interaction, two design concepts for mobile technology were developed and are introduced in this paper: the Shared diary helps with carrying out everyday tasks by sharing daily task lists, and Meet me is a collaborative tool for managing meetings.
Restyling website design via touch-based interactions BIBAFull-Text 599-604
  Luis A. Leiva
This paper introduces a work-in-progress technique for dynamically updating the presentational attributes of UI elements. Starting from an existing web layout, the webmaster specifies which elements are candidates for modification. Touch-based events are then used as implicit inputs to an adaptive engine that automatically modifies, rearranges, and restyles the interacted items according to browsing usage. In this way, the UI is capable of (incrementally) adapting itself to the abilities of individual users at run-time.
The mighty un-touchables: creating playful engagement on media façades BIBAFull-Text 605-610
  Matthias Böhmer; Sven Gehring; Markus Löchtefeld; Morin Ostkamp; Gernot Bauer
In this paper we investigate interaction with a media façade that is out of reach for touch-based interaction. We describe four different applications that use mobile devices to enable passers-by to interact with the façade. Each application was designed under constraints imposed by the formal regulations of an editorial board (e.g. to prevent traffic distraction), with the aim of catching the attention of passers-by and keeping users engaged. Besides describing the design and implementation of the different applications, we report on initial user feedback from a first preliminary user test that informs further development and design.
Context tags: exploiting user-given contextual cues for disambiguation BIBAFull-Text 611-616
  Matthias Böhmer; Gernot Bauer; Antonio Krüger
Most context-aware systems rely on physical sensors. To some extent, these systems are able to reason about a user's situation by means of the measured data. However, their overall uncertainty in modeling human behavior leads to ambiguity. A language for mediating context information between the user and the system is required to enable the user to adjust the machine's interpretation of his or her context. We describe how keywords that users themselves attribute to activities can act as such a mediator. We present results of a study that investigates the nature of these context attributes. The results demonstrate that different users use similar keywords to describe similar situations and different keywords to describe different situations. Therefore, algorithms developed to evaluate the semantic relatedness of tags and resources within folksonomies can be applied to extract knowledge about users' contexts from the keywords they have assigned. We discuss a prototype that recommends mobile services by enhancing its model with the described keywords.
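One standard folksonomy relatedness measure of the kind mentioned above is cosine similarity over tag co-occurrence. The sketch below is an assumption-laden illustration, not the paper's algorithm: each tag is represented by the contexts (e.g. logged situations) it was assigned to, and two tags are related if those context profiles overlap:

```python
import math
from collections import Counter

def tag_similarity(assignments, tag_a, tag_b):
    """Cosine similarity between two tags, where `assignments` maps
    tag -> list of context ids the tag was assigned to.
    Returns a value in [0, 1]; 0 if either tag is unseen."""
    va = Counter(assignments.get(tag_a, []))
    vb = Counter(assignments.get(tag_b, []))
    dot = sum(va[c] * vb[c] for c in va)
    na = math.sqrt(sum(v * v for v in va.values()))
    nb = math.sqrt(sum(v * v for v in vb.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Tags assigned to the same kinds of situations score close to 1, which is what lets user-given keywords disambiguate otherwise similar sensor readings.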
Much: presenting Roman church music in hand-held, locative hyper-audio BIBAFull-Text 617-622
  Anders Fagerjord
A location-aware iPhone application presenting music in Roman churches was designed using paper prototyping. To enable navigation in a busy city, a sequence of three basic kinds of screens was devised, corresponding to the city, the neighborhood, and the actual location. We found that sound works better than images, but we are still searching for a good solution for hyperlinks.
Moody mobile TV: adding emotion to personalized playlists BIBAFull-Text 623-628
  Arne Berger; Robert Knauf; Maximilian Eibl; Aaron Marcus
We present an interface for filtering large video repositories to generate personalized playlists via navigation and selection of moods and emotions on a mobile device.
Interactivity to enhance perception: does increased interactivity in mobile visual presentation tools facilitate more accurate rating of textile properties? BIBAFull-Text 629-634
  Pawel M. Orzechowski; Douglas Atkinson; Stefano Padilla; Thomas S. Methven; Sharon Baurley; Mike Chantler
As part of the EPSRC-funded 'Digital Sensoria' project, a set of digital tools was utilised to better demonstrate the tactile qualities of textiles via the internet. Shoogleit [8], an online utility for the creation of interactive video, was one such tool. A Shoogle player for iOS mobile devices was then created (Orzechowski) using an iterative process, during which experiments were carried out to determine whether the added interactivity afforded by Shoogleit could more accurately convey textile qualities and thus aid the creation of a next-generation mobile browser for textiles.
Prototyping input controller for touch-less interaction with ubiquitous environments BIBAFull-Text 635-640
  Kamer Ali Yuksel; Serdar Hasan Adali
In ubiquitous computing environments, information processing is integrated into everyday objects that are ideally small, inexpensive, wirelessly networked devices. Contemporary human-computer interaction models are not adequate for controlling miniaturized devices distributed throughout everyday life and activities. This post-desktop model requires natural gesture-based interaction with distributed devices in an egocentric manner, as opposed to the current device-centric interaction. In this work, we have utilized a recently proposed touch-less, gesture-based interaction method based on magnetic fields to provide a hardware basis for a wearable input controller. Furthermore, we discuss how the proposed device can allow natural interaction with other devices within a ubiquitous computing environment, such as a personal area network.
A snapshot diary to support conversational storytelling for persons with aphasia BIBAFull-Text 641-646
  Maarten Woudstra; Abdullah Al Mahmud; Jean-Bernard Martens
In this paper we present the design of an application that supports conversational storytelling for people with aphasia. Camelendar is a touch screen application that supports people with aphasia in expressive storytelling. Camelendar utilizes photos to support people with aphasia talking about their daily activities; participants of the conversation (including the aphasic person) add comments and detailed information to the photos, enriching the story each time it is told. By arranging photos visually on a calendar people with aphasia can navigate to photos of events they want to talk about. The photos of the events transform into stories by adding tags and comments directly on the photos.
M-Urgency: a next generation, context-aware public safety application. BIBAFull-Text 647-652
  Shivsubramani Krishnamoorthy; Ashok Agrawala
We present a public safety system that (1) redefines how emergency calls are presently made and (2) has been designed to be fully context-aware. First, we describe Rover 2.0, a context-aware framework that caters to the development of mobile applications. The framework uses a paradigm for handling context information in which user-specific context is combined with the general context of the system and the environment to provide relevant functions and support to users through applications. More importantly, we present M-Urgency, a public safety application developed on the Rover 2.0 platform that redefines the way an emergency call is made to the PSAP (Public Safety Answering Point). M-Urgency enables mobile users to stream live video from their devices to the local PSAP along with the audio stream, real-time location information and any relevant personal information about the caller. We are in the process of deploying the M-Urgency system at the University of Maryland Police Department for a pilot.
Open-source platform: exploring the opportunities for offline mobile learning BIBAFull-Text 653-658
  Sujan Shrestha; John Moore; José Abdelnour Nocera
The mobile technology field is rapidly expanding, and the focus on how it can be incorporated to support learning is also growing. However, the barriers to including information and communication technologies in the public schools of Nepal are still significant, and widespread access to digital content remains a key obstacle. Nepal has a poor communication infrastructure and, where available, telecommunication and electricity are poorly maintained or too costly to use. The aim of this exploratory research study is to highlight how an offline mobile learning solution may address some of the technical challenges in meeting one of the most urgent current requirements: providing access to digital content. It investigates the deployment of previously unexplored low-spec, sub-US$100 open-source mobile devices to facilitate English language learning and address the knowledge requirements of teachers in government-funded public schools of Nepal.

Demo

Feeling the next track: designing mobile music player previews BIBAFull-Text 659-662
  David Beattie; Lynne Baillie; Lee Morton
We present a novel method of interaction for users to preview an audio track haptically. The preview enables users to "feel" the track they want to select, thus saving them from having to look at the screen or listen to the track before actually playing it. Our results show that users enjoyed the combination of audio and haptic feedback and that users would very much like to see this type of sensory collaboration being incorporated into their own mobile device.
Utilizing sensor fusion in markerless mobile augmented reality BIBAFull-Text 663-666
  Klen Copic Pucihar; Paul Coulton; Daniel Hutchinson
One of the key challenges of markerless Augmented Reality (AR) systems, where no a priori information about the environment is available, is map and scale initialization. In such systems, the scale is unknown, as it is impossible to determine scale from a sequence of images alone. Establishing scale is vital for ensuring that augmented objects are contextually sensitive to the environment they are projected upon. In this paper we demonstrate a sensor and vision fusion approach for robust and user-friendly initialization of map and scale. The map is initialized using the inbuilt accelerometers, whilst the scale is initialized via the camera's auto-focusing capability. The latter is possible by applying the Depth From Focus (DFF) method, which was until now limited to high-precision camera systems. The demonstrator illustrates the benefits of such a system running on a commercially available mobile phone, the Nokia N900.
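The core of such a Depth From Focus scale initialization can be stated very compactly: the autofocus yields a metric depth for the focused scene point, and dividing it by the same point's depth in the vision-built map (which is in arbitrary units) gives the factor converting map units to meters. The sketch below is a simplified illustration of that idea, not the paper's implementation:

```python
def metric_scale(focus_depth_m, map_depth_units):
    """Depth-From-Focus scale initialization (sketch): ratio of the
    metric depth reported by autofocus to the depth of the same point
    in the arbitrary-unit vision map."""
    if map_depth_units <= 0:
        raise ValueError("map depth must be positive")
    return focus_depth_m / map_depth_units

def to_meters(map_length, scale):
    """Convert any length measured in map units to meters."""
    return map_length * scale
```

Once this factor is known, every augmented object can be placed with physically plausible size, which is what makes the augmentation contextually sensitive to the environment.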
Decreasing media breaks through content sharing in wireless networks with mobile devices BIBAFull-Text 667-670
  Anton Fedosov; Jeffrey Blattman; Jorgen Birkler
This paper presents a unique media content sharing ecosystem based on seamlessly established communication sessions between a device such as a cellular phone and a remote web-enabled device with a larger display.
Like the real world: online aesthetics and habits transferred to the physical space BIBAFull-Text 671-674
  Irene Posch; Jona Hoier
We introduce e-accessories using established web aesthetics to transfer online interactions into the physical world and vice versa. Drawing on visual impressions and interaction metaphors of web 2.0 applications we propose two scenarios to include relevant functions into the physical space and allow real world actions to directly interact with the personal online presence.
   We describe our design approach for "LikeButton" and "LinkPin", reflecting on relevant technological, aesthetical and interaction related issues and implications.
Free All Monsters!: a context-aware location based game BIBAFull-Text 675-678
  Kate Lund; Paul Coulton; Andrew Wilson
Free All Monsters! is a novel location-based mobile game which incorporates user-generated content in an attempt to broaden its appeal by encouraging creativity. An online portal allows participants to create content which is then used to populate the game. The game was recently launched on the iPhone App Store and is designed to be a family-oriented activity.
Multimodal multi-device program guide for smart conferences BIBAFull-Text 679-682
  Markku Turunen; Juho Hella; Toni Miettinen; Pellervo Valkama; Jaakko Hakulinen; Roope Raisamo
We demonstrate a multimodal, multi-user, and multi-device conference program guide for conference participants. Its functionality includes access to the conference program with additional multimedia content, voting, feedback, and communication with the other participants. People can interact with the system in a multimodal way using spoken language, gestures and haptic feedback with mobile phones and shared public displays.
Blicko social music places BIBAFull-Text 683-687
  Jesper Ahlberg; Andreas Andrén; Theodor Zettersten
Blicko is a social music service that lets listeners control playback together. The service allows users to cooperatively build up a playlist of music and vote on the order of the tracks being played. It can be thought of as a modern jukebox with real-time collaborative features. The purpose is to enable attendees of venues, public places (e.g. cafes, bars, restaurants, conferences) or parties to see and influence the music being played. The interaction is available through an intuitive web interface that the user can access through any type of device with a web browser (e.g. smartphones, tablets and laptops).
Real-time object recognition using mobile devices BIBAFull-Text 687-690
  Ana Lameira; Rui Jesus; Nuno Correia
This paper proposes an application for real-time object recognition using mobile devices that runs locally, without the need to communicate with a server. Both the object detection and identification algorithms are performed on the mobile device. The paper presents results that show the effectiveness of the proposal.

Design competition

Mobile thumb interaction and speech BIBAFull-Text 691-694
  Mikhail Blinov; Matthieu Deru; Daniel Sonntag
The design of future spoken and multimodal interfaces should not only compare voice with other modalities of interaction. Instead, it should take screen-based smartphones as a basis and add new, gesture-based anthropocentric interaction forms to it. A potential speech-based interaction can then be smoothly integrated. We focus on the thumb's role while carrying a handheld device in the left or right hand.
First contact: encouraging use of emergency contact details on mobile phones BIBAFull-Text 695-697
  John Greaney
Emergency services often wish to retrieve personal information about an unconscious person. There is currently no internationally agreed way to input and identify a next-of-kin (or other emergency contact) in a mobile phone's directory. The paper makes a contribution through identifying some of the barriers to this simple behaviour (why do most people not have this information stored on their mobile?), as well as suggesting how this behaviour could become more prevalent. In particular, the practice of prefixing the designated contact with 01 is advocated ('First Contact'). As such, this represents a simple solution that would have enormous value to a person in the event it was needed. Although thankfully rare, such communication could clearly be called "essential".
MAWL: mobile assisted word-learning BIBAFull-Text 699-701
  Pramod Verma
In this paper we describe Mobile Assisted Word-Learning (MAWL): An augmented reality based collaborative interface for learning new words using a smartphone.
Spaces Without Faces BIBAFull-Text 703-705
  Dan Kestranek; Russell J. Clark; Matt Sanders; Erika S. Poole; Philip Marquardt; Jackson Rabun; Joseph Rhodes
In this paper, we present an entry for the 2011 MobileHCI design competition. Our aim is to define the "essence" of mobile communication and connectivity, and illustrate it via a design example called Spaces Without Faces, an application to locate quiet study spaces on a university campus. We argue that the essence of mobile communication and connectivity, no matter the form factor of the phone, comes not from the ability to make text and voice calls, but rather to use the phone as a device for obtaining information about resources in nearby physical environments.
BrailleTouch: designing a mobile eyes-free soft keyboard BIBAFull-Text 707-709
  Mario Romero; Brian Frey; Caleb Southern; Gregory D. Abowd
Texting is the essence of mobile communication and connectivity, as evidenced by today's teenagers, tomorrow's workforce. Fifty-four percent of American teens contact each other daily by texting, as compared to face-to-face (33%) and talking on the phone (30%) according to the Pew Research Center's Internet & American Life Project, 2010. Arguably, today's technologies support mobile text input poorly, primarily due to the size constraints of mobile devices. This is the case for everyone, but it is particularly relevant to the visually impaired. According to the World Health Organization, 284 million people are visually impaired worldwide. In order to connect these users to the global mobile community, we need to design effective and efficient methods for eyes-free text input on mobile devices. Furthermore, everyone would benefit from effective mobile texting for safety and speed. This design brief presents BrailleTouch, our working prototype solution for eyes-free mobile text input.
Designing mobile phones using silent speech input and auditory feedback BIBAFull-Text 711-713
  Kamer Ali Yuksel; Sinan Buyukbas; Serdar Hasan Adali
In this work, we propose a novel design for a basic mobile phone, focused on the essence of mobile communication and connectivity, based on a silent speech interface and auditory feedback. This assistive interface retains the advantages of voice control systems while discarding their disadvantages, such as background noise, privacy concerns and limited social acceptance. The proposed device utilizes low-cost, commercially available hardware components. Thus, it would be affordable and accessible to the majority of users, including disabled, elderly and illiterate people.
Panels
We need to talk: rediscovering audio for universal access BIBAFull-Text 715-716
  Stephen Brewster; Matt Jones; Roderick Murray-Smith; A. A. Nanavati; N. Rajput; Albrecht Schmidt; M. Turunen
"In all the wonderful worlds that writing opens, the spoken word still resides and lives. Written texts all have to be related somehow, directly or indirectly, to the world of sound, the natural habitat of language, to yield their meanings."
   Only 22% of the human population accesses the Internet. The larger fraction of the world cannot read or write. Worldwide, 284 million people are visually impaired. And yet, there are 5.3 billion mobile subscribers, and their numbers are increasing.
   Much of the mobile work by HCI researchers explores a future world populated by high-end devices and relatively affluent users. This panel turns to consider the hundreds of millions of people for whom such sophistication will not be realised for many years to come. How should we design interfaces and services that are relevant and beneficial for them?
Time to revisit mobility in mobile HCI? BIBAFull-Text 717-719
  Alexandra Weilenmann; Oskar Juhlin
In this panel, we discuss the relevance of the concept of mobility in current mobile Human-Computer Interaction research. Is the term still useful for understanding and designing interaction with computers, or has the concept of mobility run dry and become void of meaning?
Workshops
Mobile family interaction: how to use mobile technology to bring trust, safety and wellbeing into families BIBAFull-Text 721-724
  Jofish Kaye; Matti Nelimarkka; Riitta Kauppinen; Sini Vartiainen; Pekka Isosomppi
Mobile devices have become an important part of adult life, where they are now used for media consumption and creation rather than for traditional telephony functions only. The same trend is visible among teens and children, both of whom are using mobile devices increasingly.
   Naturally, the extensive use of technology brings new threats, especially as mobile devices are mobile and cannot be tied to fixed locations that could be monitored. Nevertheless, the technology may also benefit communication and family life when used smartly. These positive interaction possibilities need to be highlighted, as they provide new space for innovation.
   Developing our understanding of this matter further requires a multidisciplinary approach. Hence we ask for contributions from several fields, such as ethnography, education and design studies. This way we can enhance dialogue and co-create new solutions together.
IWS: interacting with sound: a workshop exploring context-aware, local and social audio applications BIBAFull-Text 725-728
  Thomas Sandholm; April Mitchell; Alex Vorbau; Elsa Kosmack Vaara; Jonas Söderberg
In this workshop, we explore novel applications, services, tools, and systems that take advantage of the audio channel on mobile devices to feed users a flow of information. We call for innovative ideas that introduce ambient context-aware, location-aware, and/or social audio as a more effective means of communicating information and providing experiences to mobile users.
Workshop on mobile interaction in retail environments (MIRE) BIBAFull-Text 729-731
  Sven Gehring; Markus Löchtefeld; Carsten Magerkurth; Petteri Nurmi; Florian Michahelles
The workshop on mobile interaction in retail environments (MIRE) brings together researchers and practitioners from academia and industry to explore how mobile phones and mobile interaction can be embedded in retail environments to create new shopping experiences and mobile-enhanced services.
SiMPE: 6th Workshop on Speech in Mobile and Pervasive Environments BIBAFull-Text 733-735
  A. A. Nanavati; N. Rajput; A. I. Rudnicky; M. Turunen; A. I. Kun; T. Paek; I. Tashev
With the proliferation of pervasive devices and the increase in their processing capabilities, client-side speech processing has been emerging as a viable alternative. The SiMPE workshop series started in 2006 [5] with the goal of enabling speech processing on mobile and embedded devices to meet the challenges of pervasive environments (such as noise) and to leverage the context they offer (such as location). SiMPE 2010, the latest in the series, very successfully brought together researchers from the speech and HCI communities. We believe this is only the beginning.
   SiMPE 2011, the 6th in the series, will continue to explore issues, possibilities, and approaches for enabling speech processing as well as convenient and effective speech and multimodal user interfaces. Over the years, SiMPE has been evolving too, and since last year, one of our major goals has been to increase the participation of speech/multimodal HCI designers and to increase their interactions with speech processing experts.
   Multimodality received more attention in SiMPE 2008 than it had in previous years. In SiMPE 2007 [4], the focus was on developing regions. Given the importance of speech in developing regions, SiMPE 2008 had "SiMPE for developing regions" as a topic of interest. Speech user interaction in cars was a focus area in 2009 [2].
Mobile work efficiency: enhancing workflows with mobile devices BIBAFull-Text 737-740
  Alexander Meschtscherjakov; Christiane Moser; Manfred Tscheligi; Erika Reponen
This workshop is a forum for multi-disciplinary discussion on how mobile devices can increase perceived work efficiency (PWE), as well as how this subjective enhancement can be measured. It brings together practitioners and researchers from different domains interested in researching perceived workflow efficiency in the mobile context. The overall aim is to create a common base and to further extend the research agenda for work efficiency enhancement with the assistance of mobile devices, from both a scientific and an industrial perspective.
Exploring Design Methods for Mobile Learning BIBAFull-Text 741-743
  Chiara Rossitto; Daniel Spikol; Teresa Cerratto-Pargman; Leif M. Hokstad
This paper introduces the workshop "Exploring Design Methods for Mobile Learning" to be held at MobileHCI 2011, in Stockholm, Sweden.
Please enjoy!?: 2nd workshop on playful experiences in mobile HCI BIBAFull-Text 745-748
  Ylva Fernaeus; Jussi Holopainen; Tilde Bekker
This workshop, following one with the same name held at Mobile HCI 2010, aims at further exploring different approaches and challenges in studying playfulness as a mode of interacting with mobile technology.
Mobile augmented reality: design issues and opportunities BIBAFull-Text 749-752
  Marco de Sa; Elizabeth F. Churchill; Katherine Isbister
With the rapid evolution of mobile devices, smartphones in particular, comes the ability to create new experiences that enhance the way we see, interact, and express ourselves within the world that surrounds us. We can blend data from our senses and our devices in myriad ways that simply weren't possible before. This workshop explores the current and future state of mobile augmented reality. We will promote discussion about issues and opportunities in the space, explore the potential for innovation, and pursue opportunities for collaboration between researchers working on augmented reality. We envision a lively discussion of the different approaches, challenges and benefits that may arise from the use of mobile augmented reality in the near future from an HCI perspective. We also aim at fostering new collaborations and establishing a research agenda within the field of mobile augmented reality.
Internet of things marries social media BIBAFull-Text 753-755
  Joakim Formo; Jarmo Laaksolahti; Marcus Gårdman
What happens when non-human objects enter social media and start imitating social relations with people? Starting from a social networking stance towards connected objects, this workshop looks at the challenges of designing for the dualism in objects that consist of a physical thing and a digital representation online, and at how we could develop user interaction models that are understandable, liberated from screens, and able to move out into the physical world.
Body, movement, gesture & tactility in interaction with mobile devices BIBAFull-Text 757-759
  Sven Kratz; Michael Rohs; Katrin Wolf; Jörg Müller; Mathias Wilhelm; Carolina Johansson; Jakob Tholander; Jarmo Laaksolahti
In the search for novel and more expressive interaction techniques for mobile devices, bodily aspects such as movement, gesture, and touch-based interfaces are prominent. For instance, touch-screen gestures have found widespread application in mobile device interfaces, while bodily gestures involving device movement are successfully applied in gaming scenarios. Research systems increasingly explore other modalities, like pressure, free-hand, and on-body interaction in mobile settings. This has become possible through on-going developments that have made sensing and actuating technologies cheaper and more easily integrated into mobile and handheld devices. The turn towards experiential, embodied, and enacted perspectives on cognition and action has also contributed to a shift in which aspects of interaction to focus upon in interaction design. This has led HCI researchers to explore not only how the whole human body can be taken into account in design, but also new domains of application, for instance in leisure, entertainment, and public urban environments.
Mobile wellness: collecting, visualizing and interacting with personal health data BIBAFull-Text 761-763
  Konrad Tollmar; Frank Bentley; John Moore; Alex Olwal
Mobile devices are now able to connect to a variety of sensors and provide personalized information to help people reflect on and improve their health. For example, pedometers, heart-rate sensors, glucometers, and other sensors can all provide real-time data to a variety of devices. Collecting and interacting with personal health or well-being data is a growing research area. This workshop will focus on the ways in which our mobile devices can aggregate and visualize these types of data and how these data streams can be presented to encourage interaction, increased awareness and positive behavior change.
Designing and evaluating mobile systems for collocated group use BIBAFull-Text 765-768
  Nirmal J. Patel; James Clawson
With the proliferation of mobile devices it has become common to see groups of users working or playing together using multiple mobile devices. While much effort is exerted to ensure that interaction with a mobile device is useful for each individual user, less effort has gone into considering how to design and evaluate mobile interfaces and platforms for group use. Recent improvements in the interaction, computing, connectivity and general flexibility of mobile devices make them an ideal, yet underutilized, platform for group level interaction. Our goal with this workshop is to bring together researchers who have started to investigate the collocated group use of mobile devices and to shed light on the challenges of designing and evaluating mobile collocated group experiences.