
ACM SIGCHI 2010 Symposium on Engineering Interactive Computing Systems

Fullname: Proceedings of the 2nd ACM SIGCHI Symposium on Engineering Interactive Computing Systems
Editors: Noi Sukaviriya; Jean Vanderdonckt; Michael Harrison
Location: Berlin, Germany
Dates: 2010-Jun-19 to 2010-Jun-23
Standard No: ISBN: 1-4503-0083-9, 978-1-4503-0083-4; ACM DL: Table of Contents; hcibib: EICS10
Papers: 60
Pages: 378
Links: Conference Home Page
  1. Keynote
  2. Supporting context and inference
  3. Tool support for interface development
  4. Specifying interactive systems
  5. Modeling for analysis of interactive systems
  6. Interaction techniques and technologies
  7. Keynote: Interaction techniques and technologies
  8. Collaboration, business and web orchestration
  9. Posters
  10. Demonstrations
  11. Doctoral consortium
  12. Workshops
  13. Tutorials

Keynote

User interface plasticity: model driven engineering to the limit! BIBAKFull-Text 1-8
  Joëlle Coutaz
Ten years ago, I introduced the notion of user interface plasticity to denote the capacity of user interfaces to adapt, or to be adapted, to the context of use while preserving usability. The Model Driven Engineering (MDE) approach, which has been used for user interface generation in HCI since the early eighties, has recently been revived to address this complex problem. Although MDE has produced interesting and convincing results for conventional WIMP user interfaces, it has not yet fully delivered on its theoretical promise. In this paper, we discuss how to push MDE to the limit, reconciling high-level modeling techniques with low-level programming in order to go beyond WIMP user interfaces.
Keywords: dynamic service composition, model driven engineering (mde), run time adaptation, service-oriented architecture (SOA), user interface adaptation, user interface composition, user interface generation, user interface plasticity

Supporting context and inference

Bridging models and systems at runtime to build adaptive user interfaces BIBAKFull-Text 9-18
  Marco Blumendorf; Grzegorz Lehmann; Sahin Albayrak
Adapting applications and user interfaces at runtime requires a deeper understanding of the underlying design. Models formalize this design, express the underlying concepts and make them accessible to machines. In our work we utilize runtime models to reflect the state of the interactive system (and of its UI) and to change its underlying configuration. So-called executable models combine design information, runtime state, and execution logic. From the perspective of adaptive UIs, this allows the dynamic reconfiguration of UIs according to design information and the current state of the application at runtime. Dedicated elements of the model create a causal interconnection between model and user interface and facilitate a continuous exchange of information between the two. This creates a feedback loop between model and UI in which external stimuli influence the model execution and projections to the outside allow the dynamic alteration of user interfaces.
Keywords: adaptive user interfaces, executable models, model-based user interface development, model-driven engineering
Software refactoring process for adaptive user-interface composition BIBAKFull-Text 19-28
  Anthony Savidis; Constantine Stephanidis
Adaptive user-interface composition is the ability of a software system to: (a) compose its user interface at runtime according to a given deployment profile; and (b) possibly drop running components and activate better alternatives in their place in response to deployment-profile modifications. While adaptive behavior has gained interest for a wide range of software products and services, supporting it is very demanding, requiring the adoption of user-interface architectural patterns from the early software design stages. Previous research addressed the issue of engineering adaptive systems from scratch, but an important methodological gap remains: we lack processes to reform existing non-adaptive systems towards adaptive behavior. We present a stepwise transformation process for user-interface software that incrementally upgrades the relevant class structures towards adaptive composition, treating adaptive behavior as a cross-cutting concern. All our refactoring examples have emerged from real practice.
Keywords: adaptive user interfaces, software engineering, software process, source code refactoring
How assessing plasticity design choices can improve UI quality: a case study BIBAKFull-Text 29-34
  Audrey Serna; Gaëlle Calvary; Dominique L. Scapin
In Human-Computer Interaction, plasticity refers to the capacity of User Interfaces (UIs) to withstand variations in the context of use while preserving quality in use. Frequently, ensuring a more or less smooth transition from one context of use to another (from the end-user's perspective) is done ad hoc. To support a more systematic approach to characterizing UI tuning in terms of quality in use across context-of-use variations, we present an exploratory study deliberately focused on platform aspects. The design process of this particular case study is detailed, and all design decisions have been recorded in terms of their influence on UI ergonomic quality, using Ergonomic Criteria. The interesting result is that most design choices made when changing the platform led to a reexamination of the initial designs. Ongoing work supports the insight that considering plasticity helps to explicitly broaden UI design choices and sharpen the solution.
Keywords: plasticity, quality in use, ui design
Using ensembles of decision trees to automate repetitive tasks in web applications BIBAKFull-Text 35-40
  Zachary Bray; Per Ola Kristensson
Web applications such as web-based email, spreadsheets and form filling applications have become ubiquitous. However, many of the tasks that users try to accomplish with such web applications are highly repetitive. In this paper we present the design of a system we have developed that learns and thereafter automates users' repetitive tasks in web applications. Our system infers users' intentions using an ensemble of decision trees. This enables it to handle branching, generalization and recurrent changes of relative and absolute positions. Our evaluation shows that our system converges to the correct solution after 3-8 iterations when the pattern is noise-free, and after 3-14 iterations for a noise level between 5-35%.
Keywords: end-user programming, programming by example
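The inference step described in the abstract above can be sketched as a tiny majority-vote ensemble of one-level decision trees; the feature encoding, action labels and stump design below are illustrative assumptions, not the authors' actual system:

```python
from collections import Counter

class DecisionStump:
    """A one-level decision tree that splits on a single feature index."""
    def __init__(self, feature):
        self.feature = feature
        self.labels = {}  # feature value -> majority action

    def fit(self, examples):
        buckets = {}
        for features, action in examples:
            buckets.setdefault(features[self.feature], []).append(action)
        self.labels = {value: Counter(actions).most_common(1)[0][0]
                       for value, actions in buckets.items()}
        return self

    def predict(self, features):
        return self.labels.get(features[self.feature])

def ensemble_predict(stumps, features):
    """Majority vote over the ensemble, ignoring stumps with no opinion."""
    votes = Counter(s.predict(features) for s in stumps)
    votes.pop(None, None)
    return votes.most_common(1)[0][0] if votes else None

# Toy interaction trace for a repetitive form-filling task:
# features = (row parity, column name), label = the recorded user action.
examples = [((0, "email"), "copy"), ((0, "email"), "copy"),
            ((0, "name"), "skip"), ((1, "email"), "copy")]
stumps = [DecisionStump(f).fit(examples) for f in range(2)]
print(ensemble_predict(stumps, (1, "email")))  # -> copy
```

An ensemble of such weak learners can generalize across relative and absolute positions because different stumps key on different features of the recorded trace.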
Xplain: an editor for building self-explanatory user interfaces by model-driven engineering BIBAKFull-Text 41-46
  Alfonso García Frey; Gaëlle Calvary; Sophie Dupuy-Chesa
Modern User Interfaces (UIs) must deal with the increasing complexity of applications in terms of functionality, as well as with new properties such as plasticity. The plasticity of a UI denotes its capacity to adapt to the context of use while preserving its quality. Efforts in plasticity have focused on the (meta-)modeling of the UI, but quality remains largely unaddressed. This paper describes ongoing research into a method for developing Self-Explanatory User Interfaces, as well as an editor that implements this method. Self-explanation refers to the capacity of a UI to provide the end-user with information about its rationale (what is the purpose of the UI?), its design rationale (why is the UI structured into this set of workspaces? what is the purpose of this button?), its current state (why is the menu disabled?) and the evolution of that state (how can I enable this feature?). Explanations are provided by embedded models.
Keywords: design rationale, help, model transformation, model-driven engineering, self-explanatory user interfaces, ui quality

Tool support for interface development

Increasing the automation of a toolkit without reducing its abstraction and user-interface flexibility BIBAKFull-Text 47-56
  Prasun Dewan
The apparent tradeoff between user-interface automation on the one hand and abstraction and user-interface flexibility on the other can be overcome using two key ideas. (1) It is possible to automate several common aspects of a user interface without controlling its appearance. (2) By following well-established programming principles, developers can provide user-interface tools with the information needed for such automation. These ideas are used in a new approach that assumes that programmers (a) encapsulate the semantics of interactive applications in model objects, (b) use consistent ways to relate the signatures of related methods, (c) define method preconditions, and (d) use annotations for documentation. The approach uses these principles to automate (a) binding of input events to synchronous and asynchronous invocation of model methods, (b) syntactic and semantic validation of user input, (c) binding of model state to display state, (d) undo/redo, and (e) dynamic enabling/disabling of display components. The result is an approach for increasing the automation of UI toolkits without reducing their abstraction and user-interface flexibility.
Keywords: user interface tools, MVC, preconditions, undo, redo
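The precondition idea in the abstract above can be sketched in a few lines: a toolkit reflects over a model object and enables each command exactly when its precondition holds. The `pre_` naming convention and `CounterModel` below are illustrative assumptions, not the paper's actual API:

```python
class CounterModel:
    """Model object: application semantics only, no widget code."""
    def __init__(self):
        self.value = 0

    def increment(self):
        self.value += 1

    def decrement(self):
        self.value -= 1

    # Consistent naming convention: pre_<method> is <method>'s precondition.
    def pre_decrement(self):
        return self.value > 0

def enabled_commands(model):
    """Toolkit-side routine: a command is enabled iff its precondition
    holds; methods without a precondition are always enabled."""
    names = [n for n in dir(model)
             if callable(getattr(model, n))
             and not n.startswith(("_", "pre_"))]
    return {n: getattr(model, "pre_" + n, lambda: True)() for n in names}

model = CounterModel()
print(enabled_commands(model))   # decrement disabled while value == 0
model.increment()
print(enabled_commands(model))   # both commands now enabled
```

A real toolkit would re-evaluate the preconditions after every model method invocation and grey out the corresponding display components automatically.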
User interface design by sketching: a complexity analysis of widget representations BIBAKFull-Text 57-66
  Suzanne Kieffer; Adrien Coyette; Jean Vanderdonckt
User interface design by sketching, like other sketching activities, typically involves sketching objects through representations that should combine meaningfulness for end users with ease of recognition by the recognition engine. To investigate this relationship, a multi-platform user interface design tool has been developed that enables designers to sketch design ideas at multiple levels of fidelity, with multi-stroke gestures supporting widget representations and commands. A usability analysis of these activities, as they are submitted to a recognition engine, suggests that the level of fidelity, the number of constraints imposed on the representations, and the visual distinctiveness of representations positively impact the sketching activity as a whole. Implications for further sketch representations in user interface design and beyond are provided, based on usability guidelines.
Keywords: level of fidelity, shape recognition, sketching, user interface design, user interface prototyping
An automated routine for menu structure optimization BIBAKFull-Text 67-76
  Mikhail V. Goubko; Alexander I. Danilenko
We propose an automated routine for hierarchical menu structure optimization. A computer advice-giving system founded on a mathematical model of menu navigation directs a designer-driven process of sequential enhancement, while the menu designer caters for the semantic quality of the menu labels and groupings used. The mathematical model employs function usage frequencies and estimates of navigation time delays to calculate the average search time for the current and the optimal menu structure, to identify the "bottleneck" panels of the current menu, and to suggest directions for their improvement. The model covers a variety of menu types and allows choosing the best type to meet the requirements of a specific application or user category. The approach is illustrated by the optimization of a mobile phone command menu.
Keywords: depth vs. breadth, menu design automation, menu of mobile device, menu-driven system, optimal hierarchy, usability
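The frequency-weighted average search time that drives such an optimization can be sketched as follows; the specific cost model (a fixed cost per level opened plus a linear scan cost per item) and the example frequencies are illustrative assumptions, not the authors' model:

```python
def weighted_time(menu, freqs, t_open=1.0, t_scan=0.25, prefix=0.0):
    """Sum of frequency * selection-time over all leaf commands.
    Opening a level costs t_open; scanning down to position p in that
    level costs (p + 1) * t_scan. Nested lists are submenus."""
    total = 0.0
    for pos, entry in enumerate(menu):
        cost = prefix + t_open + (pos + 1) * t_scan
        if isinstance(entry, list):          # a submenu at this position
            total += weighted_time(entry, freqs, t_open, t_scan, cost)
        else:
            total += freqs[entry] * cost
    return total

def avg_search_time(menu, freqs, **kw):
    return weighted_time(menu, freqs, **kw) / sum(freqs.values())

freqs = {"call": 50, "sms": 30, "alarm": 5, "calc": 5}
flat = ["call", "sms", "alarm", "calc"]
nested = ["call", "sms", ["alarm", "calc"]]   # rare items one level down

# For these frequencies the flat layout wins: the submenu's opening
# overhead outweighs the shorter top level.
print(avg_search_time(flat, freqs), avg_search_time(nested, freqs))
```

An optimizer in this spirit searches over candidate hierarchies for the one minimizing this average, which is exactly the depth-vs-breadth tradeoff named in the keywords.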
Sketched menu: a tabletop-menu technique for GUI object creation BIBAKFull-Text 77-86
  Mohammed Belatar; François Coldefy
In this paper, we describe the Sketched Menu, a menu technique for launching Graphical User Interface (GUI) objects on a tabletop interface. The Sketched Menu enables the user to interactively define the shape, size, orientation, and location of a new GUI object before it is launched. These parameters are specified implicitly by the user via a shape drawing; the shape corresponds to a simplified outline of the object. A menu for the given shape is launched at the exact location of the drawing, and the user can then select the desired graphical object or application. A laboratory experiment was conducted to compare this new menu technique with the traditional pop-up menu approach. The results show that object-creation time and positioning accuracy are similar for the two techniques. There are two advantages to using the Sketched Menu. Firstly, graphical discontinuities are avoided when adding objects to the interface. Secondly, because the object appears exactly in the desired location and orientation, it does not hide other graphical objects, thereby reducing disruption to other users' tasks.
Keywords: collocated collaboration, cscw, gui orientation, interaction technique, menu technique, tabletop display
Magellan, an evolutionary system to foster user interface design creativity BIBAKFull-Text 87-92
  Dimitri Masson; Alexandre Demeure; Gaelle Calvary
Fostering creativity in User Interface (UI) design is a key challenge for innovation. This paper explores the combination of model-based approaches and interactive genetic algorithms to support exploration of the design space. A user task model is given as input, and Magellan produces sketches of UIs that aim to inspire the designer. Later on, appropriate tools may be used to tune the right design into the design right. Magellan is a proof of concept that deserves further exploration. It is currently implemented using COMETs, but it is not dependent on this technology.
Keywords: creativity, interactive genetic algorithm, Magellan, model-based user interface design
MoPeDT: features and evaluation of a user-centred prototyping tool BIBAKFull-Text 93-102
  Karin Leichtenstern; Elisabeth André
User-Centred Prototyping (UCP) tools are expected to support interface developers in designing, evaluating and analysing user-friendly products more efficiently, effectively and satisfactorily through an all-in-one tool solution. We developed such a UCP tool, called MoPeDT, that supports the user-centred development of interactive evolutionary prototypes for mobile phones in the context of the Internet of Things. In this paper we describe our tool's features for the design, evaluation and analysis phases, most of which build on meaningful features of related tools. Additionally, we cover potential enhancements over former UCP tools that give interface developers a wide-ranging playground for investigating users' behaviour and preferences. The paper also describes a one-month evaluation of MoPeDT with 20 students that investigated interface developers' efficiency, effectiveness and satisfaction when applying our UCP tool. As an outcome of this study, we describe potential benefits and problems that might be of interest to other developers of UCP tools.
Keywords: evolutionary prototypes, internet of things, mobile phones, pervasive interface, user-centred prototyping tool
Digisketch: taming Anoto technology on LCDs BIBAKFull-Text 103-108
  Ramon Linus Hofer; Andreas Kunz
The Anoto technology uses a non-repetitive pattern printed on paper to enable a camera-equipped pen to locate its absolute position on that pattern. This technology is also used on projection screens to create large interactive areas, but suffers from drawbacks such as shadow casting and space requirements. Until now, no implementation has enabled tracking on LC displays using the Anoto technology. We therefore introduce Digisketch, which uses special films that can be applied to LC displays, to back and front projections, or to glass, allowing pattern recognition by the pen's camera. After describing the technical development of a prototype, we compare this new way of using Anoto-compatible surfaces with other traditional tracking systems for LC screens.
Keywords: Anoto, LCD, optical tracking, pattern recognition, pen tracking, user study
WebWOZ: a wizard of oz prototyping framework BIBAKFull-Text 109-114
  Stephan Schlögl; Gavin Doherty; Nikiforos Karamanis; Saturnino Luz
Language Technology (LT) based applications are becoming more popular as the technology improves. Prototyping early in the design process is critical to the development of high-quality applications. It is difficult, however, to do low-fidelity prototyping (e.g. paper prototyping) of applications based on LT. One technique that has been used for this kind of prototyping is Wizard of Oz (WOZ). However, this generally involves the development of one-off user and wizard interfaces. A tool that facilitates the flexible integration of LT components into WOZ experiments is therefore desirable. In this paper we explore the requirements for such a tool, drawing on the literature and on a first WOZ experiment in which different wizards were observed and their behaviour analysed.
Keywords: language technology, prototyping, wizard of oz

Specifying interactive systems

Improving modularity and usability of interactive systems with Malai BIBAKFull-Text 115-124
  Arnaud Blouin; Olivier Beaudoux
In this paper we present Malai, a model-based user interface development environment dedicated to the design of post-WIMP (Window, Icon, Menu, Pointing device) interactive systems. Malai aims to bring together principles from Norman's action model, instrumental interaction, direct manipulation, the interactor concept and the DPI (Documents, Presentations, Instruments) model. It complements work on data manipulation techniques used to link source data to user interfaces. We show how Malai can improve the modularity and usability of interactive systems by treating actions, interactions and instruments as reusable first-class objects. Malai has been successfully used for the development of several post-WIMP interactive systems. We introduce each Malai component using the same example: a vector graphics editor.
Keywords: action, instrument, interaction, interactive system, MDE, user interface
COMM notation for specifying collaborative and multimodal interactive systems BIBAKFull-Text 125-134
  Frédéric Jourde; Yann Laurillau; Laurence Nigay
Multi-user multimodal interactive systems involve multiple users who can use multiple interaction modalities. Although multi-user multimodal systems are becoming more prevalent (especially those involving multitouch surfaces), their design is still ad hoc, without proper tracking of the design process. Addressing this lack of design tools for multi-user multimodal systems, we present the COMM (Collaborative and MultiModal) notation and its online editor for specifying multi-user multimodal interactive systems. Extending the CTT notation, the salient features of the COMM notation include the concepts of interactive role and modal task, as well as a refinement of the temporal operators applied to tasks using the Allen relationships. A multimodal military command post for the control of unmanned aerial vehicles (UAVs) by two operators illustrates the discussion.
Keywords: groupware, multimodal interaction, specification notation
Representations for an iterative resource-based design approach BIBAKFull-Text 135-144
  Anke Dittmar; Michael D. Harrison
This paper describes how the HOPS notation can be used to support Human Centered Design. It discusses the role of the notation in providing multiple design viewpoints. It demonstrates how the HOPS tool can be used to animate these viewpoints. Finally, HOPS is used to specify how the system provides information resources for user action. This approach to specifying plausible user behavior is contrasted with a task based approach. The HOPS based design techniques are illustrated through a process control example.
Keywords: human centered design, task representation, viewpoints

Modeling for analysis of interactive systems

User interface model discovery: towards a generic approach BIBAKFull-Text 145-154
  Andy Gimblett; Harold Thimbleby
UI model discovery is a lightweight formal method in which a model of an interactive system is automatically discovered by exploring the system's state space, simulating the actions of a user; such models are then amenable to automatic analysis targeting structural usability concerns. This paper specifies UI model discovery in some detail, providing a formal, generic and language-neutral API and discovery algorithm. The technique has been implemented in prototype systems on several programming platforms, yielding valuable usability insights. The API described here supports further development of these ideas in a systematic manner.
Keywords: discovery tools, interaction programming, reverse engineering, structural usability
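The core of such a discovery algorithm can be sketched as a breadth-first crawl over two callbacks that report a state's available actions and the result of performing one; the callback names and the toy three-screen UI below are illustrative assumptions, not the paper's actual API:

```python
from collections import deque

def discover(initial, actions, perform):
    """Breadth-first exploration of a UI's state space, simulating a
    user who tries every available action in every reachable state."""
    model = {}                               # state -> {action: next state}
    frontier = deque([initial])
    while frontier:
        state = frontier.popleft()
        if state in model:
            continue                         # already explored
        model[state] = {}
        for action in actions(state):
            nxt = perform(state, action)
            model[state][action] = nxt
            if nxt not in model:
                frontier.append(nxt)
    return model

# Toy three-screen UI described by two lookup tables.
ACTIONS = {"locked": ["unlock"],
           "home":   ["lock", "open_app"],
           "app":    ["back"]}
NEXT = {("locked", "unlock"): "home", ("home", "lock"): "locked",
        ("home", "open_app"): "app", ("app", "back"): "home"}

model = discover("locked", ACTIONS.__getitem__, lambda s, a: NEXT[(s, a)])
print(sorted(model))  # ['app', 'home', 'locked']
```

The resulting transition graph is the kind of model that can then be checked automatically for structural usability properties, such as unreachable screens or missing undo paths.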
Taxonomy proposal for the description of accidents and incidents in the electrical systems operation BIBAKFull-Text 155-164
  Daniel Scherer; Raffael C. da Costa; Joálison G. Barbosa; Maria de Fátima Q. Vieira
Documenting accidents and incidents in a clear, precise and unambiguous way is essential for grounding the study of error and proposing preventive measures. Research in the field of human error has yielded many taxonomies and categorizations of errors, which companies use when analyzing and reporting on accidents and incidents. However, there are difficulties regarding the appropriateness of the terms as applied to a company's specific context, since taxonomies tend to be specific. Moreover, the lack of a clear organization and the adoption of concepts that are not shared by users with different backgrounds make such taxonomies rather difficult to use. This paper proposes a taxonomy, and an associated supporting tool, for describing accidents and incidents in the operation of electrical systems. The proposed taxonomy addresses the concepts and terms related to the causes of error as well as concepts related to the user cognitive processes that led to the error.
Keywords: human error, taxonomy, user's activities
Beyond modelling: an integrated environment supporting co-execution of tasks and systems models BIBAKFull-Text 165-174
  Eric Barboni; Jean-François Ladry; David Navarre; Philippe Palanque; Marco Winckler
This paper focuses on the articulation of task models and system models. Task models are meant to be used by human-factors specialists, whilst system models are typically produced by software engineers. However, task models and system models represent two different views of how users interact with a computing system to reach a goal. This paper presents an integration framework that aims to take full advantage of task models and system models that were initially developed separately, and shows how these two views can be integrated at the model level and, additionally, at the tool level. The main contribution of the paper lies in the definition of such integration at the tool level, to be used at runtime (while the user is operating the system). Thanks to this integration, contextual help can be offered to users, supporting the construction of the mental bridge between what they have to do (defined in the task model) and what the interactive system allows (defined in the system model). The approach, the tools and the integration are presented on a case study of a Weather Radar System (WXR) embedded in aircraft cockpits.
Keywords: models integration, task and systems models, tool support
Developing usability studies via formal models of UIs BIBAKFull-Text 175-180
  Judy Bowen; Steve Reeves
Developing usability studies to evaluate software is a task requiring a wide variety of skills. For software developers who are not used to taking a user-centred approach to development it is often easier and more convenient to dismiss the use of user evaluation as too time-consuming or too hard. This is even more likely to be the case for developers who take a formal approach to software development, which is generally not focused on interface or usability concerns. In this paper we present an early investigation into the use of formal models of user interface designs as the basis for designing software evaluation studies. We have undertaken a comparison study to find out whether a useful study can be derived in this way and whether or not further investigation into this is worthwhile, and we present the results here.
Keywords: formal methods, software evaluation, ui design, ui models, usability studies
The GUISurfer tool: towards a language independent approach to reverse engineering GUI code BIBAKFull-Text 181-186
  João Carlos Silva; Carlos C. Silva; Rui D. Gonçalo; João Saraiva; José Creissac Campos
Graphical user interfaces (GUIs) are critical components of today's software, and developers are dedicating an ever larger portion of their code to implementing them. Given this increased importance, the correctness of GUI code is becoming essential. This paper describes the latest results in the development of GUISurfer, a tool to reverse engineer the GUI layer of interactive computing systems. The ultimate goal of the tool is to enable the analysis of interactive systems from their source code.
Keywords: analysis, graphical user interfaces, source code

Interaction techniques and technologies

Feasible database querying using a visual end-user approach BIBAKFull-Text 187-192
  Clemente Rafael Borges; José Antonio Macías
Querying databases is a common daily task carried out by a great number of end-users who have no specific skills in SQL. Today, most database interaction is achieved by means of query interfaces provided by the database environment. However, most of these interfaces suffer from expressive limitations, since they are mostly based on metaphors that drastically restrict the expressiveness of the SQL that is generated and executed in the background. In this paper, we present a visual interaction language and tool focused on making databases easy for end-users to query. We make no assumption about the level of the user's experience with query languages, as our visual metaphor is intended for querying databases by unskilled end-users while relaxing the restrictions on the expressiveness of the queries they create. We also report on some late-breaking results obtained from an experiment carried out with real users.
Keywords: data warehouse, end-user development (EUD), usability, visual interfaces, web-based interaction
Letras: an architecture and framework for ubiquitous pen-and-paper interaction BIBAKFull-Text 193-198
  Felix Heinrichs; Jürgen Steimle; Daniel Schreiber; Max Mühlhäuser
Paper remains a prevalent medium in mobile usage contexts due to its inherent flexibility and robustness. Mobile computing solutions are beginning to provide powerful and convenient functionality, yet the gap between paper documents and digital applications remains unbridged in mobile settings. Current toolkits do not offer adequate support for the development of mobile pen-and-paper based applications, as they lack support for two important mobile characteristics of real paper: user mobility and document mobility. To overcome these limitations, we present a novel generic architecture, along with its reference implementation Letras, a light-weight, freely available infrastructure for developing pen-and-paper based applications in mobile settings.
Keywords: anoto, development tools/toolkits/programming environments, digital pen, handheld devices and mobile computing, pen and tactile input, ubiquitous computing/smart environments
Adapting existing applications to support new interaction technologies: technical and usability issues BIBAKFull-Text 199-204
  Darren Andreychuk; Yaser Ghanam; Frank Maurer
Engineering interactive systems for use on emerging technologies such as touch-enabled devices and horizontal displays is not straightforward. Firstly, the migration process of a system from an old hardware platform to new multi-touch displays is challenging. Issues pertaining to scaling, orientation, new input mechanisms, novel interaction techniques and different SDKs need to be examined. Secondly, even after we manage to understand and resolve these issues, we need to find effective ways to migrate applications and maintain them.
   This paper contributes a thorough analysis of the technical and usability issues that need to be considered when migrating systems to different touch-enabled technologies including vertical and horizontal displays.
Keywords: adaptability, evolution, surfaces
Semantic awareness through computer vision BIBAKFull-Text 205-210
  Sami Benzaid; Prasun Dewan
An important application of multi-user interfaces is distributed presentations. In such presentations, presenters cannot assess the audience's real-time level of interest through observation, as they would in a physical lecture room. Using computer vision techniques, we aim to chart a path that, if followed, could lead to a robust technique for providing this information in such presentations in real time.
Keywords: cscw, distributed presentation, semantic awareness

Keynote : Interaction techniques and technologies

Model engineering for model-driven engineering BIBAKFull-Text 211-212
  Axel van Lamsweerde
The effectiveness of model-driven engineering relies on our ability to build high-quality models. This task is intrinsically difficult. We need to produce sufficiently complete, adequate, consistent, and well-structured models from incomplete, imprecise, and sparse material originating from multiple, often conflicting sources. The system we need to consider in the early stages comprises software and environment components including people and devices.
   Such models should integrate the intentional, structural, functional, and behavioral facets of the system being developed. Rigorous techniques are needed for model construction, analysis, and evolution. They should support early and incremental reasoning about partial models for a variety of purposes, including satisfaction arguments, property checks, animations, the evaluation of alternative options, the analysis of risks, threats and conflicts, and traceability management. The tension between technical precision and practical applicability calls for a suitable mix of heuristic, deductive, and inductive forms of reasoning on a suitable mix of declarative and operational models. Formal techniques should be deployed only when and where needed, and kept hidden wherever possible.
   The talk will provide a retrospective account of our research efforts and practical experience along this route, including recent progress in model engineering for safety-critical medical workflows. Problem-oriented abstractions, analyzable models, and constructive techniques are pervasive concerns.
Keywords: model-driven engineering, requirements, system design

Collaboration, business and web orchestration

Collaboratively maintaining semantic consistency of heterogeneous concepts towards a common concept set BIBAKFull-Text 213-218
  Jingzhi Guo; Iok Ham Lam; Chun Chan; Guangyi Xiao
In e-business, creating a common concept set for business integration, interoperation and interaction must contend with the reality of heterogeneous interpretations from multiple concept providers. Maintaining semantic consistency between multiple concept providers is a difficult problem. To address it, this paper first reviews existing technologies for collaborative editing and consistency maintenance in the areas of both CSCW and e-business. It then proposes a novel CHCES approach, which topologically divides a collaborative editing system into two layers and introduces four strategies for editing common concepts between the two layers. A set of operations is designed that demonstrates the solution.
Keywords: collaborative editing, electronic business, semantic consistency maintenance
Exploiting web service annotations in model-based user interface development BIBAKFull-Text 219-224
  Fabio Paternò; Carmen Santoro; Lucio Davide Spano
In this paper we present a method and the associated tool support able to exploit the content of Web service annotations in model-based user interface design and development. We also show an example application of the proposed approach.
Keywords: annotations, model-based user interface design, task models, web services
Mixed-focus collaboration without compromising individual or group work BIBAKFull-Text 225-234
  Prasun Dewan; Puneet Agarwal; Gautam Shroff; Rajesh Hegde
In mixed-focus collaboration, users "continuously" switch between "individual" and "group" work. We have developed a new two-person interaction mechanism, coupled tele-desktops, that is, arguably, not biased towards individual or group work. We evaluate this mechanism, and the general idea of mixed-focus collaboration, using a new quantitative framework consisting of (a) a set of precisely-defined coupling modes determining the extent of individual and group work, and (b) the times spent in, durations of, and number of transitions among these modes. We describe a new visualization scheme for compactly displaying these metrics in an individual collaborative session. We use this framework to characterize about forty-six person-hours of use of coupled tele-desktops, most of which involved collaborative use of a UI builder. Our results include (a) quantitative motivation for coupled tele-desktops, and (b) several new quantitative observations, and quantification of several earlier qualitative observations regarding mixed-focus collaboration.
Keywords: awareness, coupling, side-by-side collaboration
Virtual collaborative environments with distributed multitouch support BIBAKFull-Text 235-240
  Oscar Ardaiz; Ernesto Arroyo; Valeria Righi; Oriol Galimany; Josep Blat
In this paper, we present a new application framework aimed at supporting distributed synchronous collaboration using multitouch interaction. The framework supports 2D and 3D virtual workspaces that enable two or more users to collaboratively or cooperatively manipulate shared objects with multitouch interfaces. We present two applications developed to explore 2D/3D immersive collaborative environments with multitouch interaction. We also present our experience and preliminary results in designing, developing and integrating these applications in educational settings.
Keywords: distributed virtual environment, multitouch interaction, remote collaboration
Aligning business goals and user goals by engineering hedonic quality BIBAKFull-Text 241-250
  Kerstin Klöckner; Kirstin Kohler; Daniel Kerkow; Sabine Niebuhr; Claudia Nass
The following paper deals with quality properties that extend the traditional understanding of usability, with its focus on pragmatic aspects such as efficiency and effectiveness of task performance, in the context of business applications. The contribution of the approach is twofold. First, we show how psychological theories about motivation and creativity can bridge the gap between business goals and users' goals and attitudes through interaction design. We introduce an engineering approach that allows designers to deliberately design for fun/joy in a given business context. Second, we show how to reuse the experience from former or other projects by describing the interaction design as pattern candidates. This approach has been applied successfully many times, and we elaborate it in a case study conducted for one of our clients, including an empirical evaluation that shows improved working behavior and increased user acceptance of the software.
Keywords: business goals, fun-of-use, hedonic quality, interaction pattern, joy-of-use, user experience, user interface engineering
Activity-centric support for weakly-structured business processes BIBAKFull-Text 251-260
  Benedikt Schmidt; Todor Stoitsev; Max Mühlhäuser
Knowledge-intensive tasks are a blind spot for business process management systems, as these tasks are executed in an unsupervised, highly individual manner. Hence, individual experience is not disseminated and task execution largely depends on implicit knowledge.
   In this paper we present a framework realizing situation-specific and personalized task execution support for knowledge-intensive tasks in business processes. As a core concept we suggest the activity scheme: a structure capturing a probabilistic task execution model. Activity schemes seamlessly integrate the organizational business process with the individual task execution process based on personalization and generalization of user interactions in the working applications.
Keywords: human-computer interaction, knowledge work support, task execution support

Posters

Service discovery supported by task models BIBAKFull-Text 261-266
  Kyriakos Kritikos; Fabio Paternò
We propose an approach that takes as input a task model, which includes the user's view of the interactive system, and automatically discovers a set of categorized and ranked service descriptions for each system task of the model. In this way, a set of service operations can be used to implement part or all of an application's functionality, so that its development time is significantly reduced.
Keywords: interactive application design, semantic service discovery, service front-ends, term to ontology concept matching
Design pattern TRABING: touchscreen-based input technique for people affected by intention tremor BIBAKFull-Text 267-272
  Alexander Mertens; Nicole Jochems; Christopher M. Schlick; Daniel Dünnebacke; Jan Henrik Dornberg
Tremor patients frequently face problems when interacting with IT systems and services. They do not reach the same levels of input efficiency and easily become unhappy with a technology they do not perceive as a general asset. Intention tremor in particular causes a significant rise in inaccurate movement towards physical or virtual buttons on touch screens, as its symptoms intensify when approaching a desired physical target. People suffering from this specific tremor have been identified as the target group. This group has been closely investigated, and a new input technique has been developed which may be used on standard touch screens. The new technique enables users, namely tremor patients, to fully operate IT-based systems and thus retain full control over input. Deviations caused by the tremor are compensated by a continuous movement instead of a single targeted move, which remains the most difficult task for the user. In addition, the screen surface presents a frictional resistance, which significantly dampens tremor symptoms. Input can be identified by the computer system with high accuracy by means of special heuristics, which support barrier-free access beyond the target group.
Keywords: ambient assisted living (AAL), design pattern, human-computer interaction (hci), touchscreen, tremor
Model-driven GUI & interaction design using emulation BIBAKFull-Text 273-278
  Annika Hinze; Judy Bowen; Yuting Wang; Robi Malik
This paper introduces a model-driven emulator for the interaction and GUI design of complex interacting systems. It allows systems that are engineered using formal methods and modelling to be tested with users before the final implementation. The user interface requirements are also specified in a formal model, which can be tested manually and automatically as required.
Keywords: formal modelling, gui, interaction design, mobile
Using the mobile application EDDY for gathering user information in the requirement analysis BIBAKFull-Text 279-284
  Stephan Hammer; Karin Leichtenstern; Elisabeth André
The users' knowledge and requirements are important factors in the design of products. To ensure the success of products, designers and developers have to get to know their target group users better. Over the years, many innovative user-centered design studies have been conducted. Many researchers believe that these studies could help developers in the process of creating new products. This paper presents a framework called EDDY that aims at facilitating the development of mobile applications for gathering various kinds of data. Such a framework, and applications based on it, should be helpful for data collection during studies such as Cultural Probes and the Experience Sampling Method. These approaches involve the users as the persons who collect the required information themselves. In this paper we evaluate the advantages of EDDY for such studies. We investigated whether there is a significant advantage in using a mobile application for documenting a user's everyday life instead of using a classical kit.
Keywords: context-aware system, cultural probes, experience sampling method (ESM), mobile phones, requirement analysis
History-based device graphical user-interfaces BIBAKFull-Text 285-290
  Olufisayo Omojokun; Prasun Dewan
Due to limited screen space on mobile computers, device GUIs can span multiple screens, requiring tedious scrolling and tabbing for commands. History-based device GUIs can significantly reduce required space by only presenting the commands a user typically needs based on the user's behavior over a short training period. Moreover, history-based UIs and model-based UI generation are symbiotic. Generation relieves programmers from the overhead of logging and interpreting the interaction histories. Conversely, history-based user-interaction noticeably lowers inherent UI generation time by omitting unneeded commands.
Keywords: devices, logging, mobile computing, model-based user-interface generation, personalization, screen space, uims
Bridging the gap: empowering use cases with task models BIBAKFull-Text 291-296
  Daniel Sinnig; Rabeb Mizouni; Ferhat Khendek
Use cases have become the standard for modeling functional requirements, whereas task models are used to capture UI requirements. Despite recent advances, software engineering (SE) and user interface (UI) design methods are poorly integrated, making it difficult for SE and UI teams to collaborate, synchronize their efforts, and avoid inconsistencies. To address these issues, we propose an integrated development methodology for use cases and task models. Both artifacts are used to specify software requirements, but emphasize two different aspects in a complementary manner. The integration consists of using CTT task models to iteratively enrich UI-related steps in the use case model. We demonstrate that such an approach allows for a clear separation of concerns and therefore avoids potential inconsistencies between the two artifacts.
Keywords: development methodology, task models, use cases, user interface development
UsabML: formalising the exchange of usability findings BIBAKFull-Text 297-302
  Johannes Feiner; Keith Andrews; Elmar Krajnc
During the iterative development of interactive software, formative evaluation is often performed to find and fix usability problems early on. The output of a formative evaluation usually takes the form of a prioritised list of usability findings, each finding typically consisting of a description of the problem, how often it occurred, and sometimes a recommendation for a possible solution.
   Unfortunately, the valuable results of formative evaluations are usually collected into a written document. This makes it extremely difficult to automate the handling of usability findings. A more formalised, electronic format for the handover of usability findings would make much more sense.
   UsabML is a formalised structure for reporting usability findings expressed in XML. It allows usability experts and software engineers to import usability findings into bug (issue) tracking systems, to associate usability issues with parts of source code, and to track progress in fixing them.
Keywords: issue tracking, software repositories, standard reporting format, usability findings, xml

Demonstrations

Seamless integration of heterogeneous UI components BIBAKFull-Text 303-308
  Heiko Paulheim; Atila Erdogan
Component-based software engineering is a paradigm aiming at better ways to reuse existing code and to distribute work across teams. Integrating UI components developed with different technologies can be a difficult task which can quickly lead to code-tangling and loss of modularity. In this demo, we present a prototype framework for integrating heterogeneous UI components, using RDF and formal ontologies for unambiguous event and data exchange and minimizing dependencies between integrated components. We will show an example from the emergency management domain using components written in Java and Flex and demonstrate tight, seamless integration, including dragging and dropping objects from Java to Flex and vice versa.
Keywords: component-based software, integration, ontologies, user interfaces
Development of context-adaptive applications on the basis of runtime user interface models BIBAKFull-Text 309-314
  Grzegorz Lehmann; Marco Blumendorf; Sahin Albayrak
One of the challenges faced by developers of applications for smart environments is the diversity of contexts of use. Applications in smart environments must cope with continuously changing contexts of use, so developers need to prepare them for a possibly broad range of situations. Since the developer has no access to all environments in which her application will be executed, it must be possible to simulate different environments and evaluate the behavior of the application at design time. In our demonstration the designer has the possibility to simulate and modify a runtime context model and observe as her application adapts on the fly. In the underlying runtime architecture, applications, defined as sets of models, are adapted automatically on the basis of the information held in the runtime context model. A visual tool enables the user interface developer to access and modify the models at any time and immediately observe the behavior of the application.
Keywords: adaptive user interfaces, executable models, model-based user interface development, model-driven engineering
A demonstration of the flexibility of widget generation BIBAKFull-Text 315-320
  Prasun Dewan
Several user-interface tools have been developed that (semi) automatically generate widgets for interacting with model objects. However, details of the nature of the model-widget mapping and the range of widget compositions that can be automatically created remain largely unpublished, and hence unknown. Moreover, most of this work has considered flat models. Using a variety of user-interfaces, which are proposed as benchmarks for evaluating widget generation, this paper demonstrates and derives a flexible and (semi) automatic algorithm for mapping between model and widget compositions.
Keywords: benchmarks, inheritance, layout, MVC

Doctoral consortium

Interactive model driven graphical user interface generation BIBAKFull-Text 321-324
  David Raneburger
The current multitude of devices with different screen resolutions or graphic toolkits requires different user interfaces (UIs) for the same application. Model Driven UI Development solves this problem by transforming one target device independent specification into several target device dependent UIs. However, the established Model Driven Architecture (MDA) transformation process is not flexible enough to fully support all requirements of UI development. The vision of this thesis is to bridge the gap between the capabilities of model driven software engineering and the requirements of UI development. This work introduces an interactive model driven UI development approach that gives the designer control over the UI during the development process. Additional interactive support enables the designer to make informed design decisions which will ultimately lead to more satisfying UIs.
Keywords: interactive user interface generation, model driven, semi-automatic customization
Understanding the influence of 3D virtual worlds on perceptions of 2D e-commerce websites BIBAKFull-Text 325-328
  Minh Q. Tran
This PhD research aims to understand the influence of consumers' experiences in 3D virtual worlds on their perceptions and expectations of 2D e-commerce websites. Interviews are being conducted with consumers who have shopping experiences in 3D and 2D environments. The contribution of this research will be an understanding of user needs in e-commerce environments. This understanding will be applied to develop e-commerce design guidelines in both 2D and 3D e-commerce environments.
Keywords: 3d virtual worlds, design guidelines, e-commerce, user experience
Integrating end-user support and negotiations to specify requirements for context-based adaptations in a collaboration environment BIBAKFull-Text 329-332
  Syed Sajid Hussain
Context-based adaptation relieves end-users of the overhead of identifying and handling work breakdowns through automated, system-initiated adjustments of the collaborative system's behavior. However, this approach makes adaptation awareness, understanding and revision difficult. We propose a process model guiding the interaction among end-users when dealing with context-based adaptations to avoid these problems. This model also empowers end-users to add exceptions and to specify requirements to add, modify, or delete adaptation policies as amendment requests. End-user negotiations are conducted to remove conflicts arising from conflicting views on adaptations and to reach consensus before an amendment request is implemented.
Keywords: collaborative conflict management, contextual and situation-based collaboration, end-user support, requirement engineering in collaboration environments, socio technical information spaces
The fluid software metadata framework (FSM) BIBAKFull-Text 333-336
  Johannes Feiner
The Fluid Software Metadata (FSM) framework is a dynamic and flexible framework for software repository metadata generation and analysis. FSM aims to improve the setup time for the mining, analysis, and interactive visualisation of repository artefacts. FSM supports the integration of usability findings through XML-based usability reports. Viewing usability issues side-by-side with source code provides several advantages in holistic software development.
Keywords: development tools, framework, metrics, software repositories, usability issues, visualisation, web interface
A model- and pattern-based approach for development of user interfaces of interactive systems BIBAKFull-Text 337-340
  Jürgen Engel
This paper introduces the subject of my PhD thesis, a framework for pattern-based modeling, generation and usability evaluation of interactive systems. It describes the structural aspects of HCI pattern languages and how such languages and patterns for various modeling stages (e.g. task modeling) and abstraction levels can be exploited to automate part of the software development process for interactive applications. The main aspects and the general functionality of the framework as well as the supported development processes are discussed.
Keywords: interactive system, model-driven development, pattern-based development, software generation, user interface
Self-explanatory user interfaces by model-driven engineering BIBAKFull-Text 341-344
  Alfonso García Frey
Modern User Interfaces (UI) must deal with the increasing complexity of applications in terms of functionality as well as new properties such as plasticity. The plasticity of a UI denotes its capacity to adapt to the context of use while preserving its quality. Efforts in plasticity have focused on the (meta-)modeling of the UI, but quality remains unaddressed. We suggest a method for improving the quality of UIs by providing explanations about the design of the UI itself, that is, through self-explanation. Self-Explanatory User Interfaces (SEUI) refers to the capacity of a UI to supply the end-user with information on the rationale of the UI: its constitution (for example, what is the purpose of this button?), its current state (why is the menu disabled?) as well as its evolution (how can I enable this feature?).
   This thesis investigates SEUI through Model Driven Engineering (MDE), where models are kept at run-time, enabling the techniques needed to maintain this link between design and execution.
Keywords: design rationale, help, model transformation, model-driven engineering, self-explanatory user interfaces, ui quality
The triad-based design of rich user interfaces for internet applications BIBAKFull-Text 345-348
  Francisco Javier Martínez-Ruiz
Current trends in web development are still attached to the web page paradigm. Nevertheless, new uses of already available technology and recent conceptual developments, such as asynchronous communication, have produced a new generation of web applications: Rich Internet Applications (RIAs). These web applications try to fulfill user expectations in terms of usability, reliability, quality, maintainability and performance. In our work, we present a design methodology whose goal is to describe and develop User Interfaces of RIAs in a standardized way.
Keywords: model driven engineering, rich internet applications, usixml, web engineering
Towards an evolutionary framework for agile requirements elicitation BIBAKFull-Text 349-352
  Sandra Kelly
Numerous reports document difficulties experienced with the development of requirements in software projects. Specific problems include: developers have limited access to stakeholders, do not fully understand the problem domain and, as a consequence, requirements are not well understood. Agile Methods (AMs) encourage stakeholder involvement throughout development; however, considerable difficulty remains in accommodating continuous negotiation between multiple diverse stakeholders in a given domain. This paper reports on progress to date in developing an evolutionary framework to improve the facilitation of agile requirements elicitation. A potential solution is offered and an initial study indicates positive results.
Keywords: agile methods (ams), elicitation, open space technology (OST), requirements, scenarios
UI generation from task, domain and user models: the DB-USE approach BIBAKFull-Text 353-356
  Vi Tran
Information systems UI (User Interface) generation from declarative models has been the focus of numerous and varied approaches in the human-computer interaction community. Typically, the approaches differ in which models they use, each emphasizing particular aspects. This paper proposes a new process that combines the task, domain, and user models, taken together, to drive information system user interface design and code-behind generation. To this end, we propose a framework, i.e., a methodological process, a meta-model and a software prototype, called DB-USE.
Keywords: automatic user interface generation, domain model, task model, user model
DOM tree estimation and computation: overview of a new web content adaptation system BIBAKFull-Text 357-360
  Jérémy Lardon; Christophe Gravier; Jacques Fayolle
In our ubiquitous and pervasive society, Web content is accessible from a wide range of devices: PCs, smartphones, PDAs, but also TV sets through a set-top box. The main issue with the arrival of these new browsing devices is the gap between their capabilities and those of the PC that most Web pages take as a design reference. Web content adaptation is a scientific field that aims at filling this gap by transforming Web content to fit the device's capabilities. In this paper, a novel architecture, DTEC (DOM Tree Estimation and Computation), is proposed for the automatic adaptation of Web pages.
Keywords: genetic algorithm, transcoding, transformation composition, web page adaptation

Workshops

User interface extensible markup language BIBAKFull-Text 361-362
  David Faure; Jean Vanderdonckt
This workshop is aimed at investigating open issues in research and development for user interface engineering based on User Interface eXtensible Markup Language (UsiXML), a XML-compliant User Interface Description Language and at reviewing existing solutions that address these issues.
Keywords: multi-context, multi-device environments, multilinguality, multi-organization, multi-user interfaces, multimodal
Design and engineering of game-like virtual and multimodal environments BIBAKFull-Text 363-364
  Chris Raymaekers; Karin Coninx; Juan Manuel González-Calleros
This workshop brings together a number of researchers that are involved in the design, engineering, evaluation and applicability of game-like virtual and multimodal environments. It is a forum to discuss experiences, best practices, and design and engineering approaches with a particular focus on those aspects that are related to the interactivity of the game.
Keywords: games, software engineering, virtual environments
Engineering patterns for multi-touch interfaces BIBAKFull-Text 365-366
  Kris Luyten; Davy Vanacken; Malte Weiss; Jan Borchers; Shahram Izadi; Daniel Wigdor
Multi-touch has gained a lot of interest in the last couple of years, and the increased availability of multi-touch enabled hardware has boosted its development. However, the current diversity of hardware, toolkits, and tools for creating multi-touch interfaces has its downsides: there is little reusable material and no generally accepted body of knowledge when it comes to the development of multi-touch interfaces. This workshop seeks a consensus on methods, approaches, toolkits, and tools that aid in the engineering of multi-touch interfaces and transcend the differences between available platforms. The patterns mentioned in the title indicate that we aim to create a reusable body of knowledge.
Keywords: eics workshop, engineering patterns, multi-touch interfaces
Pattern-driven engineering of interactive computing systems (PEICS) BIBAKFull-Text 367-368
  Kai Breiner; Marc Seissler; Gerrit Meixner; Peter Forbrig; Ahmed Seffah; Kerstin Klöckner
For over a decade, patterns have been gaining interest in the domain of Human-Computer Interaction (HCI) engineering. It is generally agreed that patterns can be used to facilitate the exchange of best practices and knowledge between the interdisciplinary team members involved in the interactive systems design process. Despite intense research activities in recent years, HCI patterns still lack a standardized description and organization. This makes it difficult for developers to identify the patterns relevant to a problem as well as to apply them appropriately to the problem context.
   To fully benefit from HCI patterns in the engineering of interactive computing systems, they have to be prepared for integration into a model-based user interface development process. Instead of merely guiding and advising UI developers on which solution should be applied, HCI patterns should enable the easy reuse of already designed model or code fragments. To enable the integration of HCI patterns into the model-based development process, the informal textual or graphical notation of HCI patterns has to be overcome. HCI patterns have to support a formal description of their solution part, which allows the direct integration of the solution parts into the different models, such as the task, dialog and presentation models.
Keywords: hci pattern, model-based development

Tutorials

Bringing users' conceptual models into design: an introduction to CASSM analysis BIBAKFull-Text 369-370
  Ann E. Blandford
Interactive systems have to fit the needs of their users. In an evolutionary development cycle, the evaluation of existing systems serves as a foundation for designing improved systems that better fit people's needs. Few evaluation methods encourage the analyst to step back and consider how well a system supports users' conceptual understandings and system utility. CASSM, the approach presented in this course, focuses on the quality of 'fit' between users and an interactive system. This course presents the methodology of gathering suitable data and conducting a CASSM analysis, and shows how CASSM can help identify re-design possibilities to improve system utility. CASSM complements established evaluation methods by focusing on conceptual structures rather than procedures. It also provides a guiding framework for analysts working with qualitative data such as think-aloud or interview protocols.
Keywords: CASSM, conceptual fit, system evaluation
Model a discourse and transform it to your user interface BIBAKFull-Text 371-372
  Hermann Kaindl
Every interactive system needs a user interface, today possibly even several, adapted for different devices (PCs, PDAs, mobile phones). Developing a user interface is difficult and takes a lot of effort, since it normally requires design and implementation. This is also expensive, and even more so for several user interfaces for different devices.
   This tutorial shows how human-computer interaction can be based on discourse modeling, even without employing speech or natural language. Our discourse models are derived from results of Human Communication theories, Cognitive Science and Sociology. Such discourse models can specify an interaction design. This tutorial also demonstrates how such an interaction design can be used for model-driven generation of user interfaces and linking them to the application logic and the domain of discourse.
Keywords: discourse modeling, model-driven user interface generation
Mastering use cases: capturing functional requirements for interactive applications BIBAKFull-Text 373-374
  Daniel Sinnig; Homa Javahery
Use cases were introduced in the early 90s by Jacobson. He defined a use case as a "specific way of using the system by using some part of the functionality." Use case modeling is making its way into mainstream practice as a key activity in the software development process (e.g., Unified Process). There is accumulating evidence of significant benefits to customers and developers. The use case model is the artifact of choice for capturing functional requirements and as such, serves as a contract of the envisioned system behavior between stakeholders. It drives the architecture of the application, it can be used to generate functional test cases and often serves as a reference point for maintenance and documentation purposes. Writing effective and well-structured use cases is a difficult task which requires a deep understanding of the surrounding techniques and best practices. Current practice has shown that it is easy to misuse them or make mistakes that can unintentionally turn them into "abuse cases".
Keywords: formalization, guidelines, heuristics, refactoring, use cases, user interface models