| Performance Analysis of an Adaptive User Interface System Based on Mobile Agents | | BIBA | Full-Text | 1-17 | |
| Nikola Mitrovic; Jose A. Royo; Eduardo Mena | |||
| Adapting graphical user interfaces for various user devices is one of the most interesting topics in today's mobile computing. In this paper we present a system based on mobile agents that transparently adapts user interface specifications to the user device's capabilities and monitors user interaction. Specialized agents manage the GUI specification according to the specific context and user preferences. We show how the user behavior can be monitored at run-time in a transparent way and how learning methods are applied to anticipate future user actions and to adapt the user interface accordingly. The feasibility and performance of our approach are shown by applying it to a non-trivial application and by performing tests with real users. | |||
| Combining Human Error Verification and Timing Analysis | | BIBAK | Full-Text | 18-35 | |
| Rimvydas Rukšenas; Paul Curzon; Ann Blandford; Jonathan Back | |||
| Designs can often be unacceptable on performance grounds. In this work, we
integrate a GOMS-like ability to predict execution times into the generic
cognitive architecture developed for the formal verification of human-error-related
correctness properties. As a result, formal verification and GOMS-like
timing analysis are combined within a unified framework. This allows one to
judge whether a formally correct design is also acceptable on performance
grounds, and vice versa. We illustrate our approach with an example based on a
KLM style timing analysis. Keywords: Human error; formal verification; execution time; GOMS; cognitive
architecture; model checking; SAL | |||
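The KLM-style timing analysis mentioned in this abstract can be illustrated with a minimal sketch (not the authors' SAL-based formalisation): summing standard Keystroke-Level Model operator times over a sequence of user actions. The operator values below are the commonly cited KLM estimates; the task sequence is hypothetical.

```python
# Minimal KLM-style timing sketch (illustrative only, not the paper's SAL model).
# Operator durations (seconds) are the commonly cited Keystroke-Level Model estimates.
KLM_OPERATORS = {
    "K": 0.28,  # keystroke or button press (average skilled typist)
    "P": 1.10,  # point with a mouse to a target
    "H": 0.40,  # home hands between keyboard and mouse
    "M": 1.35,  # mental preparation
}

def predicted_time(actions):
    """Sum operator times for a sequence of KLM operator codes."""
    return sum(KLM_OPERATORS[a] for a in actions)

if __name__ == "__main__":
    # Hypothetical task: think, point to a field, home to the keyboard, type four characters.
    task = ["M", "P", "H", "K", "K", "K", "K"]
    print(f"Predicted execution time: {predicted_time(task):.2f} s")
```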
| Formal Testing of Multimodal Interactive Systems | | BIBA | Full-Text | 36-52 | |
| Jullien Bouchet; Laya Madani; Laurence Nigay; Catherine Oriat; Ioannis Parissis | |||
| This paper presents a method for automatically testing interactive multimodal systems. The method is based on the Lutess testing environment, originally dedicated to synchronous software specified using the Lustre language. The behaviour of synchronous systems, consisting of cycles that start by reading an external input and end by issuing an output, is to a certain extent similar to that of interactive systems. Under this hypothesis, the paper presents our method for automatically testing interactive multimodal systems using the Lutess environment. In particular, we show that automatic test data generation based on different strategies can be carried out. Furthermore, we show how multimodality-related properties can be specified in Lustre and integrated into test oracles. | |||
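As a rough illustration of the synchronous testing cycle the abstract describes (read an input, produce an output, check an oracle), here is a hedged Python sketch; it is not Lutess or Lustre, and the generator, system under test and oracle shown are hypothetical stand-ins.

```python
import random

# Illustrative sketch of a synchronous test loop: each cycle reads an input,
# the system under test produces an output, and an oracle checks a property.
# This is not Lutess/Lustre; the components below are hypothetical stand-ins.

def input_generator(rng):
    """Test data generation strategy: here, random booleans for two input signals."""
    return {"speak": rng.random() < 0.3, "click": rng.random() < 0.3}

class SystemUnderTest:
    """Toy multimodal fusion: reports a command when either modality is active."""
    def step(self, inputs):
        return {"command_issued": inputs["speak"] or inputs["click"]}

def oracle(inputs, outputs):
    """Safety-style property: a command is only issued if some modality was used."""
    return outputs["command_issued"] <= (inputs["speak"] or inputs["click"])

def run_test(cycles=1000, seed=0):
    rng, sut = random.Random(seed), SystemUnderTest()
    for _ in range(cycles):
        i = input_generator(rng)
        o = sut.step(i)
        assert oracle(i, o), f"oracle violated for {i} -> {o}"
    return "all cycles passed"

if __name__ == "__main__":
    print(run_test())
```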
| Knowledge Representation Environments: An Investigation of the CASSMs between Creators, Composers and Consumers | | BIBAK | Full-Text | 53-70 | |
| Ann Blandford; Thomas R. G. Green; Iain Connell; Tony Rose | |||
| Many systems form 'chains' whereby developers use one system (or 'tool') to
create another system, for use by other people. For example, a web development
tool is created by one development team and then used by others to compose web
pages for use by yet other people. Little work within Human-Computer
Interaction (HCI) has considered how usability considerations propagate through
such chains. In this paper, we discuss three-link chains involving people that
we term Creators (commonly referred to as designers), Composers (users of the
tool who compose artefacts for other users) and Consumers (end users of
artefacts). We focus on usability considerations and how Creators can develop
systems that are both usable themselves and also support Composers in producing
further systems that Consumers can work with easily. We show how CASSM, an
analytic evaluation method that focuses attention on conceptual structures for
interactive systems, supports reasoning about the propagation of concepts
through Creator-Composer-Consumer chains. We use as our example a knowledge
representation system called Tallis, which includes specific implementations of
these different perspectives. Tallis promotes a development culture in
which individuals are empowered to take on different roles in order to
strengthen the 'chain of comprehension' between different user types. Keywords: Usability evaluation methods; CASSM; design chains | |||
| Consistency between Task Models and Use Cases | | BIBAK | Full-Text | 71-88 | |
| Daniel Sinnig; Patrice Chalin; Ferhat Khendek | |||
| Use cases are the notation of choice for functional requirements
documentation, whereas task models are used as a starting point for user
interface design. In this paper, we motivate the need for an integrated
development methodology in order to narrow the conceptual gap between software
engineering and user interface design. This methodology rests upon a common
semantic framework for developing and handling use cases and task models. Based
on the intrinsic characteristics of both models, we define a common formal
semantics and provide a formal definition of consistency between task models
and use cases. The semantic mapping and the application of the proposed
consistency definition are supported by an illustrative example. Keywords: Use cases; task models; finite state machines; formal semantics; consistency | |||
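The abstract's notion of consistency, defined over a common formal (finite-state) semantics, can be illustrated very roughly as scenario containment: every use-case scenario should be a trace accepted by the task model. The sketch below uses a toy deterministic automaton and is only a hedged illustration, not the authors' semantics.

```python
# Rough illustration of trace-based consistency: every use-case scenario should
# be executable in the task model. Toy deterministic automaton; not the paper's
# actual common semantics.
TASK_MODEL = {  # state -> {action: next_state}
    "start":         {"enter_id": "id_given"},
    "id_given":      {"enter_pin": "authenticated", "cancel": "start"},
    "authenticated": {"withdraw": "done", "cancel": "start"},
}

def accepts(model, scenario, initial="start"):
    """Return True if the action sequence is a path through the task model."""
    state = initial
    for action in scenario:
        if action not in model.get(state, {}):
            return False
        state = model[state][action]
    return True

USE_CASE_SCENARIOS = [
    ["enter_id", "enter_pin", "withdraw"],  # main success scenario
    ["enter_id", "cancel"],                 # extension: user aborts
]

if __name__ == "__main__":
    consistent = all(accepts(TASK_MODEL, s) for s in USE_CASE_SCENARIOS)
    print("consistent" if consistent else "inconsistent")
```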
| Task-Based Design and Runtime Support for Multimodal User Interface Distribution | | BIBAK | Full-Text | 89-105 | |
| Tim Clerckx; Chris Vandervelpen; Karin Coninx | |||
| This paper describes an approach that uses task modelling for the
development of distributed and multimodal user interfaces. We propose to enrich
tasks with possible interaction modalities in order to allow the user to
perform these tasks using an appropriate modality. The information in the
augmented task model can then be used by a generic runtime architecture, which we
have extended to support runtime decisions about distributing the user interface
among several devices based on the specified interaction modalities. The approach was
tested in the implementation of several case studies. One of these will be
presented in this paper to clarify the approach. Keywords: Task-based development; model-based user interface development; distributed
user interfaces; multimodal user interfaces | |||
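To make the idea of enriching tasks with interaction modalities concrete, the hypothetical sketch below annotates tasks with acceptable modalities and picks, at runtime, a device that supports one of them; the task, modality and device names are invented for illustration and do not reflect the authors' tool.

```python
# Hypothetical sketch: tasks annotated with acceptable modalities, and a runtime
# step that assigns each task to a device supporting one of those modalities.
# Names are invented; this is not the authors' architecture.
TASKS = {
    "select_destination": {"speech", "gui"},
    "confirm_route":      {"gui"},
}

DEVICES = {
    "phone":       {"gui"},
    "headset":     {"speech"},
    "car_display": {"gui", "speech"},
}

def distribute(tasks, devices):
    """Map each task to the first available device offering a required modality."""
    assignment = {}
    for task, modalities in tasks.items():
        for device, supported in devices.items():
            common = modalities & supported
            if common:
                assignment[task] = (device, sorted(common)[0])
                break
        else:
            assignment[task] = None  # no suitable device found
    return assignment

if __name__ == "__main__":
    for task, target in distribute(TASKS, DEVICES).items():
        print(task, "->", target)
```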
| A Comprehensive Model of Usability | | BIBAK | Full-Text | 106-122 | |
| Sebastian Winter; Stefan Wagner; Florian Deissenboeck | |||
| Usability is a key quality attribute of successful software systems.
Unfortunately, there is no common understanding of the factors influencing
usability and their interrelations; hence, a comprehensive basis for designing,
analyzing, and improving user interfaces is lacking. This paper proposes a
2-dimensional model of usability that associates system properties with the
activities carried out by the user. By separating activities and properties,
sound quality criteria can be identified, thus facilitating statements
concerning their interdependencies. This model is based on a tested quality
meta-model that fosters preciseness and completeness. A case study demonstrates
how such a model helps to reveal contradictions and omissions
in existing usability standards. Furthermore, the model serves as a central and
structured knowledge base for the entire quality assurance process, e.g. the
automatic generation of guideline documents. Keywords: Usability; quality models; quality assessment | |||
| Suitability of Software Engineering Models for the Production of Usable Software | | BIBAK | Full-Text | 123-139 | |
| Karsten Nebe; Dirk Zimmermann | |||
| Software Engineering (SE) and Usability Engineering (UE) both provide a wide
range of elaborated process models to create software solutions. Today, many
companies have understood that a systematic and structured approach to
usability is as important as the process of software development itself.
However, theory and practice still offer little guidance on how to incorporate UE methods into
development processes. With respect to the quality of software solutions,
usability needs to be an integral aspect of software development and therefore
the integration of these two processes is a logical and needed step. One
challenge is to identify integration points between the two disciplines that
allow a close collaboration, with acceptable additional organizational and
operational efforts. This paper addresses the questions of where these
integration points between SE and UE exist, what kind of fundamental UE
activities have to be integrated into existing SE processes, and how this
integration can be accomplished. Keywords: Software Engineering; Usability Engineering; Standards; Models; Processes;
Integration | |||
| A Model-Driven Engineering Approach for the Usability of Plastic User Interfaces | | BIBAK | Full-Text | 140-157 | |
| Jean-Sébastien Sottet; Gaëlle Calvary; Joëlle Coutaz; Jean-Marie Favre | |||
| Plastic User Interfaces (UI) are able to adapt to their context of use while
preserving usability. Research efforts have so far focused on the functional
aspect of UI adaptation while neglecting the usability dimension. This paper
investigates how the notion of mapping, as promoted by Model-Driven Engineering
(MDE), can be exploited to control UI adaptation according to explicit
usability criteria. In our approach, a run-time UI is a graph of models related
by mappings. Each model (e.g., the task model, the Abstract UI, the Concrete
UI, and the final UI) describes the UI from a specific perspective, ranging from
high-level design decisions (conveyed by the task model) to low-level
executable code (i.e., the final UI). A mapping between source and target models
specifies the usability properties that are preserved when transforming source
models into target models. This article presents a meta-model for the notion of
mapping and shows how it is applied to plastic UIs. Keywords: Adaptation; Context of use; Mapping; Meta-model; Model; Model
transformation; Plasticity; Usability | |||
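A hedged sketch of the core idea, a mapping that relates a source and a target model and records which usability properties the transformation preserves, is given below; the classes and property names are illustrative assumptions, not the paper's meta-model.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: a mapping links a source model to a target model and
# records the usability properties preserved by the transformation. Class and
# property names are assumptions, not the paper's meta-model.

@dataclass
class Model:
    name: str            # e.g. "Task model", "Abstract UI", "Concrete UI", "Final UI"
    elements: list = field(default_factory=list)

@dataclass
class Mapping:
    source: Model
    target: Model
    preserved_properties: set  # e.g. {"observability", "task conformance"}

    def preserves(self, prop: str) -> bool:
        return prop in self.preserved_properties

if __name__ == "__main__":
    task_model = Model("Task model", ["book flight", "choose seat"])
    abstract_ui = Model("Abstract UI", ["booking workspace"])
    m = Mapping(task_model, abstract_ui, {"task conformance"})
    # An adaptation engine could refuse transformations that drop a required property.
    print("keeps task conformance:", m.preserves("task conformance"))
```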
| Model-Driven Prototyping for Corporate Software Specification | | BIBAK | Full-Text | 158-174 | |
| Thomas Memmel; Carsten Bock; Harald Reiterer | |||
| Corporate software development faces very demanding challenges, especially
concerning the design of user interfaces. Collaborative design with
stakeholders demands modeling methods that everybody can understand and apply.
But when using traditional, paper-based methods to gather and document
requirements, an IT organization often experiences frustrating communication
issues between the business and development teams. We present ways of
implementing model-driven prototyping for corporate software development.
Without harming agile principles and practice, detailed prototypes can be
employed for collaborative design. Model-driven prototyping opens a new path
towards visual specifications and the substitution of paper-based artifacts. Keywords: Prototyping; model-driven user interface design; UI specification; corporate
software development; agile modeling | |||
| Getting SW Engineers on Board: Task Modelling with Activity Diagrams | | BIBAK | Full-Text | 175-192 | |
| Jens Brüning; Anke Dittmar; Peter Forbrig; Daniel Reichart | |||
| This paper argues for a transfer of knowledge and experience gained in
task-based design to Software Engineering. A transformation of task models into
activity diagrams as part of UML is proposed. By using familiar notations,
software engineers might be encouraged to accept task modelling and to pay more
attention to users and their tasks. Generally, different presentations of a
model can help to increase its acceptance by various stakeholders. The
presented approach allows both the visualization of task models as activity
diagrams and task modelling with activity diagrams. Corresponding tool
support is presented which includes the animation of task models. The tool
itself was developed in a model-based way. Keywords: HCI models and model-driven engineering; task modelling; UML | |||
| Considering Context and Users in Interactive Systems Analysis | | BIBAK | Full-Text | 193-209 | |
| José Creissac Campos; Michael D. Harrison | |||
| Although the take-up of formal approaches to modelling and reasoning about
software has been slow, there has been recent interest and facility in the use
of automated reasoning techniques such as model checking [5] on increasingly
complex systems. In the case of interactive systems, formal methods can be
particularly useful in reasoning about systems that involve complex
interactions. These techniques for the analysis of interactive systems
typically focus on the device and leave the context of use undocumented. In
this paper we look at models that incorporate context explicitly, and
discuss how they can be used in a formal setting. The paper is concerned
particularly with the type of analysis that can be performed with them. Keywords: Interactive systems; modelling; analysis; context | |||
| XSED -- XML-Based Description of Status--Event Components and Systems | | BIBAK | Full-Text | 210-226 | |
| Alan Dix; Jair Leite; Adrian Friday | |||
| Most user interfaces and ubiquitous systems are built around event-based
paradigms. Previous work has argued that interfaces, especially those heavily
depending on context or continuous data from sensors, should also give
attention to status phenomena -- that is continuously available signals and
state. Focusing on both status and event phenomena has advantages in terms of
adequacy of description and efficiency of execution. This paper describes a
collection of XML-based specification notations (called XSED) for describing,
implementing and optimising systems that take account of this dual status-event
nature of the real world. These notations cover individual components, system
configuration, and separated temporal annotations. Our work also presents an
implementation to generate Status-Event Components that can run in a
stand-alone test environment. They can also be wrapped into a Java Bean to
interoperate with other software infrastructure, particularly the ECT platform. Keywords: Status-event analysis; reflective dialogue notation; ubiquitous computing
infrastructure; XML; temporal properties | |||
| Identifying Phenotypes and Genotypes: A Case Study Evaluating an In-Car Navigation System | | BIBAK | Full-Text | 227-242 | |
| Georgios Papatzanis; Paul Curzon; Ann Blandford | |||
| There is a range of different usability evaluation methods: both analytical
and empirical. The appropriate choice is not always clear, especially for new
technologies. In-car navigation systems are an example of how multimodal
technologies are increasingly becoming part of our everyday life. Their
usability is important, as badly designed systems can induce errors resulting
in situations where driving safety may be compromised. In this paper we use a
study on the usability of a navigation device when the user is setting up
an itinerary to investigate the scope of different classes of approach. Four
analytical techniques and one empirical technique were used to evaluate the usability of
the device. We analyse the results produced by the two classes of approach --
analytical versus empirical -- and compare them in terms of their diversity and
the insight they provide to the analyst with respect to the overall usability of
the system and its potential improvement. Results suggest a link between
genotypes and the analytical class of approach and phenotypes in the empirical
class of approach. We also illustrate how the classes of approach complement
each other, providing a greater insight into the usability of a system. Keywords: Usability evaluation; UEMs; In-car navigation; Cognitive Walkthrough; UAN;
EMU; Design Criteria; Phenotypes; Genotypes | |||
| Factoring User Experience into the Design of Ambient and Mobile Systems | | BIBA | Full-Text | 243-259 | |
| Michael D. Harrison; Christian Kray; Zhiyu Sun; Huqiu Zhang | |||
| The engineering of ubiquitous computing systems poses important challenges. Not least among these is the need to understand how to implement designs that create a required experience for users. The paper explores a particular class of such systems for built environments. In particular, it is concerned with the capture of experience requirements and the production of prototypes that create experience. The aim is to develop methods and tools for such environments to enable the creation of particular sorts of experience in users. An approach that combines the use of scenarios, personae and snapshots with the use of prototypes and models is described. The technique aims to elicit an understanding of the required experience of the system and then create a design that satisfies the requirements. | |||
| Visualisation of Personal Communication Patterns Using Mobile Phones | | BIBAK | Full-Text | 260-274 | |
| Bradley van Tonder; Janet Wesson | |||
| Ambient displays are attractive, subtle visualisations of information. They
are typically situated on the periphery of human perception, requiring minimal
effort to be understood. The vast volume of communication facilitated by modern
communication technologies has led to research into methods of visualising this
information to facilitate rapid understanding. These two research areas have,
however, seldom been combined to include the use of ambient displays as
visualisations of personal communication patterns. The research outlined in
this paper addresses this issue by combining ambient displays and visualisation
of personal communication patterns in a mobile context. This paper details the
development of the AmbiMate system, analyses its usefulness and investigates
the lessons which can be learned from its implementation in order to guide the
future development of such systems. Keywords: Ambient displays; visualisation; personal communication patterns; mobile
devices | |||
| Integration of Distributed User Input to Extend Interaction Possibilities with Local Applications | | BIBA | Full-Text | 275-284 | |
| Kay Kadner; Stephan Mueller | |||
| Computing devices do not offer every modality for interaction that a user might want to choose for interacting with an application. Instead of buying new hardware for extending the interaction capabilities, it should be possible to leverage modalities of independent existing devices that are in the vicinity. Therefore, an architecture has to be developed that gathers events on distributed devices and transfers them to the local device for execution. This allows the user to choose, even at runtime, devices that are better suited for a particular input task. For convenient use, the system should support input that can be both independent of and dependent on the application. Application-dependent input commands imply that meta-information about the application is provided. Since the system should allow the extension of existing applications, the meta-information has to be provided in a way that is transparent for the application. This paper describes a system that realises those features. | |||
| Reverse Engineering Cross-Modal User Interfaces for Ubiquitous Environments | | BIBAK | Full-Text | 285-302 | |
| Renata Bandelloni; Fabio Paternò; Carmen Santoro | |||
| Ubiquitous environments make various types of interaction platforms
available to users. There is an increasing need for automatic tools able to
transform user interfaces for one platform into versions suitable for a
different one. To this end, it is important to have solutions able to take user
interfaces for a given platform and build the corresponding logical
descriptions, which can then be manipulated to obtain versions adapted to
different platforms. In this paper we present a solution to this issue that is
able to reverse engineer even interfaces supporting different modalities
(graphical and voice). Keywords: Reverse Engineering; Cross-Modal User Interfaces; Model-based Approaches | |||
| Intelligent Support for End-User Web Interface Customization | | BIBAK | Full-Text | 303-320 | |
| José A. Macías; Fabio Paternò | |||
| Nowadays, while the number of users of interactive software steadily
increases, new applications and systems appear and provide further complexity.
An example of such systems is represented by multi-device applications, where
the user can interact with the system through different platforms. However,
providing end-users with real capabilities to author user interfaces is still a
problematic issue, which is beyond the ability of most end-users today. In this
paper, we present an approach intended to enable users to modify Web interfaces
easily, considering implicit user intents inferred from example interface
modifications carried out by the user. We discuss the design issues involved in
the implementation of such an intelligent approach, also reporting on some
experimental results obtained from a user test. Keywords: End-User Development; Intelligent User Interfaces; Model-Based Design of
User Interfaces; Programming by Example | |||
| Improving Modularity of Interactive Software with the MDPC Architecture | | BIBAK | Full-Text | 321-338 | |
| Stéphane Conversy; Eric Barboni; David Navarre; Philippe Palanque | |||
| The "Model -- Display view -- Picking view -- Controller" model is a
refinement of the MVC architecture. It introduces the "Picking View" component,
which relieves the controller of the need to analytically compute the picked
element. We describe how using the MDPC architecture leads to benefits in terms
of modularity and descriptive ability when implementing interactive components.
We report on the use of the MDPC architecture in a real application: we
measured actual gains in controller code, which is simpler and more
focused. Keywords: MVC; interactive software; modularity; Model Driven Architecture | |||
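The role of the picking view can be sketched as follows: alongside the display view, a picking structure maps screen positions back to model elements, so the controller looks up the picked element instead of computing it analytically. This is a loose, hypothetical rendition in Python, not the authors' implementation.

```python
# Loose sketch of the MDPC idea: a picking view maps screen positions back to
# model elements, so the controller does a lookup rather than analytic hit-testing.
# Hypothetical rendition; not the authors' implementation.

class Model:
    def __init__(self):
        self.slices = {"A": 0.6, "B": 0.4}   # e.g. a pie chart model

class DisplayView:
    def render(self, model):
        print("drawing pie chart for", model.slices)

class PickingView:
    """Renders an invisible structure keyed by position instead of appearance."""
    def __init__(self):
        self._pick_map = {}
    def render(self, model):
        # A real system would rasterise each slice into a picking buffer;
        # here we fake it with a coarse grid of cells labelled by slice id.
        self._pick_map = {(0, 0): "A", (0, 1): "A", (1, 0): "B", (1, 1): "A"}
    def pick(self, cell):
        return self._pick_map.get(cell)

class Controller:
    def __init__(self, model, picking_view):
        self.model, self.picking_view = model, picking_view
    def on_click(self, cell):
        element = self.picking_view.pick(cell)   # no analytic geometry needed
        print("clicked element:", element)

if __name__ == "__main__":
    m = Model()
    dv, pv = DisplayView(), PickingView()
    dv.render(m)
    pv.render(m)
    Controller(m, pv).on_click((1, 0))
```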
| Toward Quality-Centered Design of Groupware Architectures | | BIBA | Full-Text | 339-355 | |
| James Wu; T. C. Nicholas Graham | |||
| Challenges in designing effective groupware include technical issues associated with concurrent and distributed work and social issues associated with supporting group activities. To address some of these problems, we have developed a quality-centered architectural design framework that links requirements analysis to architectural design decisions for groupware systems. The framework supports reasoned architectural design choices that are used to tailor software architecture to the unique quality and functional requirements of the software being developed. The framework has been applied to the development of the Software Design Board, a tool for collaborative software engineering. | |||
| Programs = Data + Algorithms + Architecture: Consequences for Interactive Software Engineering | | BIBA | Full-Text | 356-373 | |
| Stéphane Chatty | |||
| This article analyses the relationships between software architecture, programming languages and interactive systems. It proposes to consider that languages, like user interface tools, implement architecture styles or patterns aimed at particular stakeholders and scenarios. It lists architecture issues in interactive software that would be best resolved at the language level, in that conflicting patterns are currently proposed by languages and user interface tools because of differences in target scenarios. Among these issues are the contra-variance of reuse and control, new scenarios of software reuse, the architecture-induced concurrency, and the multiplicity of hierarchies. The article then proposes a research agenda to address that problem, including a requirement- and scenario-oriented deconstruction of programming languages to understand which of the original requirements still hold and which are not fully adapted to interactive systems. | |||
| Towards an Extended Model of User Interface Adaptation: The Isatine Framework | | BIBAK | Full-Text | 374-392 | |
| Víctor López-Jaquero; Jean Vanderdonckt; Francisco Montero; Pascual González | |||
| In order to cover the complete process of user interface adaptation, this
paper extends Dieterich's taxonomy of user interface adaptation by specializing
Norman's theory of action into the Isatine framework. This framework decomposes
user interface adaptation into seven stages of adaptation: goals for
adaptation, initiative, specification, application, transition, interpretation,
and evaluation. The purpose of each stage is defined, and each stage can be carried
out by the user, the interactive system, a third party, or any
combination of these entities. The potential collaboration between these
entities suggests defining additional support operations such as negotiation,
transfer, and delegation. The variation and the complexity of adaptation
configurations induced by the framework led us to introduce a multi-agent
adaptation engine, in which each agent is responsible for achieving one stage at a
time (preferably) or a combination of stages (in practice). In this engine, the
adaptation rules are explicitly encoded in a knowledge base, from which they
can be retrieved on demand and executed. In particular, the application of
adaptation rules is ensured by examining the definition of each adaptation rule
and by interpreting them at run-time, based on a graph transformation system.
The motivations for this multi-agent system are explained and the
implementation of the engine is described in these terms. In order to
demonstrate that this multi-agent architecture allows easy reconfiguration
of the interactive system to accommodate the various adaptations defined in the
framework, a case study of a second-hand car-selling system is detailed from a
simple adaptation to progressively more complex ones. Keywords: Adaptation; adaptation configuration; delegation; isatin; Isatine framework;
mixed-initiative user interface; multi-agent system; negotiation;
reconfiguration of user interface; transfer; user interface description
language | |||
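Since the abstract names the seven stages explicitly, a small sketch can show the shape of a multi-agent engine that assigns one agent per stage (or a combination); the agent behaviour shown is a placeholder, not the actual Isatine engine.

```python
from enum import Enum, auto

# The seven adaptation stages named in the abstract; the dispatching engine
# below is a placeholder sketch, not the actual Isatine implementation.
class Stage(Enum):
    GOALS = auto()
    INITIATIVE = auto()
    SPECIFICATION = auto()
    APPLICATION = auto()
    TRANSITION = auto()
    INTERPRETATION = auto()
    EVALUATION = auto()

class StageAgent:
    def __init__(self, stage, actor):
        self.stage, self.actor = stage, actor   # actor: user, system, or third party
    def perform(self, context):
        print(f"{self.stage.name.lower()} handled by {self.actor}")
        return context

class AdaptationEngine:
    def __init__(self, agents):
        # preferably one agent per stage; in practice an agent may cover several
        self.agents = agents
    def adapt(self, context):
        for stage in Stage:
            self.agents[stage].perform(context)

if __name__ == "__main__":
    agents = {s: StageAgent(s, "system") for s in Stage}
    agents[Stage.INITIATIVE] = StageAgent(Stage.INITIATIVE, "user")  # mixed initiative
    AdaptationEngine(agents).adapt({"ui": "car-selling form"})
```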
| Towards a Universal Toolkit Model for Structures | | BIBAK | Full-Text | 393-412 | |
| Prasun Dewan | |||
| Model-based toolkit widgets have the potential for (i) increasing automation
and (ii) making it easy to substitute one user interface for another.
Current toolkits, however, have focused only on the automation benefit as they
do not allow different kinds of widgets to share a common model. Inspired by
programming languages, operating systems and database systems that support a
single data structure, we present here an interface that can serve as a model
for not only the homogeneous model-based structured widgets identified so far
-- tables and trees -- but also several heterogeneous structured widgets such
as forms, tabbed panes, and multi-level browsers. We identify an architecture
that allows this model to be added to an existing toolkit by automatically
creating adapters between it and existing widget-specific models. We present
several full examples to illustrate how such a model can increase both the
automation and substitutability of the toolkit. We show that our approach
retains model purity and, in comparison to current toolkits, does not increase
the effort to create existing model-aware widgets. Keywords: Tree; table; form; tab; browser; hashtable; vector; sequence; toolkit; model
view controller; user interface management system | |||
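A rough sketch of the idea of a single model shared by different structured widgets: one generic key/child access interface that a table widget, a tree widget, or a form widget could each render. The interface below is an assumption for illustration, not the paper's actual API.

```python
# Rough sketch of a single structure model shared by different widgets: a generic
# key/child interface that a table, tree, or form could each render. The interface
# is an illustrative assumption, not the paper's API.

class StructureModel:
    """Uniform access to a nested structure of named children and leaf values."""
    def __init__(self, value):
        self._value = value
    def keys(self):
        return list(self._value.keys()) if isinstance(self._value, dict) else []
    def child(self, key):
        return StructureModel(self._value[key])
    def is_leaf(self):
        return not isinstance(self._value, dict)
    def leaf_value(self):
        return self._value

def render_as_form(model, indent=0):
    """One of several interchangeable 'widgets' over the same model."""
    for key in model.keys():
        child = model.child(key)
        if child.is_leaf():
            print(" " * indent + f"{key}: [{child.leaf_value()}]")
        else:
            print(" " * indent + key)
            render_as_form(child, indent + 2)

if __name__ == "__main__":
    data = {"name": "Ada", "address": {"street": "Main St", "city": "Paris"}}
    render_as_form(StructureModel(data))
```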
| Exploring Human Factors in Formal Diagram Usage | | BIBA | Full-Text | 413-428 | |
| Andrew Fish; Babak Khazaei; Chris Roast | |||
| Formal diagrammatic notations have been developed as alternatives to symbolic specification notations. Ostensibly to aid users in performing comprehension and reasoning tasks, restrictions called wellformedness conditions may be imposed. However, imposing too many of these conditions can have adverse effects on the utility of the notation (e.g. reducing the expressiveness). Understanding the human factors involved in the use of a notation, such as how user-preference and comprehension relate to the imposition of wellformedness conditions, will enable the notation designers to make more informed design decisions. Euler diagrams are a simple visualization of set-theoretic relationships which are the basis of more expressive constraint languages. We have performed exploratory studies with Euler diagrams which indicated that novice user preferences strongly conform to the imposition of all wellformedness conditions, but that even a limited exposure diminishes this preference. | |||
| 'Aware of What?' A Formal Model of Awareness Systems That Extends the Focus-Nimbus Model | | BIBAK | Full-Text | 429-446 | |
| Georgios Metaxas; Panos Markopoulos | |||
| We present a formal model of awareness systems founded upon the focus and
nimbus model of Benford et al. [2] and of Rodden [19]. The model aims to provide
a conceptual tool for reasoning about this class of systems. Our model
introduces the notions of aspects, attributes and resources in order to expose
the communicational aspects of awareness systems. We show how the model enables
reasoning about issues such as deception and plausible deniability, which
arguably are crucial for enabling users to protect their privacy and to manage
how they present themselves to their social network. Keywords: CSCW; formal models; awareness systems; focus-nimbus; Z | |||
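The focus-nimbus idea the model builds on is often summarised as: A's awareness of B in a medium combines A's focus (what A attends to) with B's nimbus (what B makes available). A hedged formulation, not the paper's Z specification, is:

```latex
% Hedged summary of the classical focus--nimbus formulation (not the paper's Z model):
% A's awareness of B in medium m combines A's focus on B with B's nimbus towards A.
\[
  \mathit{aware}_m(A, B) \;=\; f\bigl(\mathit{focus}_m(A)(B),\; \mathit{nimbus}_m(B)(A)\bigr)
\]
```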
| Service-Interaction Descriptions: Augmenting Services with User Interface Models | | BIBAK | Full-Text | 447-464 | |
| Jo Vermeulen; Yves Vandriessche; Tim Clerckx; Kris Luyten; Karin Coninx | |||
| Semantic service descriptions have paved the way for flexible interaction
with services in a mobile computing environment. Services can be automatically
discovered, invoked and even composed. In contrast, the user interfaces for
interacting with these services are often still designed by hand. This approach
poses a serious threat to the overall flexibility of the system. To make the
user interface design process scale, it should be automated as much as
possible. We propose to augment service descriptions with high-level user
interface models to support automatic user interface adaptation. Our method
builds upon OWL-S, an ontology for Semantic Web Services, by connecting a
collection of OWL-S services to a hierarchical task structure and selected
presentation information. This allows end-users to interact with services on a
variety of platforms. Keywords: Model-based user interface development; Semantic web services; Screen
layout; Automatic generation of user interfaces; User interface design;
Ubiquitous computing | |||
| A Design-Oriented Information-Flow Refinement of the ASUR Interaction Model | | BIBAK | Full-Text | 465-482 | |
| Emmanuel Dubois; Philip Gray | |||
| The last few years have seen an explosion of interaction possibilities
opened up by ubiquitous computing, mobile devices, and tangible interaction.
Our methods of modelling interaction, however, have not kept up. As is to be
expected with such a rich situation, there are many ways in which interaction
might be modelled, focussing, for example, on user tasks, physical location(s)
and mobility, data flows or software elements. In this paper, we present a
model and modelling technique intended to capture key aspects of users'
interaction that are of interest to interactive system designers, at the stage of
requirements capture and early design. In particular, we characterise the
interaction as a physically mediated information exchange, emphasizing the
physical entities involved and their relationships with the user and with one
another. We apply the model to two examples in order to illustrate its
expressive power. Keywords: Mixed Interactive Systems; User's Interaction Modelling; Requirements
Capture; Information flow characterisation; Design Analysis; Interaction Path | |||
| On the Process of Software Design: Sources of Complexity and Reasons for Muddling through | | BIBAK | Full-Text | 483-500 | |
| Morten Hertzum | |||
| Software design is a complex undertaking. This study delineates and analyses
three major constituents of this complexity: the formative element entailed in
articulating and reaching closure on a design, the progress imperative entailed
in making estimates and tracking status, and the collaboration challenge
entailed in learning within and across projects. Empirical data from two small
to medium-size projects illustrate how practicing software designers struggle
with the complexity induced by these constituents and suggest implications for
user-centred design. These implications concern collaborative grounding,
long-loop learning, and the need for a more managed design process while
acknowledging that methods are not an alternative to the project knowledge
created, negotiated, and refined by designers. Specifically, insufficient
collaborative grounding will cause project knowledge to gradually disintegrate,
but the activities required to avoid this may be costly in terms of scarce
resources such as the time of key designers. Keywords: User-centred design; Design process; Software development; Software-project
complexity; Muddling through; Collaborative grounding | |||
| Applying Graph Theory to Interaction Design | | BIBA | Full-Text | 501-519 | |
| Harold Thimbleby; Jeremy Gow | |||
| Graph theory provides a substantial resource for a diverse range of
quantitative and qualitative usability measures that can be used for evaluating
recovery from error, informing design tradeoffs, probing topics for user
training, and so on.
Graph theory is a straightforward, practical and flexible way to implement real interactive systems. Hence, graph theory complements other approaches to formal HCI, such as theorem proving and model checking, which have a less direct relation to interaction. This paper gives concrete examples based on the analysis of a real, non-trivial interactive device, a medical syringe pump, itself modelled as a graph. Ideas new to HCI (such as small-world graphs) are introduced, which may stimulate further research. | |||
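One of the measures the abstract alludes to, the cost of recovering from an error, can be sketched as a shortest-path computation over a device's state graph; the tiny graph below is hypothetical, not the syringe pump model analysed in the paper.

```python
from collections import deque

# Sketch: model a device UI as a directed graph of states and button presses, and
# measure error recovery as the shortest path back to a desired state. The graph
# below is hypothetical, not the syringe pump analysed in the paper.
DEVICE_GRAPH = {
    "idle":         {"up": "rate_set", "on_off": "off"},
    "rate_set":     {"start": "infusing", "clear": "idle"},
    "infusing":     {"stop": "idle", "up": "rate_changed"},
    "rate_changed": {"clear": "infusing", "up": "rate_changed"},
    "off":          {"on_off": "idle"},
}

def shortest_recovery(graph, start, goal):
    """Breadth-first search returning the shortest button sequence from start to goal."""
    queue, seen = deque([(start, [])]), {start}
    while queue:
        state, path = queue.popleft()
        if state == goal:
            return path
        for button, nxt in graph.get(state, {}).items():
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, path + [button]))
    return None

if __name__ == "__main__":
    # How many presses to get from an accidental rate change back to normal infusion?
    print(shortest_recovery(DEVICE_GRAPH, "rate_changed", "infusing"))
```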
| Mathematical Mathematical User Interfaces | | BIBA | Full-Text | 520-536 | |
| Harold Thimbleby; Will Thimbleby | |||
| Taking Mathematica and xThink as representatives of the state of the art in interactive mathematics, we argue conventional mathematical user interfaces leave much to be desired, because they separate the mathematics from the context of the user interface, which remains as unmathematical as ever. We put the usability of such systems into mathematical perspective, and compare the conventional approach with a novel declarative, gesture-based approach, exemplified by TruCalc, a novel calculator we have developed. | |||
| Coupling Interaction Resources in Ambient Spaces: There Is More Than Meets the Eye! | | BIBAK | Full-Text | 537-554 | |
| Nicolas Barralon; Joëlle Coutaz | |||
| Coupling is the action of binding two entities so that they can operate
together to provide new functions. In this article, we propose a formal
definition for coupling and present two complementary conceptual tools to
reason about coupling interaction resources. The first tool is a graph-theoretic
and algebraic notation that can be used to identify the consequents
of causal couplings so that the side-effects of the creation of a coupling can
be analyzed in a formal and systematic way. The second tool formulates the
problem of coupling using an eight-state automaton that models the life cycle of a
coupling and provides designers with a structure to verify that usability
properties have been satisfied for each state. We conclude with the concept of
meta-UI, an overarching interactive system that shows that coupling is only one
aspect of a larger problem space. Keywords: Ubiquitous computing; ambient intelligence; ambient interactive spaces;
devices assembly; devices coupling; meta-UI | |||
| Building and Evaluating a Pattern Collection for the Domain of Workflow Modeling Tools | | BIBAK | Full-Text | 555-566 | |
| Kirstin Kohler; Daniel Kerkow | |||
| In this paper, we present the results of a case study conducted together
with a small company that develops a workflow modeling tool. During the case
study, we created a pattern collection for the domain of workflow modeling
tools and evaluated a subset of these patterns. Besides the pattern description
itself, the contribution of our work is a systematic process for identifying
patterns. The results of the case study showed that the identified patterns are
a valuable instrument for software developers to improve the usability of their
software in the given domain. Additionally, this finding shows that the process
of pattern identification is valuable as well. Keywords: User interface pattern; case study; design methodologies; H5.2. Information interfaces and presentations, Miscellaneous theory and
methods, D.2.2 Software Engineering, Design Tools and Techniques | |||
| Do We Practise What We Preach in Formulating Our Design and Development Methods? | | BIBAK | Full-Text | 567-585 | |
| Paula Kotzé; Karen Renaud | |||
| It is important, for our credibility as user interface designers and
educators, that we practice what we preach. Many system designers and
programmers remain sceptical about the need for user-centred design. To win
them over, we need to be absolutely clear about what they need to do. We, as a
community, propose many different methods to support naïve designers so
that they will design and implement user-centred systems. One of the most
popular methods is HCI design patterns -- captured and formulated by experts
for the sole purpose of transferring knowledge to novices. In this paper we
investigate the usability of these patterns, using both theoretical and
experimental analysis, and conclude that they are not usable. Hence,
unfortunately, we have to conclude that we don't practice what we preach. We
conclude the paper by making some suggestions about how we can address this
situation. Keywords: Design patterns; usability; learnability; memorability; efficiency; errors;
satisfaction | |||
| Engaging Patterns: Challenges and Means Shown by an Example | | BIBAK | Full-Text | 586-600 | |
| Sabine Niebuhr; Kirstin Kohler; Christian Graf | |||
| This paper presents first results of a research project whose goal is to
develop a pattern language that enhances business software with motivating and
engaging elements. The goal of the pattern language is to turn the soft and
vague term of "emotions in user interaction design" into constructive design
guidance. The patterns are especially tailored for joy-of-use in business
applications. The main contribution of this paper is the description of quality
characteristics for this pattern language. They are illustrated by referring
to existing pattern descriptions and elaborating on their deficiencies. This paper
shows how these weaknesses were addressed in the pattern language. Keywords: D.2.1 Requirements/Specifications, D.2.2 Design Tools and Techniques, H.5.2
User Interfaces | |||
| Organizing User Interface Patterns for e-Government Applications | | BIBAK | Full-Text | 601-619 | |
| Florence Pontico; Marco Winckler; Quentin Limbourg | |||
| The design of usable interactive systems is a complex task that requires
knowledge and expertise on human factors and on software development. Usability
guidelines and design patterns may be one way to alleviate the lack of
usability expertise in development teams by providing guidance to solve
every designer's problem when designing and developing user interfaces. However,
the utility of guidelines and design patterns relies on two main issues: a) the
quality of the advice provided, and b) the way they are organized to allow
fast access to the appropriate solutions. In this paper we discuss the
organization of usability guidelines and patterns in the light of an industrial
project at SmalS-MvM devoted to the development of e-Government applications on
a very large scale. This paper not only presents a proposal for organizing
patterns but also describes a set of analysis patterns identified for
e-Government applications. Keywords: Usability guidelines organization; design patterns; User Interface design
process; e-Government applications | |||
| Including Heterogeneous Web Accessibility Guidelines in the Development Process | | BIBA | Full-Text | 620-637 | |
| Myriam Arrue; Markel Vigo; Julio Abascal | |||
| The use of web applications has increased enormously in the last few years. However, some groups of users may experience difficulties when accessing them. Many different sets of accessibility guidelines have been developed in order to improve the quality of web interfaces. Some are general-purpose whereas others are specific to user, application or access device characteristics. The sheer number of heterogeneous accessibility guidelines makes it difficult to find, select and handle them in the development process. This paper proposes a flexible framework which facilitates and promotes web accessibility awareness throughout the development process. The basis of this framework is the Unified Guidelines Language (UGL), a uniform guidelines specification language developed as a result of a comprehensive study of different sets of guidelines. The main components of the framework are the guidelines management tool and the flexible evaluation module. Therefore, sharing, extending and searching for adequate accessibility guidelines, as well as evaluating web accessibility according to different sets of guidelines, become simpler tasks. | |||
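The idea of evaluating a page against several heterogeneous guideline sets expressed in one uniform format can be sketched as below; the record fields and checks are invented stand-ins, not the actual UGL language.

```python
from dataclasses import dataclass
from typing import Callable

# Sketch of evaluating a page against heterogeneous guideline sets expressed in a
# uniform record format. Fields and checks are invented stand-ins, not actual UGL.

@dataclass
class Guideline:
    guideline_id: str
    source: str                      # e.g. "general-purpose set", "device-specific set"
    applies_to: set                  # user groups or device classes
    check: Callable[[dict], bool]    # True if the page satisfies the guideline

GUIDELINES = [
    Guideline("img-alt", "general", {"blind users"},
              lambda page: all(img.get("alt") for img in page["images"])),
    Guideline("max-page-size", "mobile", {"mobile devices"},
              lambda page: page["size_kb"] <= 100),
]

def evaluate(page, guidelines, context):
    """Run only the guidelines relevant to the given user/device context."""
    relevant = [g for g in guidelines if g.applies_to & context]
    return {g.guideline_id: g.check(page) for g in relevant}

if __name__ == "__main__":
    page = {"images": [{"alt": "logo"}, {"alt": ""}], "size_kb": 80}
    print(evaluate(page, GUIDELINES, {"blind users", "mobile devices"}))
```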