
International Journal of Human-Computer Studies 44

Editors: B. R. Gaines
Dates: 1996
Volume: 44
Publisher: Academic Press
Standard No: ISSN 0020-7373; TA 167 A1 I5
Papers: 39
Links: Table of Contents
  1. IJHCS 1996 Volume 44 Issue 1
  2. IJHCS 1996 Volume 44 Issue 2
  3. IJHCS 1996 Volume 44 Issue 3/4
  4. IJHCS 1996 Volume 44 Issue 5
  5. IJHCS 1996 Volume 44 Issue 6

IJHCS 1996 Volume 44 Issue 1

The Influence of the User Interface on Solving Well- and Ill-Defined Problems BIBA 1-18
  Sissel Guttormsen Schar
It is well documented that people can learn in two different modes. An explicit mode is characterized by conscious, selective attention to the problem, while in the implicit mode learning is largely unconscious and based on trial and error. According to Hayes and Broadbent, the success of each mode depends on characteristics of the problem. Applying the theory to human-computer interaction, users of conversational interfaces are predicted to be induced into an explicit learning mode, while users of direct manipulation interfaces are predicted to be induced into an implicit mode. The ability of the user interface to induce a learning mode makes it important to identify the problems that are optimally solved with each kind of interface. Two experiments were designed to test whether characterizing problems as well- or ill-defined can predict which interface will be more successful. Well-defined problems were expected to be better solved with a conversational interface, and ill-defined problems with a direct manipulation interface. The experiments demonstrate that the interfaces can induce the two different learning modes as predicted. The results only partly support the claim that problem definition predicts which interface will be more successful. The problem definition concept is therefore not recommended as a criterion for selecting an interface.
Acquiring Intersentential Explanatory Connections in Expository Texts BIBA 19-44
  Fernando Gomez
A taxonomy of explanatory links connecting sentences in expository texts is presented. It is also shown that there are two types of knowledge, which we have called analytical and empirical knowledge in analogy to the distinction between analytical and empirical sentences, that allow us to find and learn the explanatory connections. The role that the notion of analyticity plays in learning explanatory connections from expository texts is also emphasized. A program that embodies the ideas and that finds and learns explanatory connections in expository texts is also briefly explained.
Layered Protocols: Hands-On Experience BIBA 45-72
  J. H. Eggen; R. Haakma; J. H. D. M. Westerink
An assessment is presented of the benefits and limitations of the Layered Protocols (LP) model for the analysis and design of user interfaces in the field of consumer electronics. In the assessment, the user interface of an existing digital audio recorder, which is only partly in line with the LP model, is compared with an interface designed according to the model. The observed differences in usability between the two interfaces are mainly caused by deviations from the LP model. It turned out that the learnability of an interface, in particular, is positively influenced by a layered organization of user-system interaction in combination with high-quality E- and I-feedback and optimum similarity between interaction protocols.
A Vision-Based 3-D Mouse BIBA 73-91
  P. Nesi; A. Del Bimbo
Recently, several devices for improving the quality and efficiency of human-computer interaction have been proposed. This is mainly due to the requirements of virtual reality applications, where most of the devices in circulation must be worn by the user -- e.g. gloves for hand-tracking, complete or partial tracksuits. These techniques allow the estimation of most human body movements. On the other hand, the adoption of such devices is unsuitable for many applications, such as teleconferencing, since they are too constrictive. For this reason, other simpler devices such as 3-D mice have been suggested. Most of the 3-D mice proposed in the literature compel the user to be connected to the device, and their behaviour is much closer to that of a joystick than to that of a true 3-D mouse. More recently, vision-based hand-tracking systems have been presented, but most of them constrain the user to wear gloves or bracelets, or to mark some points on the hand. In this paper, a vision-based, glove- and mark-free hand-tracking system is proposed. It is based on stereo vision and can be implemented on low-cost architectures owing to its low computational complexity. In addition, the estimation process defined is very robust and allows simultaneous hand-position tracking and the recognition of some hand postures, thus implementing a true non-constrictive 3-D mouse.
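The abstract does not give the reconstruction step, but a stereo hand tracker of this kind ultimately rests on disparity-based triangulation. The minimal Python sketch below shows that step under the simplifying assumption of two rectified, parallel cameras with known focal length and baseline; the function and all numbers are illustrative, not taken from the paper.

    # Disparity-based triangulation: the geometric core a glove-free stereo
    # hand tracker relies on. Assumes rectified, parallel cameras; image
    # coordinates are measured relative to the principal point. Illustrative only.
    def triangulate(x_left, x_right, y, focal_px, baseline_m):
        """Recover (X, Y, Z) in the left-camera frame from a matched image point."""
        disparity = x_left - x_right
        if disparity <= 0:
            raise ValueError("matched point must lie in front of both cameras")
        z = focal_px * baseline_m / disparity      # depth shrinks as disparity grows
        return x_left * z / focal_px, y * z / focal_px, z

    # Illustrative call: a fingertip seen 100 px apart by cameras 12 cm apart, f = 700 px.
    print(triangulate(x_left=240.0, x_right=140.0, y=40.0, focal_px=700.0, baseline_m=0.12))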
Evaluating System Design Features BIBA 93-118
  Geoffrey S. Hubona; James E. Blanton
Numerous recent studies have utilized perceived usefulness and perceived ease of use measures to predict the user acceptance of information technologies. This study presents the results of an experiment in which subjects assess the usefulness and ease of use of qualitatively distinct user interface features. Using structural equation modeling, the perceived usefulness and perceived ease of use of the user interface are empirically linked to observed measures of task accuracy and task latency, and to self-reported measures of user confidence in decision quality. Findings indicate that perceived ease of use contributes significantly to enhanced levels of user confidence in decision quality. Moreover, user interfaces perceived as easy to use are associated with faster and more accurate decisions. However, interfaces that are perceived as more useful in accomplishing the task paradoxically result in users taking longer to make decisions that are no more accurate. Furthermore, user confidence in decision quality is not promoted by interfaces that are perceived as more useful.
Bulletin BIB 119-122
 

IJHCS 1996 Volume 44 Issue 2

Special Issue on Verification and Validation

Editorial BIBA 123-125
  Robert Plant; Alun D. Preece
As for any software, users of knowledge-based systems (KBS) need to know that they can rely on the system to do its job properly. Assuring the reliability of knowledge-based systems has become an important issue in the development of the knowledge engineering discipline. The processes employed directly to assure the reliability of software are called verification and validation (V & V). Roughly speaking, validation is the process of determining if a KBS meets its users' requirements; verification is the process of determining if a KBS has been constructed to comply with certain formally-specified properties, such as consistency and irredundancy. Implicitly, validation includes verification.
   Verification and validation techniques for KBS have been discussed and debated in workshops at many of the predominant artificial intelligence conferences in recent years. The purpose of this special issue is to provide "snapshots" of the current state of the V & V area for KBS, by collecting together representative works from three of the most recent workshops:
  • at IJCAI-93 in Chambéry, France (Chairman: Marc Ayel, Université de Savoie, France);
  • at AAAI-94 in Seattle, USA (Chairman: Robert Plant, co-editor of this issue);
  • at ECAI-94 in Amsterdam, The Netherlands (Chairman: Alun Preece, co-editor of this issue).
   These workshops succeeded in highlighting many of the significant issues and trends within their area of concern. These issues and trends are reflected in the articles selected for this issue, the authors of which have expanded and updated their original workshop papers. The purpose of this introduction is to highlight some of the issues and trends in KBS V & V, to put this collection in its context.
    On the Validation and Verification of Production Systems: A Graph Reduction Approach BIBA 127-144
      Stephen Murrell; Robert Plant
    This paper takes a parallel processing approach to the implementation of rule-based systems using a graph-reduction architecture, and investigates the consequences of this architecture for the validation and verification of knowledge-based systems. The paper contrasts the traditional sequential approaches to knowledge-based system development, and the limited validation and verification techniques applicable to them, with a graph reduction implementation of knowledge-based systems development based on an ALICE-like machine. The advantages of this style of programming for systems development and program correctness are discussed. The paper shows that significant benefits could potentially be achieved through the use of graph-reduction techniques in the development of these systems.
    Validating Dynamic Properties of Rule-Based Systems BIBA 145-169
      Alun D. Preece; Cliff Grossner; T. Radhakrishnan
    Rule-based systems can be viewed as possessing two sets of properties: static and dynamic. Static properties are those that can be evaluated without executing the system, and dynamic properties can be evaluated only by examining how the system operates at run time. The dynamic properties of a rule-based system have been largely neglected in validation and verification work done thus far. Structural verification and static testing techniques do not yield information on how a rule-based system achieves its goals at run-time, given a set of input data. This paper presents a model for the relationship between the goal states achieved by a rule-based system, the set of inter-related rules that must fire to achieve each goal state, and the data items required for the rules in the rule sequence to fire. Then, we describe a method for applying this model to study the dynamic properties of a rule-based system. It is demonstrated that this model permits the validation of dynamic properties of a rule-based system, enabling system developers to decide: (1) if the manner in which the system pursues goals is valid according to the specifications (and expectations) of the designers; (2) what relationship exists between the quality of system output for a given test case and the goals achieved during problem-solving on that test case; and (3) how the overall problem-solving activity of the system relates to the availability of input data.
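The abstract does not commit to a particular rule formalism, so the following minimal Python sketch is only one hedged reading of the idea: run a test case through a naive forward chainer, record which rules fire on the way to a goal, and compare that trace against the rule set the designers expect to be involved. The rule names, conditions and the expectation table are hypothetical.

    # Hedged illustration: record the rules fired for a test case and check the
    # trace against the designers' expectation for the goal reached.
    # Rules, facts and the expectation table are all hypothetical.
    rules = {
        "r1": (lambda f: {"fever", "rash"} <= f,  "suspect_measles"),
        "r2": (lambda f: "suspect_measles" in f,  "order_serology"),
        "r3": (lambda f: "cough" in f,            "suspect_cold"),
    }

    def run(initial_facts):
        """Naive forward chaining: fire any rule whose conclusion is new."""
        facts, trace = set(initial_facts), []
        changed = True
        while changed:
            changed = False
            for name, (condition, conclusion) in rules.items():
                if conclusion not in facts and condition(facts):
                    facts.add(conclusion)
                    trace.append(name)
                    changed = True
        return facts, trace

    expected = {"order_serology": {"r1", "r2"}}    # designers' expected rule set

    facts, trace = run({"fever", "rash"})
    print("fired:", trace, "| as expected:", set(trace) == expected["order_serology"])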
    The Relationship between Errors and Size in Knowledge-Based Systems BIBA 171-185
      Daniel E. O'Leary
    Previous researchers in knowledge-based systems verification have concentrated on developing various approaches and computational tools to find errors in knowledge bases. However, unlike software engineering for traditional systems, there has been little investigation of the relationship between errors and system size. In addition, there has been little analysis of the relationship between the occurrence of different types of errors. Thus, this paper investigates the empirical relationships between knowledge-based system size and number of errors, and between the number and existence of different kinds of errors.
       It is found that, in general, system size is statistically significantly correlated with two of the error types examined, and with total errors. Further, the size of "smaller" systems is not correlated with the total number of errors, whereas the size of "larger" systems is. This evidence indicates that it can be important to use a modular approach in the development of such systems. In addition, the numbers of different types of errors are statistically significantly correlated with each other, and the existence of one type of error is statistically related to the existence of others. As a result, errors signal the existence of other errors.
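As a hedged illustration of the kind of analysis reported, the short Python sketch below computes a Pearson correlation between knowledge-base size and error count; the (size, errors) pairs are placeholders, not the paper's data.

    # Hedged illustration only: Pearson correlation between rule-base size and
    # number of errors found. The data points are invented placeholders.
    from statistics import correlation   # available in Python 3.10+

    sizes  = [40, 75, 120, 250, 400, 650]   # rules per knowledge base
    errors = [ 2,  3,   4,   9,  14,  21]   # errors detected in each

    print(f"Pearson r = {correlation(sizes, errors):.2f}")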
    Structure-Preserving Specification Languages for Knowledge-Based Systems BIBA 187-212
      Frank van Harmelen; Manfred Aben
    Much of the work on validation and verification of knowledge based systems (KBSs) has been done in terms of implementation languages (mostly rule-based languages). Recent papers have argued that it is advantageous to do validation and verification in terms of a more abstract and formal specification of the system. However, constructing such formal specifications is a difficult task.
       This paper proposes the use of formal specification languages for KBS development that are closely based on the structure of informal knowledge models. Such formal languages have two advantages: (i) strong support can be given for the construction of a formal specification, on the basis of the informal description of the system; and (ii) the structural correspondence can be used to verify that the formal specification does indeed capture the informally stated requirements.
    A Methodology to Incorporate Formal Methods in Hybrid KBS Verification BIBA 213-244
      R. F. Gamble; D. M. Baughman
    There is an increasing need for hybrid knowledge based systems (KBS) that accommodate more complex AI applications. We consider a hybrid KBS to be one that combines object-oriented, frame-based, and rule-based paradigms. Building hybrid KBSs requires formal methods for verification due to the possible interactions between objects, inheritance, methods, and rules. The necessary complex verification for this type of KBS can be achieved with formal specification and derivation because they provide unambiguous specifications and proof mechanisms. However, the complete use of formal methods is often too time consuming and hence, impractical. Therefore, it is necessary to find a balance between using formal methods and traditional KBS techniques for development and verification. This paper presents a practical methodology to construct verifiable hybrid KBSs from formal specifications that follows traditional KBS development techniques. The methodology involves prototyping a partial specification expressed in a formal language into an incrementally verifiable KBS. The prototype is subjected to automated analysis and expert examination to suggest modifications and refinements to the partial specification. The transformation of the partial specification to a prototype is consistent with the refinement and transformation strategies found in formal program derivation. Thus, the prototype is verifiable. The cycle of specification, prototype, and analysis continues until the specification is believed to be sufficiently complete to fully implement the KBS. A combination of two formal specification languages is examined within the methodology to allow for more flexibility in the specification. We illustrate the methodology in an example hybrid KBS for oil reservoir characterization.
    Refinement Complements Verification and Validation BIBA 245-256
      Susan Craw
    Knowledge based systems are being applied in ever-increasing numbers. The development of knowledge acquisition tools has eased the "Knowledge Acquisition Bottleneck". More recently there has been a demand for mechanisms to assure the quality of knowledge based systems. Checking the contents of the knowledge base and the performance of the knowledge based system at various stages throughout its life cycle is an important component of quality assurance. Hence, the demand now is for verification and validation tools. However, traditionally, verification and validation have only identified possible faults in the knowledge base. In contrast, this paper advocates the use of knowledge refinement to correct identified faults in parallel with the ongoing verification and validation, thus easing the progress towards correct knowledge based systems. An automated refinement tool is described which uses the output from verification and validation tools to assemble evidence from which the refinement process can propose repairs. It is hoped that automated refinement in parallel with validation and verification may ease the "Knowledge V&V Bottleneck".
    Verification and Validation with Ripple-Down Rules BIBA 257-269
      Byeong Ho Kang; Windy Gambetta; Paul Compton
    Verification to ensure a system's consistency and validation to meet the user's criteria are essential elements in developing knowledge-based systems for real-world use. The normal practice is that there will be initial knowledge acquisition attempting to build a complete system which will (should) then be verified and validated. There may be a cycle through these steps till the system is complete. Maintenance is seen as a minor problem requiring the occasional repetition of the three-stage process. The implicit assumption is that an expert has complete knowledge and that by a suitable knowledge acquisition process this is acquired. In fact, it seems that experts are incapable of recounting how they reach a conclusion. Rather, when asked a question they justify that their conclusion is correct, and their justification is tailored to the specific context of the inquiry. Experts are best at justifying why one conclusion is to be preferred over another.
       This leads to a knowledge acquisition methodology, Ripple-down Rules, in which the knowledge base undergoes on-going development based on correcting errors. Each new correction or justification is considered only in the context of the same mistake being made. The method also constrains the expert's choices to ensure that any new knowledge added is valid while the knowledge base structure ensures the knowledge is verified. Verification and validation are not separate tasks, but constraints on knowledge acquisition which itself continues throughout the life of the system. This provides a closer match with the normal evolution of human knowledge and expertise. The overall approach has itself been validated by the development of a large medical expert system and through simulation studies. The medical system has been developed while in routine use and has only involved experts without any knowledge engineering support or skill in its development.
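The abstract does not spell out the knowledge-base structure, so the following minimal Python sketch is a hedged reading of a single-classification Ripple-Down Rule tree: a rule's except branch is consulted when it fires, its else branch when it does not, and a correction is attached exactly where the faulty evaluation ended, together with the case that exposed the mistake. The attributes and conclusions are hypothetical.

    # Hedged sketch of a single-classification Ripple-Down Rule tree.
    # Attribute names, thresholds and conclusions are hypothetical.
    class Node:
        def __init__(self, condition, conclusion, cornerstone=None):
            self.condition, self.conclusion = condition, conclusion
            self.cornerstone = cornerstone        # case that justified the rule
            self.except_child = None              # followed when the rule fires
            self.else_child = None                # followed when it does not

    def classify(node, case):
        """Return (conclusion, parent, branch): the slot a correction would fill."""
        conclusion, parent, branch = None, None, None
        while node is not None:
            if node.condition(case):
                conclusion = node.conclusion
                parent, branch, node = node, "except_child", node.except_child
            else:
                parent, branch, node = node, "else_child", node.else_child
        return conclusion, parent, branch

    def correct(parent, branch, condition, conclusion, case):
        """Add a correcting rule exactly where the faulty evaluation ended."""
        setattr(parent, branch, Node(condition, conclusion, cornerstone=case))

    root = Node(lambda c: True, "normal")                 # default rule
    case = {"tsh": 12.0}
    label, parent, branch = classify(root, case)          # -> "normal" (say the expert disagrees)
    correct(parent, branch, lambda c: c["tsh"] > 10, "hypothyroid", case)
    print(classify(root, case)[0])                        # -> "hypothyroid"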
    Bulletin BIB 271-274
     

    IJHCS 1996 Volume 44 Issue 3/4

    The Sisyphus-VT Initiative

    Editorial BIB 275-280
      A. Th. Schreiber; W. P. Birmingham
    Implementing the Sisyphus-93 Task Using Soar/TAQL BIBA 281-301
      Gregg R. Yost
    Systems can be implemented in the Soar problem-solving architecture by using a stratified development approach. First, the developer describes the structure of the task knowledge at the knowledge level. Next, the developer maps that knowledge into abstract components of Soar's problem-space computational model. Finally, the developer encodes the problem-space model in an executable representation. The transformations at each stage preserve the original knowledge-level structure of the task. This paper describes how this approach yielded a working prototype implementation of the Sisyphus-93 elevator-configuration task in less than a week.
    Reusable Ontologies, Knowledge-Acquisition Tools, and Performance Systems: PROTEGE-II Solutions to Sisyphus-2 BIBA 303-332
      Thomas E. Rothenfluh; John H. Gennari; Henrik Eriksson; Angel R. Puerta; Samson W. Tu; Mark A. Musen
    This paper describes how we applied the PROTEGE-II architecture to build a knowledge-based system that configures elevators. The elevator-configuration task was solved originally with a system that employed the propose-and-revise problem-solving method (VT). A variant of this task, here named the Sisyphus-2 problem, is used by the knowledge-acquisition community for comparative studies. PROTEGE-II is a knowledge-engineering environment that focuses on the use of reusable ontologies and problem-solving methods to generate task-specific knowledge-acquisition tools and executable problem solvers. The main goal of this paper is to describe in detail how we used PROTEGE-II to model the elevator-configuration task. This description provides a starting point for comparison with other frameworks that use abstract problem-solving methods.
       Beginning with the textual description of the elevator-configuration task, we analysed the domain knowledge with respect to PROTEGE-II's main goal: to build domain-specific knowledge-acquisition tools. We used PROTEGE-II's suite of tools to construct a knowledge-based system, called ELVIS, that includes a reusable domain ontology, a knowledge-acquisition tool, and a propose-and-revise problem-solving method that is optimized to solve the elevator-configuration task. We entered domain-specific knowledge about elevator configuration into the knowledge base with the help of a task-specific knowledge-acquisition tool that PROTEGE-II generated from the ontologies. After we constructed mapping relations to connect the knowledge base with the method's code, the final executable problem solver solved the test case provided with the Sisyphus-2 material. We have found that the development of ELVIS has afforded a valuable test case for evaluating PROTEGE-II's suite of system-building tools. Only projects based on reasonably large problems, such as the Sisyphus-2 task, will allow us to improve the design of PROTEGE-II and its ability to produce reusable components.
    Solving VT in VITAL: A Study in Model Construction and Knowledge Reuse BIBA 333-371
      Enrico Motta; Arthur Stutt; Zdenek Zdrahal; Kieron O'Hara; Nigel Shadbolt
    In this paper we discuss a solution to the Sisyphus II elevator design problem developed using the VITAL approach to structured knowledge-based system development. In particular we illustrate in detail the process by which an initial model of Propose & Revise problem solving was constructed using a generative grammar of model fragments and then refined and operationalized in the VITAL operational conceptual modelling language (OCML). In the paper we also discuss in detail the properties of a particular Propose & Revise architecture, called "Complete-Model-then-Revise", and we show that it compares favourably in terms of competence with alternative Propose & Revise models. Moreover, using as an example the VT domain ontology provided as part of the Sisyphus II task, we critically examine the issues affecting the development of reusable ontologies. Finally, we discuss the performance of our problem solver and we show how we can use machine learning techniques to uncover additional strategic knowledge not present in the VT domain.
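Since propose-and-revise is central to the Sisyphus-II solutions in this issue, a minimal hedged sketch of its control loop may help: propose a value for each design parameter, check the constraints, and apply a fix whenever a constraint is violated. The parameters, constraint and fix below are invented for illustration and are not the VT elevator knowledge.

    # Hedged sketch of a propose-and-revise loop on a toy design problem.
    # Parameters, the constraint and the fix are invented, not VT knowledge.
    design = {}

    proposers = {                                  # propose phase: one value per parameter
        "car_weight":  lambda d: 1200,             # kg, initial estimate
        "cable_count": lambda d: 3,
        "cable_load":  lambda d: d["car_weight"] / d["cable_count"],
    }

    def load_ok(d):                                # constraint: per-cable load limit
        return d["cable_load"] <= 350

    def add_cable(d):                              # fix associated with the constraint
        d["cable_count"] += 1
        d["cable_load"] = d["car_weight"] / d["cable_count"]

    for name, propose in proposers.items():
        design[name] = propose(design)

    while not load_ok(design):                     # revise phase: fix, then re-check
        add_cable(design)

    print(design)   # cable_count is raised until the load constraint holds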
    Sisyphus-VT: A CommonKADS Solution BIBA 373-402
      A. Th. Schreiber; P. Terpstra
    This article represents a CommonKADS contribution to the Sisyphus-VT experiment. This experiment is concerned with a comparison of knowledge modelling approaches. The data set for this experiment describes the knowledge for designing an elevator system. The ultimate goal is to arrive at standards for sharing and reusing problem-solving methods and related ontologies.
    Solving VT by Reuse BIBA 403-433
      Jay T. Runkel; William P. Birmingham; Alan Balkany
    This paper describes how the Domain-Independent Design System (DIDS) was used to solve the VT Sisyphus task. DIDS enables knowledge engineers to rapidly build knowledge systems by providing libraries of reusable problem-solving procedures, called mechanisms, and reusable knowledge-acquisition procedures, called mechanisms for knowledge acquisition (MeKAs). Instead of writing code, engineers build systems by selecting and combining elements from these libraries. Therefore, DIDS-developed systems are easy to build and are not brittle, i.e. they can be easily modified by replacing mechanisms and MeKAs. To demonstrate these ideas, DIDS was used to solve the VT task by starting with the DIDS solution to the Sisyphus room-assignment problem. A problem solver and knowledge-acquisition tool for the VT task were created by replacing a few of the mechanisms and MeKAs used for the room-assignment task with those better suited to the VT task.
    Combining KARL and CRLM for Designing Vertical Transportation Systems BIBA 435-467
      Karsten Poeck; Dieter Fensel; Dieter Landes; Jurgen Angele
    This paper describes a solution to the Sisyphus-II elevator-design problem by combining the formal specification language KARL and the configurable role-limiting shell approach. A knowledge-based system configuring elevator systems is specified and implemented. First, the knowledge is described in a graphical and semi-formal manner influenced by the KADS models of expertise. A formal description is then gained by supplementing the semi-formal description with formal specifications which add a new level of precision and uniqueness. Finally, a generic shell for propose-and-revise systems is designed and implemented as the realization of the final system. This shell was derived by adapting the shellbox COKE, also used for the previous Sisyphus office-assignment problem. As a result of this integration, we get a description of the knowledge-based system at different levels corresponding to the different activities of its development process.
    Modelling an Elevator Design Task in DESIRE: The VT Example BIBA 469-520
      Frances M. T. Brazier; Pieter H. G. van Langen; Jan Treur; Niek J. E. Wijngaards; Mark Willems
    An elevator configuration task, the VT task, is modelled within DESIRE as a design task. DESIRE is a framework within which complex reasoning tasks are modelled as compositional architectures. Compositional architectures are based on a task decomposition, acquired during task analysis. An existing generic task model of design, based on a logical analysis and synthesis of task models devised for diverse applications, has been refined for the elevator configuration task. The resulting task model includes a description of the ontology of the elevator domain and a description of the task model.
    Configuring Elevator Systems BIBA 521-568
      Gregg R. Yost; Thomas R. Rothenfluh
    To configure an elevator system, one must assemble a collection of components that satisfies both customer demands and safety regulations. Complex interactions among elevator system components complicate the configuration process. Not all components are compatible, and certain combinations will not meet functional or safety requirements. This document describes how a configuration engineer configures elevator systems. It describes what initial information the customer must supply, how to use this information to assemble a safe and functional elevator system, and what information must finally be supplied to the installers and inspectors. Also listed are inter-component constraints and design changes that can be made to remedy constraint violations.
    The Configuration Design Ontologies and the VT Elevator Domain Theory BIBA 569-598
      T. R. Gruber; G. R. Olsen; J. Runkel
    In the VT/Sisyphus experiment, a set of problem-solving systems was built against a common specification of a problem. An important hypothesis was that the specification could be given, in large part, as a common ontology. This article is that ontology. This ontology differs from normal software specification documents in two fundamental ways. First, it is formal and machine readable (i.e. in the KIF/Ontolingua syntax). Second, the descriptions of the input and output of the task to be performed include domain knowledge (i.e. about elevator configuration) that characterizes semantic constraints on possible solutions, rather than describing the form (data structure) of the answer. The article includes an overview of the conceptualization, excerpts from the machine-readable Ontolingua source files, and pointers to the complete ontology library available on the Internet.
    Bulletin BIB 599-602
     

    IJHCS 1996 Volume 44 Issue 5

    Systematic Building of Conceptual Classification Systems with C-KAT BIBA 603-627
      Manuel Zacklad; Dominique Fontaine
    C-KAT is a method and a tool which supports the design of "feature oriented" classification systems. During the design of these systems, one is very often confronted with the problem of the "calculation of the attribute cross-product". It arises because the examination of the dependency and compatibility relations between the attributes leads to the need to generate the cross-product of their features. The C-KAT method uses a specialized Heuristic Classification conceptual model named "classification by structural shift" which sees the classification process as the matching of different classifications of the same set of objects or situations organized around different structural principles. To manage the complexity induced by the cross-product, C-KAT supports the use of a least-commitment strategy which applies in a context of constraint-directed reasoning. The method is presented using a detailed example from the field of industrial fire insurance.
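As a hedged illustration of the "attribute cross-product" problem the abstract refers to, the short Python sketch below shows how quickly the combinations grow and how a compatibility constraint prunes them; the attributes, features and the constraint are invented, not taken from C-KAT or the insurance domain.

    # Hedged illustration of the attribute cross-product and its pruning by a
    # compatibility constraint. Attributes, features and the rule are invented.
    from itertools import product

    attributes = {
        "occupancy":    ["warehouse", "workshop", "office"],
        "construction": ["timber", "steel", "concrete"],
        "sprinklers":   ["none", "partial", "full"],
    }

    def compatible(combo):
        # e.g. a timber warehouse is acceptable only with full sprinkler coverage
        return not (combo["occupancy"] == "warehouse"
                    and combo["construction"] == "timber"
                    and combo["sprinklers"] != "full")

    combos = [dict(zip(attributes, values)) for values in product(*attributes.values())]
    print(len(combos), "raw combinations")                              # 3 x 3 x 3 = 27
    print(sum(compatible(c) for c in combos), "remain after the constraint")   # 25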
    Causal Model-Based Knowledge Acquisition Tools: Discussion of Experiments BIBA 629-652
      Jean Charlet; Chantal Reynaud; Jean-Paul Krivine
    The aim of this paper is to study causal knowledge and demonstrate how it can be used to support the knowledge acquisition process. The discussion is based on three experiments we have been involved in. First, we identify two classes of Causal Model-Based Knowledge Acquisition Tools (CMBKATs): bottom-up designed causal models and top-down designed causal models. We then go on to discuss the properties of each type of tool and how they contribute to the whole knowledge acquisition process.
    Inducing Effective Operator Control through Ecological Interface Design BIBA 653-688
      William S. Pawlak; Kim J. Vicente
    Ecological Interface Design (EID) is a theoretical framework for designing interfaces for complex human-machine systems. This article investigates the utility of EID in inducing effective real-time operator control performance during both normal and abnormal conditions. Two interfaces for a thermal-hydraulic process were compared: an EID interface based on physical and functional (P + F) system representations and a more traditional interface based solely on a physical (P) representation. Subjects were given 4 weeks of daily practice with one of the two interfaces before their performance on normal events and unfamiliar faults was evaluated. Under normal conditions, there was no performance difference between the P + F and P interfaces. However, dual task results indicate that the P interface loads more on verbal resources, whereas the P + F interface loads more on spatial resources during normal trials. Furthermore, a process tracing analysis of the fault trials showed that the P + F interface led to faster fault detection and more accurate fault diagnosis. Moreover, the P + F subjects exhibited a more sophisticated and effective set of fault management strategies that are similar to those observed in field studies of experienced operators in complex human-machine systems. In addition, a deficiency of the P + F interface was identified, suggesting a need for integrating historical information with emergent feature displays. Collectively, these findings have significant practical implications for the design of advanced computer interfaces for complex industrial systems.
    Roles of Design Knowledge in Knowledge-Based Systems BIBA 689-721
      Michel Benaroch
    Recent research suggests that the abilities of a knowledge-based system (KBS) depend in part on the amount of explicit knowledge it has about the way it is designed. This knowledge is often called design knowledge because it reflects design decisions that a KBS developer makes regarding what ontologies to embody in the system, what solution strategies to apply, what system architecture to use, etc. This paper examines one type of design knowledge pertaining to the structure underlying the solutions a KBS produces. (For example, in medical diagnosis, the output might be just a disease name, but the solution is actually a causal argument that the system implicitly constructs to find out how the disease came about.) We define this type of design knowledge, show how it can be represented, and explain how it can be used in problem solving to make the structure underlying solutions explicit. Subsequently, we also present and illustrate new avenues that the availability and use of the design knowledge discussed open with respect to the ability to build KBSs that possess strong explanation capabilities, are easier to maintain, support knowledge reuse, and offer more robustness in problem solving.
    Bulletin BIB 723-729
     

    IJHCS 1996 Volume 44 Issue 6

    The Role of Cognitive Science in Human-Computer Interaction

    Editorial: The Evolving Partnership between Cognitive Science and HCI BIBA 731-741
      Elizabeth Pollitzer; Ernest Edmonds
    The theme of this special issue is "The Role of Cognitive Science in Human-Computer Interaction" (HCI). A generally accepted definition states that the main goal of HCI is to advance the design, implementation, and use of interactive computing systems by human beings (ACM, SIGCHI, 1992). Since the current primary use of computers is as tools for acting on and for observing the (information) world, the role of cognitive science -- interpreted broadly as an endeavour to understand intelligent behaviour -- is, consequently, tied to the questions:
  • how do our interactions with computing systems affect our representations of
       the objects that we manipulate?
  • how does interaction design influence our senses and our actions?
  • how does using computers to perform tasks transform our notions of the
       relationships that exist in the world around us?
    A Dual-Space Model of Iteratively Deepening Exploratory Learning BIBA 743-775
      John Rieman; Richard M. Young; Andrew Howes
    When users of interactive computers must work with new software without formal training, they rely on strategies for "exploratory learning". These include trial and error, asking for help from other users, and looking for information in printed and on-line documentation. This paper describes a cognitive model of exploratory learning, which covers both trial-and-error and instruction-taking activities. The model, implemented in Soar, is grounded in empirical data of subjects in a task-oriented, trial-and-error exploratory learning situation. A key empirical finding reflected in the model is the repeated scanning of a subset of the available menu items, with increased attention to items on each successive scan. This is explained in terms of dual search spaces, the external interface and the user's internal knowledge, both of which must be tentatively explored with attention to changing costs and benefits. The model implements this dual-space search by alternating between external scanning and internal comprehension, a strategy that gradually shifts its focus to a potentially productive route through an interface. Ways in which interfaces might be designed to capitalize on this behaviour are suggested. The research demonstrates how cognitive modelling can describe behaviour of the kind discussed by theories of "situated cognition".
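A hedged Python sketch of the scanning behaviour described, not of the Soar model itself: each pass over the menu spends more comprehension effort per item, and exploration stops as soon as some item's assessed relevance clears a threshold. The menu labels, relevance scores and threshold are invented.

    # Hedged sketch of iteratively deepening menu exploration: shallow scans
    # under-estimate relevance; deeper passes recover it. All values invented.
    menu = {"Open...": 0.2, "Paste Special...": 0.9, "Sort Ascending": 0.4}

    def assess(true_relevance, effort):
        """More comprehension effort yields a better estimate of relevance."""
        return true_relevance * min(1.0, effort / 3)

    def explore(menu, threshold=0.6, max_effort=3):
        for effort in range(1, max_effort + 1):       # successive, deeper scans
            for label, relevance in menu.items():
                if assess(relevance, effort) >= threshold:
                    return label, effort
        return None, max_effort

    print(explore(menu))   # -> ('Paste Special...', 2): chosen on the second pass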
    Developing Practice with Theory in HCI: Applying Models of Spatial Cognition for the Design of Pictorial Databases BIBA 777-799
      M. W. Lansdale; S. A. R. Scrivener; A. Woodcock
    The design and development of large pictorial databases represents a considerable challenge to the design of effective interfaces and query mechanisms. This paper reviews a project concerned with the development of theories of spatial cognition and their application to the design of pictorial databases. The aim is to investigate the feasibility of developing query methods based upon visuo-spatial methods, and to consider the implications of this for design. The purpose of this paper is to discuss the joint enterprise of psychological experimentation and system development and to consider the impact upon each discipline of the shared aim of the project. Three main conclusions are drawn: (a) useful theories of spatial memory can be developed of general utility in the design of pictorial databases; (b) however, the analysis of tasks in which pictorial databases might be used reveals a complex picture in which the specificity of task domain and visual material is more likely to dictate issues of design than is any generic theory of visual cognition. In other words, the utility of visuo-spatial methods of database encoding and query cannot be taken for granted in pictorial databases; and finally (c) projects such as this, in which psychological knowledge is used as a motivation for design innovation, appear to represent high-risk, high-return strategies of design development.
    The Skull Beneath the Skin: Entity-Relationship Models of Information Artifacts BIBA 801-828
      T. R. G. Green; D. R. Benyon
    Data modelling reveals the internal structure of an information system, abstracting away from details of the physical representation. We show that entity-relationship modelling, a well-tried example of a data-modelling technique, can be applied to both interactive and noninteractive information artifacts in the domain of HCI. By extending the conventional ER notation slightly (to give ERMIA, Entity-Relationship Modelling for Information Artifacts), it can be used to describe differences between different representations of the same information, differences between users' conceptual models of the same device, and the structure and update requirements of distributed information in a worksystem. It also yields symbolic-level estimates of Card, Pirolli and Mackinlay's index of "cost-of-knowledge" in an information structure, plus a novel index, the "cost-of-update"; these symbolic estimates offer a useful complement to the highly detailed analyses of time costs obtainable from GOMS-like models. We conclude that, as a cheap, coarse-grained, and easy-to-learn modelling technique, ERMIA usefully fills a gap in the range of available HCI analysis techniques.
    What Does Virtual Reality NEED?: Human Factors Issues in the Design of Three-Dimensional Computer Environments BIBA 829-847
      John Wann; Mark Mon-Williams
    Virtual reality (VR) has invaded the public's awareness through a series of media articles that have promoted it as a new and exciting form of computer interaction. We discuss the extent to which VR may be a useful tool in visualization and attempt to disambiguate the use of VR as a general descriptor for any three-dimensional computer presentation. The argument is presented that, to warrant the use of the term virtual environment (VE), the display should satisfy criteria that arise from the nature of human spatial perception. It directly follows, therefore, that perceptual criteria are the foundations of an effective VE display. We address the task of making a VE system easy to navigate, traverse and engage, by examining the ways in which three-dimensional perception and perception of motion may be supported, and consider the potential conflict that may arise between depth cues. We propose that the design of VE systems must centre on the perceptual-motor capabilities of the user, in the context of the task to be undertaken, and establish what is essential, desirable and optimal in order to maximize the task gains, while minimizing the learning required to operate within three-dimensional interactive displays.
    Note: Erratum on Figure 1 in Vol. 46, No. 3, p. 379
    Inter-Personal Awareness and Synchronization: Assessing the Value of Communication Technologies BIBA 849-873
      Leon Watts; Andrew Monk; Owen Daly-Jones
    How may we discriminate between the multitude of point-to-point communication facilities currently available? To take just one aspect of communication, how can we assess the fluency of coordination that results from using some communication technology? This paper describes two groups of measures with this general purpose. The measures described have been devised to be used in a particular approach to evaluation for the design of communication systems that borrows from experimental and ethnographic methods. This approach is promoted as a practical and rigorous way of assessing design alternatives.
       The first group of measures are subjective ratings that assess someone's awareness of the attentional status of their conversational partner; such awareness is necessary for the successful coordination of conversation. The rating scales are shown to be sensitive in that they distinguish between video- and audio-mediated conversation in a short experiment.
       The second group are measures derived from video records of communicative behaviour using "activity set" analysis. This can be used to assess coordination in communication directly. An activity set is a mutually exclusive and exhaustive set of behavioural states. A publicly available tool, Action Recorder, makes it possible to score the tapes in near real time. "Simple statistics" are extracted from a single activity set; examples are the proportion of time spent looking towards the video monitor and the average duration of these glances. "Contingent statistics" are extracted from two or more activity sets, for example, the proportion of time both members of a pair are looking towards their video monitors. A way of assessing the synchronization evident in two people's behaviour is presented that makes use of these contingent statistics. Inter-observer reliabilities are given for all the measures generated.
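A hedged Python sketch of the two kinds of statistic described, computed from per-second records of whether each partner is looking towards the video monitor; the two records below are invented, and in the reported work such records come from Action Recorder scorings rather than hand-typed lists.

    # Hedged sketch: "simple" and "contingent" activity-set statistics from
    # per-second gaze-at-monitor records. The two records are invented.
    a_looks = [1, 1, 0, 1, 0, 0, 1, 1, 1, 0]    # partner A, one sample per second
    b_looks = [1, 0, 0, 1, 1, 0, 1, 0, 1, 1]    # partner B

    n = len(a_looks)
    prop_a    = sum(a_looks) / n                                       # simple statistic
    prop_b    = sum(b_looks) / n                                       # simple statistic
    prop_both = sum(a and b for a, b in zip(a_looks, b_looks)) / n     # contingent statistic

    print(f"A looking {prop_a:.0%}, B looking {prop_b:.0%}, both {prop_both:.0%}")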
    Specifying Relations between Research and the Design of Human-Computer Interactions BIBA 875-920
      John Long
    This paper argues the need for more effective: human-computer interactions; design of such interactions; and research to support such design. More effective research for design would result in more effective human-computer interactions. One contribution to more effective research would be the specification of relations between research and the design of human-computer interactions. The aim of this paper is to propose such a specification. Frameworks for specifying relations are proposed for: disciplines; the human-computer interaction (HCI) general design problem; and validation. The frameworks are used to model, and so to specify, the relations: between HCI research and the HCI general design problem; and within the particular scope of HCI, to support HCI research. Together, the models specify the relations between HCI research and the design of human-computer interactions. Meeting these specifications renders HCI knowledge coherent, complete and "fit-for-design-purpose". An illustration of the relations, thus specified, is provided by a model of the planning and control of multiple task work in medical reception and its hypothetical application. The same frameworks are also used to specify the relations between Cognitive Science and the understanding of natural and artificial forms of intelligence. Lastly, they are further used to identify the relations not specified between Cognitive Science and the design of human-computer interactions. The absence of such relations renders Cognitive Science knowledge not coherent, complete nor "fit-for-design-purpose" (as opposed to "fit-for-understanding-purpose"). It is proposed how the relations specified for HCI and Cognitive Science might be used in the assessment of relations between other research and the design of human-computer interactions. Finally, the paper recommends that such an assessment should be undertaken by any discipline, such as Cognitive Science, which claims a relation between its research and the design of human-computer interactions. Such an assessment would establish whether or not such relations are, or can be, specified. The paper concludes that specification of relations is required for more effective research support for the design of human-computer interactions.
    Bulletin BIB 921-923