
Proceedings of the 1989 ACM Symposium on User Interface Software and Technology

Fullname: Proceedings of the 1989 ACM Symposium on User Interface Software and Technology
Editor: Dan R. Olsen, Jr.
Location: Williamsburg, Virginia
Dates: 1989-Nov-13 to 1989-Nov-15
Publisher: ACM
Standard No: ACM ISBN 0-89791-335-3; ACM Order Number 429892; hcibib: UIST89
Papers: 21
Pages: 179
  1. 3D/Gesture
  2. User Interface Structures I
  3. A Larger View of User Interface Design
  4. Automatic Layout
  5. Graphical Techniques and Constraints
  6. Panel
  7. User Interface Structures II
  8. Tool Kits

3D/Gesture

Specifying Composite Illustrations with Communicative Goals BIBAK 1-9
  Doree Duncan Seligmann; Steven Feiner
IBIS (Intent-Based Illustration System) generates illustrations automatically, guided by communicative goals. Communicative goals specify that particular properties of objects, such as their color, size, or location, are to be conveyed in the illustration. IBIS is intended to be part of an interactive multimedia explanation generation system. It has access to a knowledge base that contains a collection of objects, including information about their geometric properties, material, and location. As the goals are interpreted by a rule-based control component, the system generates a precise definition of the final illustration. If IBIS determines that a set of goals cannot be satisfied in a single picture, then it attempts to create a composite illustration that has multiple viewports. For example, a composite illustration may contain a nested inset illustration showing an object in greater detail than is possible in the parent picture. Each component illustration is defined by its placement, size, viewing specification, lighting specification, and list of objects to be displayed and their graphical style.
Keywords: Image synthesis, Automated picture generation, Knowledge-based graphics
The Cognitive Coprocessor Architecture for Interactive User Interfaces BIBA 10-18
  George G. Robertson; Stuart K. Card; Jock D. Mackinlay
The graphics capabilities and speed of current hardware systems allow the exploration of 3D and animation in user interfaces, while improving the degree of interaction as well. In order to fully utilize these capabilities, new software architectures must support multiple, asynchronous, interacting agents (the Multiple Agent Problem), and support smooth interactive animation (the Animation Problem). The Cognitive Coprocessor is a new user interface architecture designed to solve these two problems, while supporting highly interactive user interfaces that have 2D and 3D animations. This architecture includes 3D Rooms, a 3D analogy to the Rooms system with Rooms Buttons extended to Interactive Objects that deal with 3D, animation, and gestures. This research is being tested in the domain of Information Visualization, which uses 2D and 3D animated artifacts to represent the structure of information. A prototype, called the Information Visualizer, has been built.
Hands-on Interaction with Virtual Environments BIBAK 19-24
  David J. Sturman; David Zeltzer; Steve Pieper
In this paper we describe the evolution of a whole-hand interface to our virtual-environment graphical system. We present a set of abstractions that can be used to implement device-independent interfaces for hand measurement devices. Some of these abstractions correspond to known logical device abstractions, while others take further advantage of the richness of expression in the human hand. We describe these abstractions in the context of their use in our development of virtual environments.
Keywords: Interactive techniques, Virtual environments, Simulation

User Interface Structures I

Separating User Interface and Functionality Using a Frame Based Data Model BIBA 25-33
  Jan Wielemaker; Anjo Anjewierden
The separation between user interface and functionality found in many screen editors is generalised to handle a data model based on frames and binary relations. This paper describes a User Interface Management System (UIMS) based on the data model. The UIMS is capable of maintaining different and simultaneous representations of the same application data objects. The functionality and user interface are implemented on top of a small object oriented programming system. This allows the UIMS to be simple and independent of the graphics software and hardware as well as the data representation used by the application programs.
Standardizing the Interface Between Applications and UIMSs BIBA 34-42
  Pedro Szekely
The user interface building blocks of any User Interface Management System (UIMS) have built-in assumptions about what information about application programs they need, and assumptions about how to get that information. The lack of a standard to represent this information leads to a proliferation of different assumptions by different building blocks, hampering changeability of the user interface and portability of applications to different sets of building blocks. This paper describes a formalism for specifying the information about applications needed by the user interface building blocks (i.e. the UIMS/Application interface) so that all building blocks share a common set of assumptions. The paper also describes a set of user interface building blocks specifically designed for these standard UIMS/Application interfaces. These building blocks can be used to produce a wide variety of user interfaces, and the interfaces can be changed without having to change the application program.
An Architecture for Expert System User Interface Design and Management BIBA 43-52
  Jonas Lowgren
From a user interface point of view, expert systems are different from applications in general in that the reasoning process of the system often defines the dialogue structure. This has several advantages, but there may also be problems due to the lack of separation between functionality and user interface. This paper investigates the possibility of treating an expert system user interface as separate from the reasoning process of the system, and the consequences thereof.
   We propose that an expert system user interface can be seen as a combination of two different structures: the surface dialogue, comprising mainly lexical and syntactical aspects, and the session discourse, which represents the interaction between user and system on a discourse level. A proposed architecture for a software tool managing these two structures is presented and discussed, with particular emphasis on the session discourse manager.

A Larger View of User Interface Design

A Procedure for Evaluating Human-Computer Interface Development Tools BIB 53-61
  Deborah Hix
Improving Usability by Sharing Knowledge BIBA 62-66
  Michael D. Coble; Elizabeth G. Hetzler; Steven W. Totten
There has been great progress in the technology of improving the usability of computer tools. However, the state of the art in the user interface field is far outdistancing the state of affairs at many corporations. One reason is that the knowledge is not effectively communicated, especially in large, complex organizations.
   At McDonnell Douglas Corporation we have formed the User Interface Share Group to enhance information exchange on user interface technology. This paper discusses the motivation for the group, its formation, its value to the corporation, and some specific lessons learned.

Automatic Layout

Transformations on a Dialog Tree: Rule-Based Mapping of Content to Style BIB 67-75
  William E. Bennett; Stephen J. Boies; John D. Gould; Sharon L. Greene; Charles F. Wiecha
Scope: Automated Generation of Graphical Interfaces BIBAK 76-85
  Clifford M. Beshers; Steven K. Feiner
We describe the design and prototype implementation of Scope, a system that generates graphical user interfaces for applications programmed in C++. The programmer chooses application data objects and functions that define the capabilities of the interface. At runtime, an interface design component, implemented as a set of production system rules, transforms this semantic specification into an interface built using a window system, an associated user interface toolkit, and the hardware input devices available on the system. The rules match application requirements against a semantic description of the toolkit, selecting virtual devices for input, output, and layout. Thus, Scope uses design rules to create interfaces from high-level programming semantics that are customized both for the application and the run-time environment.
Keywords: User interface generation, User interface toolkits, Rapid prototyping
Chisel: A System for Creating Highly Interactive Screen Layouts BIBAK 86-94
  Gurminder Singh; Mark Green
The UofA User Interface Management System (UIMS) generates graphical user interfaces based on a high-level description of semantic commands supported by the application. A main part of the UIMS, called Chisel, generates the presentation component of interfaces. Chisel selects interaction techniques, determines their attributes, and places them on the screen of the display device. While doing so it is capable of considering device properties, end user's preferences, and interface designer's guidelines. The aim of this paper is to discuss in detail the design and implementation of Chisel.
Keywords: Software engineering, Design methodologies, Miscellaneous, Rapid prototyping, Computer graphics, Methodology and techniques, Interaction techniques, Design, Human factors, User interface design, User interface management systems

Graphical Techniques and Constraints

Creating Graphical Interactive Application Objects by Demonstration BIBAK 95-104
  Brad A. Myers; Brad Vander Zanden; Roger B. Dannenberg
The Lapidary user interface tool allows all pictorial aspects of programs to be specified graphically. In addition, the behavior of these objects at run-time can be specified using dialogue boxes and by demonstration. In particular, Lapidary allows the designer to draw pictures of application-specific graphical objects which will be created and maintained at run-time by the application. This includes the graphical entities that the end user will manipulate (such as the components of the picture), the feedback that shows which objects are selected (such as small boxes on the sides and corners of an object), and the dynamic feedback objects (such as hair-line boxes to show where an object is being dragged). In addition, Lapidary supports the construction and use of "widgets" (sometimes called interaction techniques or gadgets) such as menus, scroll bars, buttons and icons. Lapidary therefore supports using a pre-defined library of widgets, and defining a new library with a unique "look and feel." The run-time behavior of all these objects can be specified in a straightforward way using constraints and abstract descriptions of the interactive response to the input devices. Lapidary generalizes from the specific example pictures to allow the graphics and behaviors to be specified by demonstration.
Keywords: Software engineering, Tools and techniques, User interfaces, Computer graphics, Methodology and techniques, Human factors, User interface management systems, Interaction, Object oriented design, Direct manipulation, Interaction techniques, Programming by example
Graphical Specification of Flexible User Interface Displays BIBA 105-114
  Scott E. Hudson
This paper describes the implementation concepts behind the user interface editor of the Apogee UIMS. This editor allows many aspects of a user interface to be specified graphically without a conventional textual specification. The system supports the specification of flexible user interfaces -- ones that can adapt automatically to changes in the size of objects they present and that can adapt to specific user needs in a dynamic and responsive fashion. To serve as an implementation base for this editor, the Apogee UIMS supports an active data model based on one-way constraints. This model is implemented by a small object-oriented programming language embedded within the system.
Defining the Presentation of Application Data by a Graphical Language BIBA 115-123
  Qijing Mao; Juwei Tai
On the basis of a graphical language for defining a dynamic picture and the control actions applied to it, a system is built for developing the presentation of application data for user interfaces. This system provides user interface developers with a friendly and highly efficient programming environment.

Panel

Direct Manipulation or Programming: How Should We Design Interfaces? BIB 124-126
  Charles Wiecha

User Interface Structures II

A Gesture Based User Interface Prototyping System BIBA 127-132
  Roger B. Dannenberg; Dale Amon
GID, for Gestural Interface Designer, is an experimental system for prototyping gesture-based user interfaces. GID structures an interface as a collection of "controls": objects that maintain an image on the display and respond to input from pointing and gesture-sensing devices. GID includes an editor for arranging controls on the screen and saving screen layouts to a file. Once an interface is created, GID provides mechanisms for routing input to the appropriate destination objects even when input arrives in parallel from several devices. GID also provides low level feature extraction and gesture representation primitives to assist in parsing gestures.
An Event Language for Building User Interface Frameworks BIBAK 133-140
  Niels Vejrup Carlsen; Niels Jorgen Christensen; Hugh A. Tucker
Languages based on the event model are widely regarded as expressive and flexible notations for the specification of interactive graphical user interfaces. However, until now, they have only been used to specify and implement the dialogue control component of user interfaces.
   This paper presents an extension of the event model. A computable notation, the event language, based on this is used to construct a complete user interface framework. The framework forms the run-time component of a UIMS.
   The event language allows the modular construction of complex event systems. This is supported by the addition of a tagged addressing mode. Furthermore, the control structure of event handlers is extended with exception management, permitting unspecified events and thereby facilitating the use of predefined building blocks.
   A general purpose run-time framework for user interfaces has been constructed using the event language. We present the architecture of the presentation component of this framework including the window manager and the I/O model.
Keywords: Software engineering, Tools and techniques, User interfaces, Computation by abstract devices, Models of computation, Automata, Computer graphics, Methodology and techniques, Languages, Design, User interface management, User interface design, Human computer interaction, Event based languages
A Presentation Manager Based on Application Semantics BIBA 141-148
  Scott McKay; William York; Michael McMahon
We describe a system for associating the user interface entities of an application with their underlying semantic objects. The associations are classified by arranging the user interface entities in a type lattice in an object-oriented fashion. The interactive behavior of the application is described by defining application operations in terms of methods on the types in the type lattice. This scheme replaces the usual "active region" interaction model, and allows application interfaces to be specified directly in terms of the objects of the application itself. We discuss the benefits of this system and some of the difficulties we encountered.

Tool Kits

Using GELO to Visualize Software Systems BIBA 149-157
  Steven P. Reiss; Scott Meyers; Carolyn Duby
GELO is a package that supports the interactive graphical display of software systems. Its features include built-in panning and zooming, abstraction of objects too small to see, pick correlation, windowing, and scroll bars. GELO creates a hierarchy of graphical objects that correspond to the components of the structure being displayed. Five flavors of graphical objects are supported, including those for simple structures, tiled layouts, and graph-based layouts. This framework is powerful enough to handle a wide variety of graphical visualizations, and it is general enough that new object flavors can be smoothly integrated in the future.
   GELO is easy to learn and to use, and is presently employed in two software development environments. Among its current applications are a variety of visual languages, an interactive display of call graphs, an interactive display of data structures, and a graphical representation of module dependencies.
Unidraw: A Framework for Building Domain-Specific Graphical Editors BIBAK 158-167
  John M. Vlissides; Mark A. Linton
Unidraw is a framework for creating object-oriented graphical editors in domains such as technical and artistic drawing, music composition, and CAD. The Unidraw architecture simplifies the construction of these editors by providing programming abstractions that are common across domains. Unidraw defines four basic abstractions: components encapsulate the appearance and behavior of objects, tools support direct manipulation of components, commands define operations on components, and external representations define the mapping between components and a file or database. Unidraw also supports multiple views, graphical connectivity, and dataflow between components. This paper presents Unidraw and three prototype domain-specific editors we have developed with it: a schematic capture system, a user interface builder, and a drawing editor. Experience indicates a substantial reduction in implementation time and effort compared with existing tools.
Keywords: Object-oriented graphical editors, Direct manipulation user interfaces, Graphical constraints
Ensemble: A Graphical User Interface Development System for the Design and Use of Interactive Toolkits BIBAK 168-179
  Michael K. Powers
User Interface Development Systems (UIDSs), as opposed to User Interface Management Systems or UI Toolkits, focus on supporting the design and implementation of the user interface. This paper describes Ensemble, an experimental UIDS that begins to explore the electronic creation of interaction techniques as well as the corresponding design processes. Issues related to the impact on the components of the development system are discussed. Finally, problems with the current implementation and future directions are presented.
Keywords: User interface design tool, UIMS, Interaction techniques, User interface, Dialog model, Rapid prototyping