Advances in Personal Construct Technology | | BIB | 1-2 | |
Mildred L. G. Shaw |
The Design, Analysis and Interpretation of Repertory Grids | | BIB^{A} | 3-24 | |
Mark Easterby-Smith | |||
This paper is intended for those with some knowledge of the repertory grid technique who would like to experiment for themselves with new forms of grid. It is argued that, because the technique is quite powerful and the basic principles of its design are easy to grasp, there is some danger of its being used inappropriately. Inappropriate applications may be harmful both to those involved directly and to the general reputation of the technique itself. The paper therefore surveys a range of alternatives in the design of grids, and discusses the factors that are important to consider in these cases. But even if a design has been produced which is inherently "good", any application based on it will be of doubtful value unless prior thought has been given to the availability of analytic techniques and to the means of interpreting the results. Hence the paper outlines a number of approaches to the analysis of grids (both manual and computer-based), and illustrates the possible process of interpretation in a number of cases. |
One Thing Leads to Another: A New Approach to Elicitation in the Repertory Grid Technique | | BIB^{A} | 25-38 | |
Terence R. Keen; Richard C. Bell | |||
This paper describes an interactive computer program for the elicitation of
a repertory grid. The elicitation approach adopted is unique in that it can
only be practically undertaken by computer. This represents a move from
"classical" techniques (interactive or otherwise), and enables the respondent
to be an active rather than passive participant.
The approach is claimed by the authors to be nearer to Kelly's concept of conversation than other interactive techniques. |
Education for Research: The Changing Constructs of the Postgraduate | | BIB^{A} | 39-48 | |
Estelle M. Phillips | |||
The development of research skills was investigated in case studies of seven
Ph.D. students and their supervisors. A combination of repertory grids and
interviews was used to monitor changes over time. Focus and Core analyses,
together with feedback sessions, helped to isolate specific areas of importance
to the postgraduates.
Results indicated that (a) it was necessary for the students to develop an ability to evaluate their own work; (b) the pace of this development appeared to be related to the degree to which the students were allowed to remain dependent on their supervisors; (c) their enthusiasm for their Ph.D. diminished due to the length of time they had to spend working on a single problem. In addition, it appeared that providing information from the repertory grid to the students helped them to learn from their experiences of the research training process. |
Construct Systems in Conflict | | BIB^{A} | 49-57 | |
Patrick Slater | |||
From May 1976 to December 1977 the Social Science Research Council supported research into a technique
for measuring differences of opinion in a dispute. It was found that in many
cases where informants differed in their opinions about a particular topic,
grids aligned by element or construct or both could not be devised for
comparing them. Each protagonist made use of his own set of terms and had no
use for the other's. The methods described by Slater (1977) could not be applied.
A substantial modification of grid technique, called the Dual Grid, was devised to bring the views of both sides into a single frame where they can be compared. Instead of using constructs and elements as its functions, it uses complete propositions. Experimental work with dual grids has not yet been carried very far; one instance is given. |
ARGUS: A Program to Explore Intra-Personal Personalities | | BIB^{A} | 59-68 | |
Mildred L. G. Shaw; Cliff McKnight | |||
This paper is based on the idea that we each have several "personalities"
within us. An interactive computer program (ARGUS) is described which allows
the user to explore his several personalities and the relationships between
them. The program is seen as having a wide range of application, and two
particular areas are developed in the present paper. |
Construct Heterarchies | | BIB^{A} | 69-79 | |
Ranulph Glanville | |||
This paper presents a technique for deriving individual construct heterarchies, and for comparing several such heterarchies without loss of sharpness in the initial act of constructing. It explains uses -- both potential and in practice. The technique is related to Kelly's Personal Construct Theory, and some of its limitations and implications for that Theory are explained. |
New Directions in the Analysis and Interactive Elicitation of Personal Construct Systems | | BIB^{A} | 81-116 | |
Brian R. Gaines; Mildred L. G. Shaw | |||
The computer elicitation and analysis of personal construct systems has become a technique of great interest and wide application in recent years. This paper takes the current state of the art as a starting point and explores further developments that are natural extensions of it. The overall objective of the work described is to develop man-computer symbiotic systems in which the computer is a truly dialectical partner to the person in forming theories and making decisions. A logical model of constructs as predicates applying to elements is used to develop a logical analysis of construct structures and this is contrasted with various distance-based clustering techniques. A grid analysis program called ENTAIL is described based on these techniques which derives a network of entailments from a grid. This is compared and contrasted with various programs for repertory grid analysis such as INGRID, FOCUS and Q-Analysis. Entailment is discussed in relation to Kelly's superordination hierarchy over constructs and preference relations over elements. The entailment analysis is extended to rating-scale data using a fuzzy semantic model. The significance of Kelly's notion of the opposite to a construct as opposed to its negation is discussed and related to other epistemological models and the role of relevance. Finally, the interactive construct elicitation program PEGASUS is considered in terms of the psychological and philosophical importance of the dialectical processes of grid elicitation and analysis, and recommendations are made about its generalization and extension based on the logical foundations described. Links are established between the work on repertory grids and that on relational data bases and expert systems. |
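The core of such an entailment analysis is easy to illustrate. The sketch below is not the ENTAIL program itself, and the grid and construct names are hypothetical; it treats each construct as a predicate over the elements of a dichotomous grid and emits an arc c1 -> c2 whenever the elements satisfying c1 form a subset of those satisfying c2:

```python
def entailments(grid, names):
    """Derive entailment arcs from a dichotomous repertory grid.
    grid[i][e] == 1 means element e is assigned pole 1 of construct i.
    Construct i entails construct j when every element on pole 1 of i
    is also on pole 1 of j (a subset relation between predicates)."""
    arcs = []
    for i in range(len(grid)):
        for j in range(len(grid)):
            if i != j and all(a <= b for a, b in zip(grid[i], grid[j])):
                arcs.append((names[i], names[j]))  # names[i] -> names[j]
    return arcs

# hypothetical 3-construct x 4-element dichotomous grid
names = ["generous", "kind", "mean"]
grid = [[1, 0, 1, 0],   # generous
        [1, 1, 1, 0],   # kind
        [0, 1, 0, 1]]   # mean

arcs = entailments(grid, names)  # "generous" entails "kind" on this grid
```

Kelly's superordination then corresponds to paths in the resulting network; the rating-scale extension described in the paper replaces this crisp subset test with a fuzzy semantic model.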
Subjective Multi-Criteria Decision Making | | BIB^{A} | 117-141 | |
F. Eshragh | |||
This paper outlines the principles of a new technique used in
operationalization of subjective decision making, in general, and
multi-criteria decision processes in particular. The work is based on the
psychological theory of personal constructs, introduced by George Kelly in
1955, and highlights the greater emphasis which should be placed upon personal
judgement and individual values. The principles of repertory grids are
employed as the basis for implementation of this idea.
CODEM2 -- COnversational DEcision Making -- is the interactive software tool developed in the course of this work. Operational detail of this program is exemplified through an appropriate example. |
A Statistical Aid for the Grid Administrator | | BIB^{A} | 143-150 | |
Richard C. Bell; Terence R. Keen | |||
In this paper the authors consider the problem of obtaining statistical information about a repertory grid during its elicitation. A measure of cognitive complexity, element intraclass correlation, provides the administrator of the grid with information about the change in the respondent's cognitive complexity as each additional construct is elicited and scored on the element sample. The approach is illustrated with post hoc analyses of 20 grids and shows the benefit of having such information available during the process of elicitation. |
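As a concrete illustration of the kind of statistic involved, here is a minimal sketch assuming the one-way intraclass correlation ICC(1), with elements as targets and constructs as raters (the authors' exact formulation may differ), recomputed as each construct beyond the first is elicited:

```python
import numpy as np

def icc1(grid):
    """One-way intraclass correlation ICC(1): elements as targets,
    constructs as raters.  grid: (constructs, elements) rating matrix."""
    k, n = grid.shape                 # k constructs rate n elements
    by_elem = grid.T                  # (n, k): each row is one element's ratings
    grand = by_elem.mean()
    msb = k * ((by_elem.mean(axis=1) - grand) ** 2).sum() / (n - 1)
    msw = ((by_elem - by_elem.mean(axis=1, keepdims=True)) ** 2).sum() / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)

# hypothetical grid: 3 constructs elicited in turn over 4 elements
grid = np.array([[1, 2, 4, 5],
                 [2, 1, 5, 4],
                 [1, 1, 4, 4]])
# track the complexity measure as each construct beyond the first is added
trace = [icc1(grid[:m]) for m in range(2, grid.shape[0] + 1)]
```

A falling trace would signal that the new construct discriminates among the elements in a way the earlier constructs did not, which is the kind of in-flight feedback the paper argues the administrator should have.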
Direct Analysis of a Repertory Grid | | BIB^{A} | 151-166 | |
Chris Leach | |||
A new exploratory method of analysing data in the form of a repertory grid
is described. The method starts by carrying out single-link hierarchical
cluster analyses of the elements and the constructs separately. These two
marginal analyses are then used to rearrange the rows and columns of the
original grid so that similar constructs and similar elements are grouped
together. Data clusters are then identified that indicate those constructs or
groups of constructs responsible for the groupings of the elements. The data
clusters also take the form of a tree. The result of the analysis is a
rearrangement of the original grid on which the row and column marginal trees
and the data clusters may be superimposed.
The direct method presented here is based on a modification of Hartigan's (1975) joiner-scaler algorithm. It is useful for repertory grids since it emphasizes the interaction between constructs and elements, making it easier to identify unusual applications of constructs. This makes it particularly attractive in clinical settings. An added bonus is that the presentation of results is sufficiently simple to make it useful for the clinician who needs a way of identifying important structural aspects of the grid that does not depend on a detailed understanding of data analysis. The method may be applied equally well to dichotomous, ranked or rating scale versions of a repertory grid. Missing entries, which may arise as a result of a construct not being applicable to some of the elements, may also be included. |
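The first stage of such an analysis can be sketched briefly. The following is a minimal illustration, not Leach's joiner-scaler modification itself: single-link clustering of constructs and of elements (city-block distance, hypothetical ratings), used only to reorder the grid so that similar rows and columns become adjacent:

```python
import numpy as np

def single_link_order(dist):
    """Leaf ordering from single-link agglomerative clustering.
    dist: symmetric (n, n) array of pairwise distances."""
    clusters = [[i] for i in range(dist.shape[0])]
    while len(clusters) > 1:
        best, best_d = (0, 1), float("inf")
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                # single link: cluster distance is the minimum pairwise distance
                m = min(dist[i, j] for i in clusters[a] for j in clusters[b])
                if m < best_d:
                    best_d, best = m, (a, b)
        a, b = best
        clusters[a] += clusters[b]
        del clusters[b]
    return clusters[0]

# hypothetical 4-construct x 5-element rating grid
grid = np.array([[1, 5, 1, 4, 5],
                 [2, 5, 1, 5, 4],
                 [5, 1, 4, 1, 2],
                 [1, 4, 2, 4, 5]])

# city-block distances between constructs (rows) and elements (columns)
c_dist = np.abs(grid[:, None, :] - grid[None, :, :]).sum(axis=2)
e_dist = np.abs(grid.T[:, None, :] - grid.T[None, :, :]).sum(axis=2)

row_order = single_link_order(c_dist)
col_order = single_link_order(e_dist)
reordered = grid[np.ix_(row_order, col_order)]  # similar rows/columns now adjacent
```

The direct method then goes further, identifying data clusters on the reordered grid and superimposing the marginal trees on it.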
"On Becoming a Personal Scientist," by M. L. G. Shaw | | BIB | 167-168 | |
Gordon Pask |
Q-Analysis, or a Language of Structure: An Introduction for Social Scientists, Geographers and Planners | | BIB^{A} | 169-199 | |
P. Gould | |||
The paper presents an introduction to the basic concepts of Q-analysis as a descriptive and analytical language of structure, focusing upon definitions, set relations, hierarchical structures of cover sets, geometrical backcloth and supported traffic, obstruction and eccentricities. The algebraic topological perspective points up sharply the limitations of conventional multivariate and numerical taxonomic methods. Numerous examples from a wide spectrum of enquiry in the social, behavioral, medical and planning sciences illustrate the broad applications of such a higher-order algebraic language and methodology. |
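The central primitive of Q-analysis, the shared face between two simplices, is simple to compute. A minimal sketch with a hypothetical incidence matrix (rows as simplices, columns as vertices):

```python
import numpy as np

def shared_face_dim(incidence, a, b):
    """Dimension of the shared face of simplices a and b:
    (number of vertices they share) - 1; -1 means disjoint."""
    return int(np.minimum(incidence[a], incidence[b]).sum()) - 1

# hypothetical incidence matrix: 3 simplices over 4 vertices
inc = np.array([[1, 1, 1, 0],
                [0, 1, 1, 1],
                [1, 0, 0, 0]])

d01 = shared_face_dim(inc, 0, 1)  # two shared vertices -> a shared 1-face (edge)
d12 = shared_face_dim(inc, 1, 2)  # no shared vertices -> disjoint
```

Two simplices are then q-connected when a chain of simplices joins them, each adjacent pair sharing at least a q-dimensional face; counting the resulting components at each q gives the structure vector.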
An Application of File-Comparison Algorithms to the Study of Program Editors | | BIB^{A} | 201-211 | |
P. Anandan; D. W. Embley; G. Nagy | |||
Program editing is considered in terms of a file-comparison model that formalizes the transformation of an imperfect version of a program text into an improved version by means of editing operations. With some enhancement, existing file-comparison algorithms can produce the information required for the model. These enhancements include the introduction of logical levels in files, selection of a corresponding element among alternatives, and the detection and analysis of similarity. An algorithm that incorporates these modifications is described. Illustrative of the use of the file-comparison model, a high-level editing sequence for a particular text editor and a particular editing task is automatically produced and is found to be comparable to typical user-produced editing sequences. Potential applications of the file-comparison model to the study of program editors are outlined. |
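The underlying file-comparison step is the classic longest-common-subsequence computation. The sketch below is not the authors' enhanced algorithm, and the program lines are hypothetical; it derives a basic insert/delete editing sequence between two versions of a text:

```python
def lcs_edit_script(old, new):
    """Derive insert/delete operations turning old into new via a
    longest-common-subsequence table (classic file-comparison approach)."""
    m, n = len(old), len(new)
    L = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            L[i + 1][j + 1] = (L[i][j] + 1 if old[i] == new[j]
                               else max(L[i][j + 1], L[i + 1][j]))
    # backtrack through the table to recover the editing operations
    ops, i, j = [], m, n
    while i > 0 or j > 0:
        if i > 0 and j > 0 and old[i - 1] == new[j - 1]:
            i, j = i - 1, j - 1          # common line: no operation needed
        elif j > 0 and (i == 0 or L[i][j - 1] >= L[i - 1][j]):
            ops.append(("insert", j - 1, new[j - 1]))
            j -= 1
        else:
            ops.append(("delete", i - 1, old[i - 1]))
            i -= 1
    return ops[::-1]

old = ["int x;", "x = 1;", "print(x)"]
new = ["int x;", "x = 2;", "print(x)"]
script = lcs_edit_script(old, new)
```

The paper's enhancements (logical levels, choice among alternative correspondences, similarity detection) build on exactly this kind of base comparison.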
A Model of Memory with Storage Horizon Control | | BIB | 213-221 | |
Maria Nowakowska |
Semiotic Systems and Knowledge Representation | | BIB^{A} | 223-257 | |
Maria Nowakowska | |||
This paper deals with cognitive processes and knowledge representation. The
starting point is an analysis of an object, treated as a relational system and
its representation. The formalism developed is especially convenient to
capture the problems of observability and change.
After these considerations, a formal theory of semiotics is outlined, allowing a new development and unified treatment (within the paradigm of fuzzy set theory) of such topics as: the structure of signs and their configurations (spatial and temporal), i.e. formal representations of situations and events; the structure of the cognitive representations of signs, leading to the construction of a language of semantics; and the structure of verbal representations of objects (their verbal copies), the latter analysed by means of the methods of mathematical linguistics and leading, among other things, to the foundations of formal text theory. As regards verbal copies, the main stress is put on cognitive constraints on descriptions imposed by the limitations of knowledge and/or observability, and on constraints imposed by the language used. The properties of verbal copies can also be fruitfully related to theories of perception and memory. The set of interrelated theories presented here, introduced for the first time by the author, allows for a compact and unified treatment of topics which were traditionally considered separately, as parts of psychology, philosophy or logic. Formal semiotics, as sketched here, goes far beyond the notions of Peirce, leading to a system richer than taxonomical considerations and allowing for the development of semiotic theory. |
Predictive Analysis in Sentence Comprehension: A Computer Simulation Model for Surface Structure Parsing | | BIB^{A} | 259-294 | |
C. P. Whaley | |||
Though numerous models have been proposed by linguists and computer scientists for the parsing of natural language, Kimball (1973) has outlined one of the few that takes into account the operational limitations of the human perceiver. His predictive analysis model is intuitively appealing and is supported by empirical research (e.g. Whaley, 1979). This paper presents the parsing principles suggested by Kimball, and describes a computer simulation which incorporates them. The parsing accuracy of the model is demonstrated and discussed for various sentence types. Finally, extensions to the existing model are considered including the quantification of the model's predictions. |
Dialogue Determination | | BIB^{A} | 295-304 | |
Harold Thimbleby | |||
A new term, determination, is introduced to help describe the quality of interactive systems' user interfaces. A well determining interface is neither too under-determining nor too over-determining for its user; under-determination can be brought about by excessive secrecy and over-determination by excessive authoritarianism on the part of the computer (or its programmers). The concept is used to elucidate several important aspects of effective interaction. Determination is not solely a property of system design but depends on the experience and values of the user. |
The Model's Dimensions: A Form for Argument | | BIB^{A} | 305-322 | |
Ranulph Glanville | |||
This paper explores the formal relationship between two observations of Objects, so related that one is a model of the other. Models are shown to have two dimensions with two directions, of the two types iso- and homomorphic. The dimensions of models, made analogous to the dimensions of physics, are examined when a string of modelling processes is executed. Means of compressing such strings, using references to model dimensions and directions, are shown, and the essential difference between iso- and homomorphic models is discussed, highlighting the non-model characteristics of isomorphism. Finally, the analysis is applied to the form of arguments, allowing the checking of (for instance) analogies, and revealing the proper level for the response in an argument. |
An Approach to Inference in Approximate Reasoning | | BIB^{A} | 323-338 | |
Ronald R. Yager | |||
We investigate the problem of making inferences based on fuzzy conditional statements, and discuss a new operation, based upon exponentiation, for both multivalued implication and fuzzy inference. |
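Yager's exponentiation-based implication is I(a, b) = b^a (with 0^0 taken as 1). A minimal sketch of its use in the compositional rule of inference, with hypothetical fuzzy sets over small discrete universes:

```python
def yager_impl(a, b):
    """Yager's implication I(a, b) = b ** a, with the convention 0 ** 0 = 1."""
    return 1.0 if a == 0.0 and b == 0.0 else b ** a

def infer(a_obs, a, b):
    """Compositional rule of inference for "if A then B" given observation A':
    B'(y) = max over x of min(A'(x), I(A(x), B(y)))."""
    return [max(min(ap, yager_impl(ax, by)) for ap, ax in zip(a_obs, a))
            for by in b]

# hypothetical fuzzy sets: rule "if A then B", observed fact A_obs close to A
A = [0.0, 0.5, 1.0]
B = [0.2, 1.0]
A_obs = [0.0, 0.6, 1.0]

B_inferred = infer(A_obs, A, B)
```

Note how the exponent sharpens the consequent where the antecedent holds strongly (b ** 1 = b) and relaxes it toward 1 where it holds weakly, which is the behaviour the operator is designed to capture.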
An Experiment Using Memorization/Reconstruction as a Measure of Programmer Ability | | BIB^{A} | 339-354 | |
Tom Di Persio; Dan Isbister; Ben Shneiderman | |||
Measuring the abilities of programmers in a classroom or organizational setting is not a trivial task. Current approaches are not always accurate or reliable. This paper describes an experiment which provides evidence that performance on a memorization/reconstruction test can be used as a measure or predictor of programmer ability. The contribution of indentation in program comprehension is also examined. |
Computers and People Book Series | | BIB | 355 | |
B. R. Gaines; D. R. Hill |
Developments in Conversation Theory -- Part 1 | | BIB^{A} | 357-411 | |
Gordon Pask | |||
This paper is the first in a series describing developments in conversation theory and related work during the past 8 years. |
A Generalized Fuzzy Relational Matrix | | BIB^{A} | 413-421 | |
J. Jantzen | |||
A general and operational way of performing binary algebraic operations on
fuzzy sets is proposed by means of a 3-dimensional fuzzy relational array.
Universes of discourse are discretized and represented by equidistant points.
A system's set of equations involving binary algebraic operations is perceived as a set of relations between the dependent and independent variables of the system, leading to a computerized dynamic model of the system based on fuzzy set theory. Topological graphs provide a means for displaying the structure of the system, and the system's graph is used as a tool in the design of the computer model operating on fuzzy sets. The analogy between fuzzy relational matrices on the one hand and geometrical curves on the other is extended to an analogy between 3-dimensional fuzzy arrays and geometrical surfaces in space. |
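The idea of encoding a binary algebraic operation as a 3-dimensional fuzzy relational array can be sketched as follows; the discretized universes and membership values are hypothetical, and fuzzy addition stands in for the general binary operation:

```python
import numpy as np

# equidistant discretizations of the universes of x, y and z = x + y
xs = np.array([0, 1, 2])
ys = np.array([0, 1])
zs = np.array([0, 1, 2, 3])

# 3-D relation R[i, j, k]: 1 exactly where xs[i] + ys[j] == zs[k]
R = np.array([[[1.0 if x + y == z else 0.0 for z in zs] for y in ys] for x in xs])

mu_x = np.array([0.2, 1.0, 0.5])   # fuzzy "about 1" on xs
mu_y = np.array([1.0, 0.4])        # fuzzy "about 0" on ys

# sup-min composition: mu_z(k) = max over (i, j) of min(mu_x(i), mu_y(j), R[i, j, k])
pair_min = np.minimum.outer(mu_x, mu_y)          # (3, 2) grid of min(mu_x, mu_y)
mu_z = np.max(np.minimum(pair_min[:, :, None], R), axis=(0, 1))
```

Chaining such relations, one per equation in the system, yields the kind of fuzzy dynamic model the paper describes, with the system graph indicating which arrays compose with which.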
Algorithmic Search in Management Decision Systems | | BIB^{A} | 423-435 | |
Laurence A. Madeo; Thomas J. Schriber | |||
Research on computer-based management decision systems has analyzed such
factors as cognitive style, organizational influences, and format of output.
This paper contributes to that body of research by proposing a combination of
human (trial-and-error) search with algorithmic search. The combination,
called guided search, is described together with an experimental setting for
testing its usefulness.
Phase I of the experiment involved decision making in the context of four decision variables. In this phase, there was no significant difference in achieved objective function value between those with human search and those with guided search. Phase II took place with 11 decision variables. The subjects with guided search obtained significantly better decisions than did the subjects with human search. Although the subjects with guided search required more central processor time than did their counterparts, they used no more elapsed time and did not enter more user input. The experimental results support the belief that in complex situations guided search can be an effective aid in decision making. |
Automated Management of Communications with Remote Systems: A Decision Analysis Approach | | BIB^{A} | 437-453 | |
Randall Steeb | |||
Management of communications between a remote system and a supervisory human operator is viewed as a recurrent, complex decision task. The selection of information for transmission to the operator is a subjective, risky decision involving many factors -- system state, operator capabilities, communications costs, and channel limitations, among others. An adaptive program has been developed which incorporates many of these factors into a decision model. The program is designed to infer the operator's decision policy by using a training algorithm based on pattern recognition techniques. Some exploratory studies of the approach are described. |
Lattice Fuzzy Logics | | BIB^{A} | 455-465 | |
Ernest A. Edmonds | |||
Although the characterizing membership functions of fuzzy sets normally have the interval [0,1] as their range, the range can instead be a partially ordered set. The use of lattices for this set is explored. Various forms of restricted infinite lattice are considered. Rose's logical operators for logics whose truth values form lattices are reviewed. A basis for lattice fuzzy logics, using Rose's operators, is discussed, and a particular infinite lattice is proposed for use in characterizing lattice fuzzy sets. Some of the concepts are used to extend edge-detection techniques in image processing from grey-scale to colour images. |
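A minimal sketch of a lattice-valued fuzzy set, using the product lattice [0,1] x [0,1] ordered componentwise (one might read the two components as, say, two colour channels; the sets here are hypothetical):

```python
# join and meet on the product lattice [0,1] x [0,1], ordered componentwise
def join(a, b):
    return tuple(max(x, y) for x, y in zip(a, b))

def meet(a, b):
    return tuple(min(x, y) for x, y in zip(a, b))

# lattice-valued fuzzy sets over a 3-element universe
A = [(0.9, 0.1), (0.5, 0.5), (0.0, 1.0)]
B = [(0.3, 0.4), (0.6, 0.2), (0.2, 0.8)]

union = [join(a, b) for a, b in zip(A, B)]          # pointwise join
intersection = [meet(a, b) for a, b in zip(A, B)]   # pointwise meet
# note (0.9, 0.1) and (0.3, 0.4) are incomparable: neither dominates the other,
# which is precisely what distinguishes a lattice range from the interval [0,1]
```

The incomparable pairs are the point of the construction: membership values can differ without one being "more true" than the other, something a totally ordered range cannot express.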
"Fuzzy Automata and Decision Processes," edited by M. M. Gupta, G. N. Saridis and B. R. Gaines | | BIB | 467-468 | |
Ronald R. Yager |
"Man/Computer Communication," edited by B. Shackel | | BIB | 469-471 | |
H. T. Smith |