
Proceedings of AUIC'04, Australasian User Interface Conference

Fullname: Proceedings of the fifth conference on Australasian user interface -- Volume 28
Editors: Andy Cockburn
Location: Dunedin, New Zealand
Dates: 2004-Jan-18 to 2004-Jan-22
Publisher: ACS
Standard No: ACM DL: Table of Contents; ISBN: 1-920682-10-4; hcibib: AUIC04
Papers: 16
Pages: 126
Links: Online Proceedings
Enhancing creativity with (groupware) toolkits BIBFull-Text 3-3
  Saul Greenberg
Dogs or robots: why do children see them as robotic pets rather than canine machines? BIBAFull-Text 7-14
  B. Bartlett; V. Estivill-Castro; S. Seymon
In the not too distant future, Intelligent Creatures (robots, smart devices, smart vehicles, smart buildings, etc.) will share the everyday living environment of human beings. It is therefore important to analyze the attitudes humans adopt when interacting with morphologically different devices, based on their appearance and behavior. In particular, these devices will become multi-modal interfaces to computers, or networks of computers, for a large and complex universe of applications. Our results show that children quickly attach to the word 'dog', reflecting a conceptualization that robots that look like dogs (in particular the SONY Aibo) are closer to living dogs than they are to other devices. By contrast, adults perceive Aibo as having stronger similarities to machines than to dogs (reflected in their definitions of robot). Illustrating the characteristics captured in the definition of robot is insufficient to convince children that Aibo is closer to a machine than to a dog.
Tactons: structured tactile messages for non-visual information display BIBAFull-Text 15-23
  Stephen Brewster; Lorna M. Brown
Tactile displays are now becoming available in a form that can be easily used in a user interface. This paper describes a new form of tactile output. Tactons, or tactile icons, are structured, abstract messages that can be used to communicate information non-visually. A range of different parameters can be used for Tacton construction including: frequency, amplitude and duration of a tactile pulse, plus other parameters such as rhythm and location. Tactons have the potential to improve interaction in a range of different areas, particularly where the visual display is overloaded, limited in size or not available, such as interfaces for blind people or in mobile and wearable devices. This paper describes Tactons, the parameters used to construct them and some possible ways to design them. Examples of where Tactons might prove useful in user interfaces are given.
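The construction parameters listed in the abstract (frequency, amplitude and duration of a pulse, plus rhythm and location) suggest a simple layered data structure. The sketch below is a hypothetical illustration of that idea, not code from the paper; all names and the example values are invented.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TactilePulse:
    frequency_hz: float   # vibration frequency of this pulse
    amplitude: float      # normalised intensity, 0.0 to 1.0
    duration_ms: int      # how long the pulse lasts

@dataclass
class Tacton:
    """A structured tactile message: a rhythm of pulses delivered
    at a body location (field names here are illustrative only)."""
    pulses: List[TactilePulse]   # the rhythm, as an ordered pulse sequence
    body_location: str           # e.g. "wrist", "waist"

# A two-pulse "message received" Tacton: a short-long rhythm at 250 Hz,
# roughly the frequency range where skin sensitivity to vibration peaks.
message_received = Tacton(
    pulses=[TactilePulse(250.0, 0.8, 100), TactilePulse(250.0, 0.8, 300)],
    body_location="wrist",
)
```

Varying one parameter at a time (e.g. rhythm for message type, location for message source) is what would make such messages systematically composable rather than ad hoc vibrations.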
Revisiting 2D vs 3D implications on spatial memory BIBAFull-Text 25-31
  Andy Cockburn
Prior research has shown that the efficient use of graphical user interfaces strongly depends on human capabilities for spatial cognition. Although it is tempting to believe that moving from two- to three-dimensional user interfaces will enhance user performance through natural support for spatial memory, it remains unclear whether 3D displays provide these benefits. An experiment by Tavanti and Lind, reported at InfoVis 2001, provides the most compelling result in favour of 3D -- their participants recalled the location of letters of the alphabet more effectively when using a 3D interface than when using a 2D one. The experiment reported in this paper is based on Tavanti and Lind's, but it controls some previously uncontrolled factors. The results strongly suggest that the effectiveness of spatial memory is unaffected by the presence or absence of three-dimensional perspective effects in monocular static displays.
A knowledge management approach to user support BIBAFull-Text 33-38
  R. T. Jim Eales
This paper considers the problem of computer user support and workplace learning in general. Theoretically our work is influenced by ideas on knowledge management, expertise networks and communities of practice. Our approach seeks to tap into the powerful and situated learning potential of the collaborative support provided by colleagues. We consider that such support could be enhanced through the use of a collaborative support system. We outline our investigations into design issues, a generic model and various experiments related to the development of such a system. In particular, we emphasise the value of recorded demonstrations for representing computer-related practice. We present a number of design conclusions derived from our experiences, and warn that, while active user participation is the essential ingredient of a support system, it is perhaps the most difficult thing to achieve.
From snark to park: lessons learnt moving pervasive experiences from indoors to outdoors BIBAFull-Text 39-48
  Eric Harris; Geraldine Fitzpatrick; Yvonne Rogers; Sara Price; Ted Phelps; Cliff Randell
Pervasive technologies are increasingly being developed and used outdoors in different and innovative ways. However, designing user experiences for outdoor environments presents many different and unforeseen challenges compared with indoor settings. We report on two different projects, one held indoors and one held outdoors, that were created to explore the use of various tangible technologies and pervasive environments for extending current forms of interaction, play and learning for children. In doing so, the technologies had to be designed and adapted for the different settings. Using these projects as illustrations, this paper presents a contrasting analysis between indoor and outdoor pervasive environments, by identifying particular dimensions that change according to the location.
"Powerpoint to the people": suiting the word to the audience BIBAFull-Text 49-56
  René Hexel; Chris Johnson; Bob Kummerfeld; Aaron Quigley
A computerised system supporting public presentations that are "personalised" at two levels is now possible. Firstly, the system exploits context information to adapt the large-screen projected presentation on the basis of who is in the audience. Secondly, the system makes use of the display devices of individuals in the audience: these provide an additional and complementary display with both content and presentation adapted to the individual. We describe the architecture and issues in supporting this level of personalisation.
A Web user interface for an interactive software repository BIBA 57-64
  Stuart Marshall; Robert Biddle; James Noble
Using tools aimed at promoting the reuse of existing components costs the user in the time and effort needed to install and understand the tool. These costs could counteract or subsume the benefits of reuse argued for by reuse practitioners, rendering the activity worthless. One approach to reducing these costs is to deploy the tools in an environment that the user is already familiar with, and has easy access to. We have chosen the web as just such an environment, and this choice can have a significant impact on the usability and utility of the tool. This paper discusses the difficulties that arise from our use of the web, and the manner in which we have partly overcome these difficulties.
Visualization of travel itinerary information on PDAs BIBAFull-Text 65-71
  Masood Masoodian; Daryl Budd
Conventional travel itineraries list travel-related information, such as flights and hotel bookings, in chronological order. As such, the only observable relationship between different activities listed on a conventional itinerary is that they follow one another sequentially in time. Various graphical travel itinerary visualization systems have recently been developed to make cross-referencing between different events on an itinerary easier. These systems rely on large computer displays for visualization of itinerary information during the pre-travel planning and preparation phase, and allow access to such information using mobile phones during the actual trip, when access to a computer with a large display may not be possible. We have developed a system called PATI, which allows not only access to but also modification of personal travel itinerary information using Personal Digital Assistant-type devices. This paper describes PATI and introduces techniques used for visualization of complex itinerary information on small displays.
Display and presence disparity in Mixed Presence Groupware BIBAFull-Text 73-82
  Anthony Tang; Michael Boyle; Saul Greenberg
Mixed Presence Groupware (MPG) supports both co-located and distributed participants working over a shared visual workspace. It does this by connecting multiple single-display groupware workspaces together through a shared data structure. Our implementation and observations of MPG systems expose two problems. The first is display disparity, where connecting heterogeneous tabletop and vertical displays introduces issues in how one seats people around the virtual table and how one orients work artifacts. The second is presence disparity, where a participant's perception of the presence of others is markedly different depending on whether a collaborator is co-located or remote. This is likely caused by inadequate consequential communication between remote participants, which in turn disrupts group collaborative and communication dynamics. To mitigate display and presence disparity problems, we determine virtual seating positions and replace conventional telepointers with digital arm shadows that extend from a person's side of the table to their pointer location.
Delegation diagrams: visual support for the development of object-oriented designs BIBAFull-Text 83-89
  Ewan Tempero; James Noble; Robert Biddle
Developers have long used pictures to aid design activities and there has been a lot of interest in standard notations for design. We have developed delegation diagrams, a graphical notation that provides visual support for developing object-oriented designs and that makes the relationship between the requirements and the design explicit. We describe both the notation and tool support, and evaluate delegation diagrams using the cognitive dimensions of notations framework.
What makes a good user interface pattern language? BIBAFull-Text 91-100
  E. Todd; E. Kemp; C. Phillips
A developer of user interfaces should be able to employ a user interface pattern language to design acceptable user interfaces. But what makes a good pattern language? Three types of validation were identified as requiring consideration: the validity of the individual patterns, the internal validation of the pattern language and the external validation of the pattern language. This paper investigates internal validity. A set of six tests that a developer can use to test the internal validity of a pattern language has been identified.
Rapidly prototyping Single Display Groupware through the SDGToolkit BIBAFull-Text 101-110
  Edward Tse; Saul Greenberg
Researchers in Single Display Groupware (SDG) explore how multiple users share a single display such as a computer monitor, a large wall display, or an electronic tabletop display. Yet today's personal computers are designed with the assumption that one person interacts with the display at a time. Thus researchers and programmers face considerable hurdles if they wish to develop SDG. Our solution is the SDGToolkit, a toolkit for rapidly prototyping SDG. SDGToolkit automatically captures and manages multiple mice and keyboards, and presents them to the programmer as uniquely identified input events relative to either the whole screen or a particular window. It transparently provides multiple cursors, one for each mouse. To handle orientation issues for tabletop displays (i.e., people seated across from one another), programmers can specify a participant's seating angle, which automatically rotates the cursor and translates input coordinates so the mouse behaves correctly. Finally, SDGToolkit provides an SDG-aware widget class layer that significantly eases how programmers create novel graphical components that recognize and respond to multiple inputs.
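The seating-angle mechanism the abstract describes — rotating cursor and input coordinates so a mouse "behaves correctly" for someone seated at an angle to the tabletop — amounts to a rotation of each raw mouse movement. The sketch below is a hypothetical illustration of that geometry, not the SDGToolkit's actual API; the function name and parameters are invented.

```python
import math

def rotate_mouse_delta(dx, dy, seat_angle_deg):
    """Rotate a raw mouse movement (dx, dy) by a participant's seating
    angle, so that pushing the mouse 'forward' moves the cursor away
    from that person's edge of the table regardless of where they sit."""
    theta = math.radians(seat_angle_deg)
    rx = dx * math.cos(theta) - dy * math.sin(theta)
    ry = dx * math.sin(theta) + dy * math.cos(theta)
    return rx, ry

# A participant seated directly opposite (180 degrees): pushing the
# mouse away from themselves (dy = -1 in their frame) should move the
# cursor in the opposite direction in screen coordinates.
rx, ry = rotate_mouse_delta(0, -1, 180)
```

The same rotation, applied in reverse to the cursor glyph itself, keeps the pointer visually upright from each participant's viewpoint.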
e-Ghosts: leaving virtual footprints in ubiquitous workspaces BIBAFull-Text 111-116
  Michael Vernik; Steven Johnson; Rudi Vernik
Ubiquitous workspaces are future media-rich environments that employ new forms of operating systems and services to coordinate and manage interactions between people, multiple display surfaces, information, personal devices, and workspace applications. e-Ghosts is a meta application which makes novel use of workflow to support the coordination and orchestration of briefings and demonstrations within a ubiquitous workspace. This paper describes this new class of application and discusses a range of characteristics that need to be considered in relation to their design and implementation. Managing the trade-offs between interactive and automated modes in such dynamic and complex environments is of particular concern.
Rapid visual flow: how fast is too fast? BIBAFull-Text 117-122
  Andrew Wallace; Joshua Savage; Andy Cockburn
It is becoming increasingly common for user interfaces to use zooming visual effects that automatically adapt to user actions. The Mac OS X 'dock' icon panel, for instance, uses a fisheye distortion to assist users in targeting items. Another example is 'speed-dependent automatic zooming', which has been shown to improve scrolling by automatically varying zoom level with scroll speed: when scrolling fast the document is zoomed out, but when scrolling slowly the document is fully zoomed in. When implementing automatic zooming interfaces, designers must calibrate the behaviour of their zooming systems so that the visual effects allow rapid navigation without stressing the human visual system. At present, these calibrations are derived from trial and error. This paper describes an attempt to determine metrics of visual flow to answer the question "how fast is too fast?" Our main focus is on automatic zooming in document scrolling tasks. We performed an experiment to measure participants' preferred and maximum-tolerable scrolling speeds at two different magnifications. We found that magnification affected the length of time that data needed to remain on screen. We also used the data to provide estimations regarding the appropriate calibration of threshold values in speed-dependent automatic zooming systems.
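The speed-to-zoom coupling in speed-dependent automatic zooming can be sketched as a simple clamped mapping. This is a minimal illustration of the general idea, not the paper's model; all constants below are invented placeholders for exactly the calibration values the experiment sets out to determine.

```python
def zoom_for_speed(scroll_speed, min_zoom=0.2, max_zoom=1.0,
                   threshold=50.0, rate=0.004):
    """Map scroll speed (document pixels per second) to a zoom factor.

    Below the threshold speed the document stays fully zoomed in
    (max_zoom); above it the view zooms out linearly with speed,
    clamped at min_zoom. All constants are illustrative only and
    would need empirical calibration.
    """
    if scroll_speed <= threshold:
        return max_zoom
    zoom = max_zoom - rate * (scroll_speed - threshold)
    return max(min_zoom, zoom)
```

Measured preferred and maximum-tolerable scrolling speeds at known magnifications are precisely what would pin down `threshold`, `rate`, and `min_zoom` in a mapping like this.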
Wearable microphone array as user interface BIBAFull-Text 123-126
  Yong Xu; Mingjiang Yang; Yanxin Yan; Jianfeng Chen
Today's technologies are largely machine-empowered; the future certainly points towards human-empowered technologies, which should equip mobile users with natural wearable devices and natural user interfaces. This paper presents a proof of concept for a wearable microphone array system that could be embedded in textiles to provide a hands-free, obtrusion-free and hassle-free interface to a body-worn computer (replaced for testing by a speech recognition engine on a PC). In one mode the microphone array serves as a speech interface for the wearer alone; in another it collects the voices of others, with the array steered by voice control for maximum sensitivity. The details of the microphone array systems are beyond the scope of this short paper. The concepts and the preliminary promising results are presented here to register our thoughts with the wearable computers and user interfaces community.