
Proceedings of AUIC'05, Australasian User Interface Conference

Fullname: Proceedings of the Sixth Australasian Conference on User Interface -- Volume 40
Editors: Mark Billinghurst; Andy Cockburn
Location: Newcastle, NSW, Australia
Dates: 2005-Jan
Publisher: ACS
Standard No: ISBN: 1-920682-22-8; ACM DL: Table of Contents; hcibib: AUIC05
Papers: 16
Pages: 136
Links: Online Proceedings
Creative research & development collaborations BIBAFull-Text 3-3
  Jeff Jones
Around the world there are many examples of how universities and industry work together to create new knowledge and new economic value. Motivating creative, smart people to work together can be one of the hardest aspects of such endeavours, especially where various arrangements and agreements need to take the commercial interests of industry into account as well as the highly aspirational interests of individual researchers. Professor Jeff Jones, CEO and Research Director of ACID -- the Australasian CRC for Interaction Design -- will discuss some of the key challenges to establishing robust collaborative arrangements, the difficulties that arise where geographic distance is an issue, and the fantastic project teams and outcomes that are possible. Jones will also showcase some of the interaction design projects already underway. Interaction design is about finding better ways for people to interact with each other through communication technologies. Interaction design involves understanding how people learn, work and play so that we can engineer better, more valuable and more appropriate technologies for the contexts of their lives. As an academic discipline, interaction design is about the people-research that underpins the development of these technologies. For ACID, interaction design is commercially focused to help people participate in the digital world. ACID is uniquely situated in the world to help Australasia take advantage of the hybrid Indigenous-Asian-European-North American design and creative industries. ACID has a defined capacity to integrate the hard-edged commercial focus of its Smart Living projects with the distinctive and responsible activities in the Virtual Heritage projects. But to make ACID truly distinctive we've combined these with technologies in Digital Media projects and the mass-distribution and social capacity development expertise emerging in the Multi-user Environments projects. This multifaceted integration of technology, methods, domain knowledge and culture provides ACID and the Australasian economy with an ultimate value differentiator and some truly sustainable advantages.
Hand tracking for low powered mobile AR user interfaces BIBAFull-Text 7-16
  Ross Smith; Wayne Piekarski; Grant Wigley
Mobile augmented reality systems use general purpose computing hardware to perform tasks such as rendering computer graphics, providing video overlay, and performing vision tracking. Our current Tinmith-Metro modelling system provides a user interface based on tracking the motions of gloves worn by the user, but this tracking is implemented inefficiently on a mobile laptop carried in a backpack by the user. This paper describes how we have developed a tracking algorithm which is suitable for implementation in a field programmable gate array. This implementation uses minimal power and will allow future miniaturisation of our mobile backpack equipment. We present the results of studies conducted outdoors to find the most appropriate marker type to use, and also the overall results that were achieved during testing.
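As a rough illustration of the kind of per-pixel marker detection that suits a fixed-function hardware pipeline, the sketch below locates a coloured glove marker by thresholding and centroid finding; it is a minimal, assumption-laden example, not the Tinmith-Metro or FPGA implementation described in the paper.

```python
# Illustrative sketch only (not the paper's implementation): locating a
# coloured glove marker in a video frame by thresholding and centroid finding,
# the kind of per-pixel pipeline that maps naturally onto FPGA logic.
import numpy as np

def find_marker_centroid(frame_rgb, lower, upper):
    """Return the (x, y) centroid of pixels inside the RGB range, or None."""
    mask = np.all((frame_rgb >= lower) & (frame_rgb <= upper), axis=-1)
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Example: a synthetic 240x320 frame with a bright orange patch as the "marker".
frame = np.zeros((240, 320, 3), dtype=np.uint8)
frame[100:110, 200:210] = (255, 128, 0)
print(find_marker_centroid(frame, lower=(200, 64, 0), upper=(255, 192, 64)))
```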
Interaction with partially transparent hands and objects BIBAFull-Text 17-20
  Volkert Buchmann; Trond Nilsen; Mark Billinghurst
Visual concealment of important objects and information by hands or tools can make many tasks more difficult. To alleviate this problem, hands and tools can be made partially transparent, allowing users to see their tools and work area simultaneously. This paper describes our experience with a system that can control the level of transparency of hands, tools and objects. We describe how users performed with uniform transparency across objects and with selective transparency where important details of objects are made less transparent. We identify several perceptual issues with this interface and propose possible solutions.
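The sketch below illustrates the general idea of rendering a hand or tool layer at an adjustable transparency level over the work area; it is a minimal example and not the authors' system, which operates on live video of real hands and objects.

```python
# Illustrative sketch only: compositing a hand/tool layer over the work area
# with an adjustable transparency level, so occluded content stays visible.
import numpy as np

def composite(workspace, hand_layer, hand_mask, alpha):
    """Blend hand pixels over the workspace; alpha=1.0 is opaque, 0.0 invisible."""
    out = workspace.astype(np.float32).copy()
    m = hand_mask.astype(bool)
    out[m] = alpha * hand_layer[m].astype(np.float32) + (1.0 - alpha) * out[m]
    return out.astype(np.uint8)

workspace = np.full((120, 160, 3), 220, dtype=np.uint8)   # bright work area
hand = np.zeros((120, 160, 3), dtype=np.uint8)            # dark hand layer
mask = np.zeros((120, 160), dtype=bool)
mask[40:80, 60:100] = True                                # region covered by the hand
semi_transparent = composite(workspace, hand, mask, alpha=0.4)
```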
Interactive mediated reality BIBAFull-Text 21-29
  Raphael Grasset; Laurence Boissieux; Jean D. Gascuel; Dieter Schmalstieg
Mediated reality describes the concept of filtering our vision of reality, typically using a head-mounted video mixing display. We can redefine this idea in a more constructive context, applying dynamic changes to the appearance and geometry of objects in a real scene using computer graphics. In this paper, we propose new tools for interactively mediated reality. After describing a new generic framework for achieving this goal, we present a prototype system for painting, grabbing and glueing together real and virtual elements. We also conducted an informal evaluation that provides an initial analysis of the level of interest, usability and current technical limitations of this approach.
Beautifying sketching-based design tool content: issues and experiences BIBAFull-Text 31-38
  Beryl Plimmer; John Grundy
With the advent of the Tablet PC and stylus-based PDAs, sketching-based user interfaces for design tools have become popular. However, a major challenge with such interfaces is the need for appropriate "beautification" of the sketches. This includes both interactive beautification as content is sketched and post-design conversion of sketches to formalised, computer-drawn diagrams. We discuss a number of beautification issues and requirements for sketching-based design tools, illustrating these with examples from two quite different sketching-based applications. We describe ways of supporting beautification, the user interface design and implementation challenges involved, and results from preliminary evaluations of such interfaces.
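A minimal sketch of one common beautification step, straightening a roughly linear stroke, is given below; it is illustrative only and assumes a simple perpendicular-distance test rather than the recognition techniques used in the authors' applications.

```python
# Illustrative sketch only (not the authors' tool): replace a roughly straight
# hand-drawn stroke with a clean line segment when all of its samples stay
# close to the chord between its endpoints.
import numpy as np

def beautify_stroke(points, tolerance=5.0):
    """points: (N, 2) sketched x/y samples. Returns the endpoints of a
    straightened segment, or None if the stroke is not straight enough."""
    pts = np.asarray(points, dtype=float)
    start, end = pts[0], pts[-1]
    direction = end - start
    length = np.linalg.norm(direction)
    if length == 0:
        return None
    # Perpendicular distance of every sample from the start-end chord.
    normal = np.array([-direction[1], direction[0]]) / length
    deviation = np.abs((pts - start) @ normal)
    return (tuple(start), tuple(end)) if deviation.max() <= tolerance else None

wobbly = [(0, 0), (25, 2), (50, -3), (75, 1), (100, 0)]
print(beautify_stroke(wobbly))   # -> ((0.0, 0.0), (100.0, 0.0))
```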
Clicki: a framework for light-weight web-based visual applications BIBAFull-Text 39-45
  Donald Gordon; James Noble; Robert Biddle
Web application frameworks typically provide little support for graphical web applications such as diagram editors. This restricts applications developed using these frameworks either to very limited interactions or to technologies that are only supported by a small subset of web browsers. Clicki is an object-oriented web application framework in the style of traditional graphical user interface frameworks, with minimal requirements on the client web browser. The Clicki framework enables the creation of nontraditional diagram-based web applications within the confines of the architectural constraints imposed by the web.
The semiotics of user interface redesign BIBAFull-Text 47-53
  Jennifer Ferreira; Pippin Barr; James Noble
User interface design is still more of an art than a science. Interface design and redesign are mostly based on empirical studies or prototypes, but there is still surprisingly little theoretical or engineering understanding of how to go about the design process and produce good designs the first time around. We present a semiotic analysis that explains features of some user interface redesigns taken from the literature, and propose that this analysis can help designers explain the changes they make, potentially helping them produce user interfaces that require less redesign.
Evaluation of two textual programming notations for children BIBAFull-Text 55-62
  Tim Wright; Andy Cockburn
Researchers have developed many programming environments for children. Typically each of these environments contains its own programming notation, ranging from computer code to animated virtual 3D robots; in some cases the notation consists of physical objects. While some of these notations were created by examining how children naturally describe computer programs, little research has examined how children understand programs written using these notations. Even less research has examined how children understand programs written using multiple notations. This paper describes an evaluation that compares how children understand computer programs written using different programming notations: conventional code, English, or a combination of the two. The children were about eleven years old, and we measured their speed in answering questions about computer programs and the accuracy of their answers. We found that children reading computer programs written in a conventional-style notation were more efficient (faster with no reliable difference in accuracy) than children reading programs written in English. Children with access to a combination of both notations performed between the two other conditions.
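Purely for illustration (the study's actual materials are not reproduced here), the snippet below contrasts a small program written in a conventional-code notation with an English-like rendering of the same behaviour, the kind of pairing the evaluation compares.

```python
# Illustrative only: a hypothetical pairing of the two notation styles the
# evaluation compares, not the study's materials.
conventional = """
for i in range(3):
    move_forward(2)
    turn_left(90)
"""

english_like = """
Repeat the following 3 times:
    move the robot forward 2 squares,
    then turn it left by 90 degrees.
"""

print(conventional)
print(english_like)
```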
Generating web-based user interfaces for diagramming tools BIBAFull-Text 63-72
  Shuping Cao; John Grundy; John Hosking; Hermann Stoeckle; Ewan Tempero; Nianping Zhu
Thin-client diagramming tools provide a number of advantages over traditional thick-client design tools but are challenging to build. We describe an extension to a thick-client meta-tool that allows any specified diagram editor to be realised as a thin-client tool. A set of server-side components interact with the thick-client tool to generate GIF or SVG diagrams for both display and editing in a conventional web browser. We describe the motivation for our work and our novel architecture, illustrate and discuss interaction issues with the generated diagrams, and describe evaluations of the effectiveness of our approach.
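As an illustration of the server-side generation idea, the sketch below emits an SVG diagram in which each shape links back to a hypothetical edit URL so a plain browser can both display and edit it; the URLs and markup shown are assumptions, not output of the system described in the paper.

```python
# Illustrative sketch only (not the meta-tool described in the paper): emit an
# SVG rendering of a simple diagram in which each shape is a hyperlink to a
# server-side edit URL, so a plain web browser can display and edit it.
def node_svg(node_id, x, y, label, edit_base="/edit?node="):  # edit_base is hypothetical
    return (f'<a xlink:href="{edit_base}{node_id}">'
            f'<rect x="{x}" y="{y}" width="120" height="40" fill="white" stroke="black"/>'
            f'<text x="{x + 10}" y="{y + 25}">{label}</text></a>')

def diagram_svg(nodes):
    body = "".join(node_svg(*n) for n in nodes)
    return ('<svg xmlns="http://www.w3.org/2000/svg" '
            'xmlns:xlink="http://www.w3.org/1999/xlink" width="400" height="300">'
            f'{body}</svg>')

print(diagram_svg([("n1", 20, 20, "Customer"), ("n2", 200, 120, "Order")]))
```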
Program comprehension: investigating the effects of naming style and documentation BIBAFull-Text 73-78
  Scott Blinman; Andy Cockburn
In both commercial and academic environments, software development frameworks are an important tool in the construction of industrial strength software solutions. Despite the role they play in present day software development, little research has gone into understanding which aspects of their design influence the way software developers use frameworks at the source code level. This paper investigates how the comprehensibility of an application's source code is affected by two factors: the naming styles for framework interfaces, and the availability of interface documentation. Results show that using a descriptive interface naming style is an effective way to aid a developer's comprehension. Documentation also plays an important role, but it increases the amount of time a developer will spend studying the source code.
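For illustration only, with hypothetical names rather than the study's framework, the snippet below contrasts a terse interface naming style with a descriptive one of the sort the results favour.

```python
# Illustrative only (hypothetical names, not the study's framework): the same
# framework hook with a terse, non-descriptive interface name versus a
# descriptive one that conveys its role to the reader.
from abc import ABC, abstractmethod

class IHdlr(ABC):                      # non-descriptive naming style
    @abstractmethod
    def proc(self, e): ...

class IncomingRequestHandler(ABC):     # descriptive naming style
    @abstractmethod
    def process_request(self, request): ...
```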
Outdoor augmented reality gaming on five dollars a day BIBAFull-Text 79-88
  Benjamin Avery; Bruce H. Thomas; Joe Velikovsky; Wayne Piekarski
The latest hardware available for creating playable augmented reality games is too expensive to be used in consumer-level products at the current time. Low-end hardware exists that is much cheaper, but the performance of these hardware components seems to be inadequate for use in AR games in their current state. This paper discusses the hardware options available for a consumer-level AR gaming system, as well as the rationale behind selecting appropriate sensor technology, head-mounted displays, power sources, and controllers. We also outline the limitations presented by these hardware components and how they affect the possible game play. This paper presents a novel set of game design techniques and software solutions that overcome many of the hardware's limitations, allowing games to be created that do not require more expensive high-end hardware platforms.
Lightweight user interfaces for watch based displays BIBAFull-Text 89-98
  Peter Hutterer; Mark T. Smith; Bruce H. Thomas; Wayne Piekarski; John Ankcorn
Ubiquitous mobile computing devices offer the opportunity to provide easy access to a rich set of information sources. Placing the display for this computing device on the user's wrist allows for quick, easy, and pervasive access to this information. In this paper we describe a user interface model and a set of five applications we have developed, with the aim of providing a user interface that supports lightweight interactions. Our goal is to make our pervasive watch as simple to use as a common wrist-watch worn today.
Real-time 3D finger pointing for an augmented desk BIBAFull-Text 99-108
  Le Song; Masahiro Takatsuka
The augmented desk is gaining popularity in recent HCI research. Its layout of a large horizontal screen on the desk enhances immersive and intense collaborative experiences. A responsive and unimpeded input interface is important for efficient interaction in such an environment. In this paper, we present a real-time stereo vision-based finger pointing interface for our augmented desk that supports drag-and-drop operation by the tap-and-move of the index finger. The core of our system is a 3D fingertip tracking system, which requires both careful calibration and an efficient fingertip localization algorithm. To meet these requirements, we have designed a two-step calibration method that strikes a good balance between accuracy and convenience. Furthermore, based on the chain code representation of the contour, we propose the direction cancellation vector as a tool for fingertip localization. Our algorithm works efficiently, with a time complexity of O(n) in terms of the length of the chain code. Currently our system allows a user to select and move the displayed contents on the screen directly using his fingertip, and it is applied to the interactive graph drawing paradigm proposed by do Nascimento and Eades (2001). In this application, our real-time pointing interface enables the user to interact with a graph drawing program dynamically, which results in optimal layouts of graphs with maximum symmetry. In the last section, the strengths and weaknesses of our system are discussed, and further suggestions to improve the system are also given.
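The sketch below shows a generic O(n) pass over a Freeman chain code that flags the point of sharpest direction change as a fingertip candidate; it is an illustrative stand-in, not the paper's direction cancellation vector or calibration method.

```python
# Illustrative sketch only: not the paper's "direction cancellation vector",
# just a generic O(n) scan over a Freeman chain code that flags the contour
# point with the sharpest local direction change as a fingertip candidate.
def fingertip_candidate(chain_code, window=5):
    """chain_code: list of Freeman directions (0-7) along the hand contour.
    Returns the index of the point with the largest accumulated turn."""
    n = len(chain_code)
    best_index, best_turn = 0, -1
    for i in range(n):
        turn = 0
        for k in range(window):
            a = chain_code[(i + k) % n]
            b = chain_code[(i + k + 1) % n]
            diff = (b - a) % 8
            turn += min(diff, 8 - diff)   # smallest angular step between codes
        if turn > best_turn:
            best_index, best_turn = i, turn
    return best_index

# A toy chain code: long straight runs with one region of rapid direction change.
contour = [0] * 10 + [6, 6, 2, 2] + [4] * 10 + [6] * 4
print(fingertip_candidate(contour))
```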
A taxonomic analysis of user-interface metaphors in the Microsoft Office Project Gallery BIBAFull-Text 109-117
  Pippin Barr; Rilla Khaled; James Noble; Robert Biddle
User-interface metaphors are not well understood in terms of their relationships and qualities. They are, however, constantly used, both consciously and unconsciously, in most user-interface designs. This paper demonstrates a taxonomic classification and analysis of the user-interface metaphors in the Microsoft Office Project Gallery. The classification offers insight into the nature of the metaphors both in the user interface investigated and more generally, and helps validate the use of the taxonomy as an assessment tool.
Aiding text entry of foreign alphabets with visual keyboard plus BIBAFull-Text 119-125
  Lara Rennie; Andy Cockburn
Computer keyboards are used to input hundreds of different languages using many different alphabets. Despite this diversity, the physical layout of keyboards is fairly uniform, with keyboards generally containing approximately 80 keys spread across six rows (excluding cursor keys and the number pad). In English speaking countries, the QWERTY layout is the de facto standard binding between the physical location of keys and the corresponding letters of the alphabet. To aid international and multi-lingual computer use, operating systems allow users to alter bindings between physical keys and resultant characters, but this raises a problem for users as the labels on the physical keys will not match those of the bindings. Software user interfaces such as Microsoft's Visual Keyboard (MVK) help users by providing a visual depiction of the keyboard's new bindings, but users still suffer an overhead in establishing the mapping between the physical and displayed keys. This paper describes a comparative analysis and empirical evaluation of three alternative techniques for helping users input non-standard alphabets using a standard keyboard. In particular we investigate whether our VKPLUS (Visual Keyboard Plus) user interface, which displays both the physical key labels and the new keybindings, improves text entry rates over Microsoft's Visual Keyboard. The third technique, included for baseline comparison, uses sticky labels placed over the physical keyboard. Results show that VKPLUS improves performance over Microsoft's system.
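As a rough illustration of the VKPLUS idea of showing both pieces of information at once, the sketch below renders one keyboard row with each physical key label alongside the character it produces under a Greek layout; the mapping and rendering are assumptions, not code from MVK or VKPLUS.

```python
# Illustrative sketch only (hypothetical rendering, not MVK or VKPLUS code):
# display, VKPLUS-style, each physical key label together with the character
# it now produces under a Greek keyboard layout.
physical_row = ["Q", "W", "E", "R", "T", "Y"]
greek_binding = {"Q": ";", "W": "ς", "E": "ε", "R": "ρ", "T": "τ", "Y": "υ"}

def render_row(keys, binding):
    """Return one bracketed cell per key: physical label -> bound character."""
    return " ".join(f"[{key} -> {binding.get(key, key)}]" for key in keys)

print(render_row(physical_row, greek_binding))
```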
Newly-discovered group awareness mechanisms for supporting real-time collaborative authoring BIBAFull-Text 127-136
  Gitesh K. Raikundalia; Hao Lan Zhang
Group awareness has become important in improving the usability of real-time, distributed, collaborative writing systems. However, the current set of implemented awareness mechanisms is insufficient for providing extensive and comprehensive awareness in collaborative authoring. Certainly, current mechanisms, such as telepointers and multi-user scrollbars, have contributed greatly to providing awareness support in collaborative writing. Yet, given the shortcomings of these mechanisms and the difficulty of providing the rich interaction found in face-to-face collaboration, much more support needs to be provided for group awareness during authoring. This research extends the pool of known awareness mechanisms (including those that have been discovered before but have yet to be implemented): through usability experiments with a real-time cooperative editor, it identified several awareness mechanisms not found and reported elsewhere. This paper covers three of the mechanisms discovered in the experiments: Task Allocation Tree, User Action List and User-based History Tracking. The paper also provides quantitative results supporting implementation of such mechanisms.