| On beyond GUI | | BIB | 3 | |
| S. Feiner | |||
| Roller interface for mobile device applications | | BIBA | Full-Text | PDF | 7-13 | |
| Lijian Wang; A. S. M. Sajeev | |||
| Mobile devices generally have small screens. To display information that is well suited to larger screens (e.g. desktop computers), the information has to be segmented into many small presentation units that fit the small screen of a mobile device. This makes it difficult to organize information effectively and to help users navigate to and from the information they want. This paper presents Roller, a new interface technique for presenting application interfaces on mobile devices. Roller helps alleviate the screen real-estate limitations in user-interface design and provides rich contextual information to ease users' navigation tasks. Our preliminary trial study shows the effectiveness of using Roller to organize and present information on small screens. | |||
| System usability evaluation for input operation using oculo-motors | | BIBA | Full-Text | PDF | 15-22 | |
| Minoru Nakayama; Makoto Katsukura | |||
| This paper investigates the relationship between oculomotors, which consist of eye movement and pupillary change, and the traditional subjective index of "usability", to determine the possibility of evaluating Human-Computer Interaction (HCI). An evaluation experiment was conducted by operating a target on a computer display using three input devices: mouse, keyboard and key pad. The results show a significant correlation between pupil size and the SU-score, an established subjective evaluation index for system usability. These results provide evidence that pupil size can be used as an index of a system's usability, and also that the SU-score can be estimated from pupil size. The eye-movement indices, which consist of saccade frequency, saccade length and saccade time, indicate characteristics of the input operation behavior. Together, these results suggest that pupil size and the eye-movement indices, as oculo-motor indices, can provide information about a system's overall usability for the input operation task. Additionally, these indices remain stable even over short observation periods, which suggests that it is possible to observe temporal changes in system usability. The results provide evidence that oculo-motors can serve as an index of system usability. | |||
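As an aside, the kind of analysis this abstract describes, relating an oculo-motor index to the SU-score, can be sketched as a simple correlation and linear fit. The data below are hypothetical, purely for illustration, and the variable names (`pupil_size`, `su_score`) are ours, not the authors':

```python
import numpy as np

# Hypothetical per-participant measurements (not the paper's data):
# mean pupil diameter (mm) during the input task, and the SU-score
# from the subjective usability questionnaire.
pupil_size = np.array([3.1, 3.4, 3.0, 3.8, 3.6, 3.3, 3.9, 3.2])
su_score   = np.array([78,  64,  81,  52,  58,  70,  49,  74])

# Pearson correlation coefficient between the two indices.
r = np.corrcoef(pupil_size, su_score)[0, 1]
print(f"r = {r:.2f}")

# A simple linear fit, illustrating how an SU-score might be estimated
# from pupil size once such a relationship is established.
slope, intercept = np.polyfit(pupil_size, su_score, 1)
print(f"estimated SU-score at 3.5 mm: {slope * 3.5 + intercept:.1f}")
```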
| Game/music interaction: an aural interface for immersive interactive environments | | BIBA | Full-Text | PDF | 23-26 | |
| Chris Nelson; Burkhard C. Wünsche | |||
| Game music has the potential to be much more than a passive element of the background. The music can and should affect the game play. The game play can and should affect the music. The player's actions can and should influence the direction and evolution of the music. By tightly linking game play and music, the player becomes much more immersed in the experience, and new creative possibilities abound for the developer. This paper presents a framework for linking game play to music by analysing music on a perceptual and a physical level. Real-time processing is achieved by using the GPU as an APU. The usefulness of the framework is demonstrated by two examples of game play synchronised with auditory perceptions. We hope this paper will enable and stimulate the reader to create musically interactive games and to discover entirely new ideas in this field. | |||
| Modeling reach for use in user interface design | | BIBA | Full-Text | PDF | 27-30 | |
| Aaron P. Toney; Bruce H. Thomas | |||
| The area of a horizontal working plane usable for direct manipulation or direct touch user interfaces is constrained by the space reachable by the user. This paper shows that existing models of reach in the literature are suitable for use in user interface design. While existing data was gathered for stationary individuals, this paper examines the impact of freedom of motion on the maximum reported comfortable reach envelope (i.e. the surface of maximum reach). A user study was conducted to gauge the impact of user motion on available working space at several table heights. Throughout, the paper discusses several ways in which these reach models can be immediately applied to user interface design. | |||
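The abstract's premise, that the usable area of a direct-touch surface is bounded by the user's reach envelope, can be illustrated with a deliberately simplified geometric model: treat the reach envelope as a sphere centred at the shoulder and intersect it with the table plane. This is our own sketch, not the anthropometric models the paper evaluates, and all parameter values are hypothetical:

```python
import math

def reach_radius_on_table(arm_length_m, shoulder_height_m, table_height_m):
    """Radius of the reachable circle on a horizontal table, assuming the
    reach envelope is a sphere of radius arm_length_m centred at the
    shoulder (a simplification; real envelopes are not spherical)."""
    dz = shoulder_height_m - table_height_m
    if abs(dz) >= arm_length_m:
        return 0.0  # the table plane lies entirely outside the envelope
    return math.sqrt(arm_length_m ** 2 - dz ** 2)

# Example: 0.70 m functional arm length, shoulder at 1.35 m, table at 0.90 m.
print(round(reach_radius_on_table(0.70, 1.35, 0.90), 2))  # ~0.54 m
```

In this toy model, raising the table toward shoulder height enlarges the reachable circle, which shows one way a reach model can feed directly into interface layout decisions.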
| CONFER: towards groupware for building consensus in collaborative software engineering | | BIBA | Full-Text | PDF | 31-38 | |
| Felix Wong; George Fernandez; Jim McGovern | |||
| Distributed computing technology allows software engineering teams to work across different locations and times, collaboratively refining documents or diagrams to ultimately produce a single agreed outcome. A natural part of this process is the emergence of differences or conflicts reflecting divergent team-member perspectives, interpretations, skills or knowledge. This paper describes CONFER (CONflict Free Editing in a Replicated architecture), a system that addresses the handling of conflicts in collaborative software development projects. At the technical level, CONFER detects and stores conflicts until they are resolved by user actions. However, it is in the social domain that these user actions are formed and resolved, so effective collaboration tools will need to support conflict resolution in the social domain as well. Some software engineering teams are based on models of cooperation, and maintaining team harmony may require that conflict resolution be based on discussion and consensus rather than on authority or simple voting systems. A consensus-building approach is proposed, based on a method used in travel planning (Chu-Carroll, 2000), that encourages participants to further explore alternatives alongside their own proposals. The social support mechanism has been simulated using paper and pen and assessed in a small experiment. The evaluation suggests that the technique is easy to use, reduces conflict resolution time and may be a useful extension to CONFER. | |||
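The "detect and store until resolved by user actions" behaviour described above can be illustrated with a minimal sketch. This is not CONFER's code; the class and method names are hypothetical, and the conflict test (concurrent edits to the same element by different users) is deliberately naive:

```python
from dataclasses import dataclass, field

@dataclass
class Edit:
    user: str
    element_id: str   # identifier of the diagram element being changed
    new_value: str

@dataclass
class ConflictStore:
    """Detect-and-store conflict handling: concurrent edits to the same
    element by different users are kept until a person resolves them."""
    pending: dict = field(default_factory=dict)   # element_id -> list[Edit]

    def submit(self, edit: Edit) -> str:
        edits = self.pending.setdefault(edit.element_id, [])
        if edits and any(e.user != edit.user for e in edits):
            edits.append(edit)        # conflict: store it, do not overwrite
            return "conflict-stored"
        edits[:] = [edit]             # no conflict: the latest edit stands
        return "applied"

    def resolve(self, element_id: str, chosen: Edit) -> Edit:
        self.pending[element_id] = [chosen]   # user-driven resolution
        return chosen
```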
| Groupware support in the windowing system | | BIBA | Full-Text | PDF | 39-46 | |
| Peter Hutterer; Bruce H. Thomas | |||
| In this paper, we discuss the advantages of integrating groupware support for Single Display Groupware (SDG) into the windowing system. For the domain of SDG, a Groupware Windowing System (GWWS) has several advantages over traditional SDG toolkits and applications: it provides SDG support for legacy applications and custom-built SDG applications, and it supports the execution of multiple applications simultaneously. A GWWS reconciles the traditional single-user, single-input axiom with novel multi-user, multi-input desktop environments. We present the Multi-Pointer X Server (MPX), the first GWWS that supports SDG natively, together with our Multi-Pointer Window Manager (MPWM). MPX and MPWM support an arbitrary number of true system cursors, sophisticated floor control and per-window annotation overlay. To ease interaction with such a GWWS, we implemented the DeviceShuffler, a system to couple input devices from any computer. The physical connection point of a device is transparent to both the windowing system and the application, which supports true ad-hoc collaboration on shared screens. | |||
| Locating a projector using the strength of beams reflected on a screen | | BIBA | Full-Text | PDF | 47-50 | |
| Yukio Ishihara; Makio Ishihara | |||
| In this paper, we propose a real-time calibration technique for a projector-camera system, which enables the system to locate the projector even while it is moving. Usually, fiducial points are attached to the screen and the camera tracks both the fiducial points and the quadrilateral illuminated by the projector; the projector is then located from this tracking. Instead of fiducial points, our real-time calibration technique uses the strength of the beams reflected on the screen. Consequently, the projection area is not limited in size or position by fiducial points, and it can extend over a wall as far as the reflected beams are observed. This advantage allows a user to point the projector at any position on the screen to view the position-dependent image. We exploit this advantage to build a map viewer by connecting our projector-camera system to Google Earth, which enables a user to browse the map by moving the projector. | |||
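For contrast with the reflected-beam approach, the conventional fiducial/quadrilateral calibration the authors mention can be sketched as a homography estimate between projector and camera coordinates. The corner coordinates below are hypothetical, and decomposing the homography into an actual pose would additionally require the camera intrinsics:

```python
import numpy as np
import cv2  # OpenCV

# Corners of the projected image in projector pixel coordinates
# (hypothetical 1024x768 projector).
projector_corners = np.float32([[0, 0], [1024, 0], [1024, 768], [0, 768]])

# The same four corners of the illuminated quadrilateral as detected
# by the camera (hypothetical values).
camera_corners = np.float32([[212, 145], [801, 160], [790, 598], [205, 580]])

# Homography mapping projector pixels to camera pixels; tracking how it
# changes over time is what lets a system follow a moving projector.
H, _ = cv2.findHomography(projector_corners, camera_corners)
print(H)
```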
| Lessons learned from facilitation in collaborative design | | BIBA | Full-Text | PDF | 51-54 | |
| Jonas Lundberg; Mattias Arvola | |||
| The importance of a skilled facilitator in design meetings with users is often emphasized, but less is said about how to improve the facilitation process. This paper reports experiences and lessons learned from facilitating card-based sessions in three design cases, through an analysis of two sessions with users and one session with professional designers. The analysis showed that many alternatives were not documented in the sessions with users, who designed primarily by talking, compared to the professional designers, who designed primarily by placing cards. We propose that facilitation, in cases similar to those presented here, could be improved by suggesting alternatives and possible consequences, by prompting the participants to explore those consequences, and by graphic facilitation. | |||
| SWIM: an alternative interface for MSN messenger | | BIBA | Full-Text | PDF | 55-62 | |
| Minh Hong Tran; Yun Yang; Gitesh K. Raikundalia | |||
| The authors' research investigates an alternative interface for Instant Messaging (IM). This paper presents SWIM (SWinburne Instant Messaging), an IM tool built on MSN Messenger. SWIM presents an innovative interface design that combines the conventional sequential interface with an adaptive threaded interface. In addition, SWIM supports persistent conversation, which facilitates users' participation in group conversations. The integrated interface and support for persistent conversation allow SWIM to be used both as a convenient tool for social conversation and as an effective tool for task-oriented group discussion. In this paper, we discuss the design approach of SWIM, describe details of our implementation technique, and report a preliminary evaluation of SWIM. The evaluation shows that SWIM holds great promise in supporting group conversation. | |||
| Evaluating Swiftpoint as a mobile device for direct manipulation input | | BIBA | Full-Text | PDF | 63-70 | |
| Taher Amer; Andy Cockburn; Richard Green; Grant Odgers | |||
| This paper presents a promising new computer pointing device, called Swiftpoint, that is designed primarily for mobile computer (for example, laptop) users working in constrained spaces. Swiftpoint has many advantages over current pointing devices: it is small, ergonomic, has a digital-ink mode, and can be used over a flat keyboard. We present the results of a formal evaluation comparing Swiftpoint to two of the most common pointing devices on today's mobile computers: the mouse and the touchpad. Two laws commonly used in evaluating pointing devices, Fitts' Law and the Steering Law, were used to evaluate Swiftpoint. Results showed that Swiftpoint was faster and more accurate than the touchpad. The performance of the mouse was, however, superior to both the touchpad and Swiftpoint. | |||
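For reference, the two models named in the abstract are commonly written as follows (the Shannon formulation of Fitts' law and the straight-tunnel form of the steering law); the constants a and b are fitted empirically per device:

```latex
% Fitts' law: movement time MT to acquire a target of width W at distance D.
MT = a + b \log_2\!\left(\frac{D}{W} + 1\right)

% Steering law: movement time MT through a straight tunnel of length A and width W.
MT = a + b\,\frac{A}{W}
```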
| Connector semantics for sketched diagram recognition | | BIBA | Full-Text | PDF | 71-78 | |
| Isaac J. Freeman; Beryl Plimmer | |||
| Comprehensive interpretation of hand-drawn diagrams is a long-standing challenge. Connectors (arrows, edges and lines) are important components of many types of diagram. In this paper we discuss techniques for syntactic and semantic recognition of connectors. Undirected graphs, digraphs and organization charts are presented as exemplars of three broad classes that encompass many types of connected diagram. Generic techniques have been incorporated into the recognition engine of InkKit, an extensible sketch toolkit, thus reducing the development costs for sketch tools. | |||
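One step of the connector semantics discussed above, recovering graph edges by attaching a recognised connector's endpoints to nearby shapes, can be sketched generically. This is an illustration of the idea only, not InkKit's recognition engine; the data layout and threshold-free nearest-shape rule are our own simplifications:

```python
import math

def nearest_shape(point, shapes):
    """Return the name of the shape whose centre is closest to a
    connector endpoint. `shapes` is a list of (name, (x, y)) tuples."""
    return min(shapes, key=lambda s: math.dist(point, s[1]))[0]

def connectors_to_edges(connectors, shapes):
    """Map each connector, given by its two endpoints, to a graph edge."""
    return [(nearest_shape(p0, shapes), nearest_shape(p1, shapes))
            for p0, p1 in connectors]

shapes = [("A", (0, 0)), ("B", (10, 0)), ("C", (5, 8))]
connectors = [((1, 0), (9, 1)), ((4, 7), (1, 1))]
print(connectors_to_edges(connectors, shapes))  # [('A', 'B'), ('C', 'A')]
```

A directed variant (for digraphs) would additionally use a recognised arrowhead to orient each edge, and an organisation-chart variant would use relative vertical position.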
| The gestalt principles of similarity and proximity apply to both the haptic and visual grouping of elements | | BIBA | Full-Text | PDF | 79-86 | |
| Dempsey Chang; Keith V. Nesbitt; Kevin Wilkins | |||
| When designing multi-sensory displays it is necessary to consider human perceptual capabilities and understand how people find patterns and how they organise individual elements into structures and groups. Gestalt theory, originally described in 1910, attempts to explain the way people perceive and recognise patterns. The early studies of Gestalt principles of grouping were predominantly concerned with visual perception, although more recently they have been investigated for auditory perception. This paper focuses on how individuals use the sense of touch (haptics) to group display elements using the Gestalt principles of similarity and proximity. A direct comparison is made with the visual grouping of elements using the same two principles of similarity and proximity. The hypothesis of the experiment described in this paper is that people will use touch to group display elements in the same way they group elements visually. Overall we found that a significant number of subjects used texture or colour to group the elements when there was an equal spacing between the elements. This supports our hypothesis that the principle of similarity is equally applicable for both visual (colour) and haptic (texture) grouping. Similarly, when subjects perceived an unequal spacing between the elements they used spatial position to determine groupings. These results support our hypothesis that the principle of proximity is also applicable for both visual and haptic grouping. | |||
| A visual language and environment for specifying user interface event handling in design tools | | BIBA | Full-Text | PDF | 87-94 | |
| Na Liu; John Hosking; John Grundy | |||
| End users often need the ability to tailor diagramming-based design tools and to specify the dynamic interactive behaviours of graphical user interfaces. However, most want to avoid using textual scripting languages or programming-language approaches directly. We describe a new visual language for specifying user-interface event handling, targeted at end users. Our visual language provides end users with abstract ways to express both simple and complex event-handling mechanisms via visual specifications. These specifications incorporate event filtering, tool-state querying and action invocation. We describe our language, its incorporation into a meta-tool for building visual design environments, examples of its use, and results of evaluations of its effectiveness. | |||
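The three ingredients the abstract lists (event filtering, tool-state querying and action invocation) can be illustrated with a small, generic rule object. This sketch is ours, not the authors' visual language or its meta-tool; all names are hypothetical:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class EventRule:
    """One event-handling rule: filter the event, query tool state,
    then invoke an action if both checks pass."""
    event_filter: Callable[[dict], bool]
    state_query: Callable[[dict], bool]
    action: Callable[[dict, dict], None]

    def handle(self, event: dict, tool_state: dict) -> bool:
        if self.event_filter(event) and self.state_query(tool_state):
            self.action(event, tool_state)
            return True
        return False

# Example rule: on double-click while the 'connector' tool is active,
# remember where a new connector should start.
rule = EventRule(
    event_filter=lambda e: e.get("type") == "double-click",
    state_query=lambda s: s.get("active_tool") == "connector",
    action=lambda e, s: s.update(pending_connector=e["position"]),
)
state = {"active_tool": "connector"}
rule.handle({"type": "double-click", "position": (40, 25)}, state)
print(state)  # {'active_tool': 'connector', 'pending_connector': (40, 25)}
```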
| Trends in sitemap designs: a taxonomy and survey | | BIBA | Full-Text | PDF | 95-102 | |
| C. J. Pilgrim | |||
| One of the challenges confronting website designers is to provide effective navigational support. Supplemental navigation tools such as search, indexes and sitemaps are frequently included on websites. However, due to a lack of guidance for designers, a proliferation of designs has evolved over time. Trends in design do not appear to be underpinned by any empirically sound, theoretically based, user-focused research; instead, design has been led by technological innovation rather than user needs. This paper investigates the key factors in the design of sitemaps. A taxonomy that segregates design issues into major components is proposed. The paper applies the taxonomy in a longitudinal survey of commercial sitemaps, exposing several trends in design practice. The intention of this taxonomy and survey is to provide a sounder basis for future research and development of sitemap tools by clarifying existing research and identifying important issues for future investigation. | |||