
Sixth Annual ACM SIGACCESS Conference on Assistive Technologies

Fullname: Sixth International ACM SIGACCESS Conference on Assistive Technologies
Editors: Julie A. Jacko; Andrew Sears
Location: Atlanta, GA, USA
Dates: 2004-Oct-18 to 2004-Oct-20
Publisher: ACM
Standard No: ISBN 1-58113-911-X; ACM Order Number 444040; ACM DL: Table of Contents; hcibib: ASSETS04
Papers: 26
Pages: 192
  1. Audio interactions
  2. Evaluating accessibility
  3. Accessibility infrastructure and supporting tools
  4. Cursor control
  5. Designing for individuals with visual impairments
  6. Designing for accessibility
  7. Web accessibility
Beyond tagging: the organized blind, your best ally in a proactive paradigm BIBFull-Text 1
  Betsy A. Zaborowski

Audio interactions

Audio enriched links: web page previews for blind users BIBAFull-Text 2-8
  Peter Parente
Audio Enriched Links provide previews of linked web pages to users with visual impairments. Before a user follows a hyperlink, the Audio Enriched Links software presents a spoken summary of the next page including its title, its relation to the current page, statistics about its content, and some highlights from its content. We believe that such a summary may be a useful surrogate for a full web page, and help users with visual impairments decide whether or not to spend time visiting a linked page. In this paper, we present some motivation for the Audio Enriched Links project. We describe the design and implementation of the current software prototype, and discuss the results of an initial evaluation involving four participants. We conclude with some implications of this work and directions for future research.
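A minimal sketch of the kind of link preview described above, assuming the preview is assembled from the linked page's title, a word count, and the first few words of its text; the function names and summary format are illustrative only and are not taken from the Audio Enriched Links software.

# Illustrative sketch (not the authors' code): build a spoken-style preview
# of a linked page from its title, simple content statistics, and highlights.
from html.parser import HTMLParser
from urllib.request import urlopen

class _TextExtractor(HTMLParser):
    """Collect the page title and visible text from an HTML document."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        else:
            self.words.extend(data.split())

def link_preview(url, n_highlights=10):
    """Return a short spoken-style summary string for the page behind a link."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    parser = _TextExtractor()
    parser.feed(html)
    highlights = " ".join(parser.words[:n_highlights])
    return (f"Linked page: {parser.title.strip() or 'untitled'}. "
            f"About {len(parser.words)} words. Begins: {highlights} ...")

# Example (requires network access):
# print(link_preview("https://www.example.com"))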
The audio abacus: representing numerical values with nonspeech sound for the visually impaired BIBAFull-Text 9-15
  Bruce N. Walker; Jeffrey Lindsay; Justin Godfrey
Point estimation is a relatively unexplored facet of sonification. We present a new computer application, the Audio Abacus, designed to transform numbers into tones following the analogy of an abacus. As this is an entirely novel approach to sonifying exact data values, we have begun a systematic line of investigation into the application settings that work most effectively. Results are presented for an initial study. Users were able to perform relatively well with very little practice or training, boding well for this type of display. Further investigations are planned. This could prove to be very useful for visually impaired individuals given the common nature of numerical data in everyday settings.
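A rough sketch of one way numbers could be mapped to tones in the spirit of the abacus analogy; the pitch and timing scheme below is an assumption chosen for illustration, not the Audio Abacus's actual mapping.

# Illustrative sketch (assumed mapping, not the Audio Abacus implementation):
# each digit of a number becomes a tone; pitch encodes the digit value and
# the "column" (place value) is encoded by when the tone is played.
def digits_to_tones(value, base_hz=220.0, step_hz=55.0, column_gap_ms=400):
    """Map each digit of a non-negative integer to (onset_ms, frequency_hz)."""
    tones = []
    for column, digit in enumerate(str(value)):
        onset = column * column_gap_ms              # leftmost digit plays first
        freq = base_hz + int(digit) * step_hz       # higher digit -> higher pitch
        tones.append((onset, freq))
    return tones

print(digits_to_tones(2004))
# [(0, 330.0), (400, 220.0), (800, 220.0), (1200, 440.0)]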
Rendering tables in audio: the interaction of structure and reading styles BIBAFull-Text 16-23
  Yeliz Yesilada; Robert Stevens; Carole Goble; Shazad Hussein
Tables remain a persistent problem for visually impaired people using screen readers. Tables are complex structures that are widely used for different purposes such as spatial layout or data summarisation. The multi-dimensional nature of tables challenges the linear interaction styles typically supported by screen readers. To read a table, a user needs to maintain coherency of, and interact with, more than one dimension. In this paper, we first characterise why tables are useful in print, but difficult to read in audio. We present a survey of the relationship between table structure, intention and the reading styles employed to use the content of tables. We then present two different approaches for interacting with tables non-visually. These approaches are designed to support the characteristics of tables that make them such a popular and useful means of conveying information. The first approach provides a small table browser called EVITA (Enabling Visually Impaired Table Access), whose aim is to enable non-visual table browsing and reading in a manner analogous to the print medium. The second approach provides a table lineariser that transforms tables into a form that can be easily read by screen readers.
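As a hypothetical illustration of the linearisation idea mentioned above (not the authors' EVITA browser or lineariser), each data cell can be paired with its row and column headers so that a screen reader can speak the table in a single linear pass.

# Hypothetical linearisation sketch: pair every data cell with its column and
# row headers so the table can be read linearly without losing context.
def linearise(table):
    """table: list of rows; row 0 holds column headers, column 0 row headers."""
    col_headers = table[0][1:]
    lines = []
    for row in table[1:]:
        row_header, cells = row[0], row[1:]
        for col_header, cell in zip(col_headers, cells):
            lines.append(f"{row_header}, {col_header}: {cell}")
    return lines

timetable = [
    ["",      "Monday", "Tuesday"],
    ["9:00",  "Maths",  "History"],
    ["10:00", "Art",    "Physics"],
]
for line in linearise(timetable):
    print(line)   # e.g. "9:00, Monday: Maths"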
Memory enhancement through audio BIBAFull-Text 24-31
  Jaime Sánchez; Héctor Flores
A number of studies have proposed interactive applications for blind people. One line of research is the use of interactive interfaces based on sound to enhance cognition in blind children. Even though these studies have emphasized learning and cognition, there is still a shortage of applications to assist the development and use of memory in these children. This study presents the design, development, and usability of AudioMemory, a virtual environment based on audio to develop and use short-term memory. AudioMemory was developed by and for blind children: they participated in the design and usability-tested the software during and after development. We also introduce AudioMath, an instance of AudioMemory to assist mathematics learning in children with visual disabilities. Our results show that sound can be a powerful interface for developing and enhancing memory and mathematics learning in blind children.

Evaluating accessibility

Accessibility of Internet websites through time BIBAFull-Text 32-39
  Stephanie Hackett; Bambang Parmanto; Xiaoming Zeng
Using the Internet Archive's Wayback Machine, a random sample of websites from 1997-2002 was retrospectively analyzed for the effects of technology on accessibility for persons with disabilities, and compared to government websites. Analysis of Variance (ANOVA) and Tukey's HSD were used to determine differences among years. Random websites became progressively less accessible through the years (p<0.0001) [as shown by increasing Web Accessibility Barrier (WAB) scores], while the complexity of the websites increased through the years (p<0.0001). Pearson's correlation (r) was performed to correlate accessibility and complexity: r=0.463 (p<0.01). Government websites remained accessible while increasing in complexity: r=0.14 (p<0.041). It is concluded that increasing complexity, oftentimes caused by adding new technology to a Web page, inadvertently contributes to increasing barriers to accessibility for persons with disabilities.
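For readers unfamiliar with the correlation step reported above, a small self-contained sketch with invented per-site numbers (not the study's data) shows how a Pearson's r between WAB scores and page complexity would be computed.

# Illustrative sketch of the correlation step (invented data, not the study's):
# Pearson's r between per-site WAB scores and page complexity.
from math import sqrt

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-site measurements: higher WAB = more accessibility barriers.
wab_scores = [2.1, 3.4, 3.9, 5.2, 6.0, 7.3]
complexity = [110, 180, 240, 310, 420, 500]
print(f"r = {pearson_r(wab_scores, complexity):.3f}")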
Evaluation of a non-visual molecule browser BIBAFull-Text 40-47
  Andy Brown; Steve Pettifer; Robert Stevens
This paper describes the evaluation of a software tool designed to allow visually impaired users to explore the structures of chemical molecules using a speech-based presentation. Molecular structures are typically presented as two-dimensional schematics, and are an important example of a widely used form of diagram -- the graph. The tool is designed for exploring this specific class of graph. Among its features is the ability to recognise and make explicit features of the graph that would otherwise need to be inferred. The evaluation compared the tool with a simpler version without this facility, and found that participants were able to explore molecular structures more easily. We discuss the tool, the evaluation and the results, particularly comparing them with theoretical considerations about how sighted readers use diagrams. Finally, we extract the important issues for non-visual graph presentation: making implicit features explicit; enabling hierarchical and connection-based browsing; allowing annotation; and helping users keep their orientation.
A galvanic skin response interface for people with severe motor disabilities BIBAFull-Text 48-54
  Melody M. Moore; Umang Dua
Biometric input devices can provide assistive technology access to people who have little or no motor control. We explore a biometric control interface based on the Galvanic Skin Response, to determine its effectiveness as a non-muscular channel of input. This paper presents data from several online studies of a locked-in subject using a Galvanic Skin Response system for communication and control. We present issues with GSR control, and approaches that may improve accuracy and information transfer rate.
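A minimal sketch of how a GSR signal might serve as a single-switch input, assuming a simple baseline-plus-threshold rule; the smoothing window and thresholds below are illustrative assumptions, not the authors' method.

# Illustrative sketch (assumed thresholding scheme, not the authors' system):
# treat a rise in the smoothed GSR signal above a resting baseline as a
# single-switch "selection" event for a scanning communication interface.
def detect_selections(gsr_samples, window=5, rise=0.8):
    """Return sample indices where the smoothed signal rises above baseline + rise."""
    events, armed = [], True
    baseline = sum(gsr_samples[:window]) / window        # resting level
    for i in range(window, len(gsr_samples)):
        smoothed = sum(gsr_samples[i - window + 1:i + 1]) / window
        if armed and smoothed > baseline + rise:
            events.append(i)                             # selection triggered
            armed = False                                # wait for signal to settle
        elif not armed and smoothed < baseline + rise / 2:
            armed = True
    return events

signal = [2.0] * 10 + [2.5, 3.2, 3.6, 3.4, 2.9, 2.3, 2.1] + [2.0] * 5
print(detect_selections(signal))   # [13]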

Accessibility infrastructure and supporting tools

UMA: a system for universal mathematics accessibility BIBAFull-Text 55-62
  A. I. Karshmer; G. Gupta; E. Pontelli; K. Miesenberger; N. Ammalai; D. Gopal; M. Batusic; B. Stoger; B. Palmer; H.-F. Guo
We describe the UMA system, developed under a multi-institution collaboration to make mathematics universally accessible. The UMA system includes translators that inter-convert between mathematical documents transcribed in the formats used by unsighted individuals (Nemeth, Marburg) and those used by sighted individuals (LaTeX, MathML, OpenMath). The UMA system also includes notation-independent tools for aural navigation of mathematics. In this paper, we give an overview of the UMA system and the techniques used to realize it.
Middleware to expand context and preview in hypertext BIBAFull-Text 63-70
  Simon Harper; Carole Goble; Robert Stevens; Yeliz Yesilada
Movement, or mobility, is key to the accessibility, design, and usability of many hypermedia resources (websites); and key to good mobility is context and preview by probing. This is especially the case for visually impaired users: when a hypertext anchor is inaccurately described or is described out of context, the result is confusion and disorientation. Mobility is similarly reduced when the link target of the anchor has no relationship to the information expected on the hypertext node (web-page). We suggest that confident movement with purpose, ease, and accuracy can only be achieved when complete contextual information and an accurate description of the proposed destination (preview) are available. Our past work on (1) deriving mobility heuristics from mobility models, (2) transforming web-pages based on these heuristics, and (3) building tools to analyse and access these transformed pages has shown us that a tool to expand context and preview would be useful. In this paper we describe the development of such a middleware tool to automatically and dynamically annotate web-pages with additional context information present within the page, and preview information present within hypertext link destinations found on the page.
Automating accessibility: the dynamic keyboard BIBAFull-Text 71-78
  Shari Trewin
People with motor disabilities may need to adjust the configuration of their input devices, but often find this an obscure and difficult process. The Dynamic Keyboard exemplifies a potential solution. It continuously adjusts fundamental keyboard accessibility features to suit the requirements of the current user, based on a keyboard use model. Early field results indicate that users have not chosen to take control of these accessibility features from the Dynamic Keyboard, and that a variety of settings are being used. A more detailed ongoing study suggests that automatic adjustment of the key repeat delay feature is acceptable to users, while the debounce feature may not be appropriate for dynamic adjustment.
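A small sketch of the sort of heuristic a keyboard-use model might apply when choosing a debounce time, assuming that same-key intervals below some threshold are accidental bounces; this rule is an assumption for illustration, not the Dynamic Keyboard's actual model.

# Assumed heuristic sketch (not the Dynamic Keyboard's model): pick a debounce
# time just above the longest observed accidental double-press interval.
def suggest_debounce_ms(same_key_intervals_ms, accidental_threshold_ms=150,
                        margin_ms=20):
    """Intervals below the threshold are treated as accidental key bounces."""
    bounces = [t for t in same_key_intervals_ms if t < accidental_threshold_ms]
    if not bounces:
        return 0                      # no evidence of bouncing: leave feature off
    return max(bounces) + margin_ms   # filter everything up to the worst bounce

observed = [45, 60, 900, 1200, 80, 1500]   # ms between repeated presses of a key
print(suggest_debounce_ms(observed))        # 100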
MEMOS: an interactive assistive system for prospective memory deficit compensation-architecture and functionality BIBAFull-Text 79-85
  Hendrik Schulze
The Mobile Extensible Memory Aid System (MEMOS) is an electronic memory aid developed to support patients with deficits in prospective memory after a brain injury. A special palmtop computer, the Personal Memory Assistant (PMA), reminds the patient of important tasks and supervises the patient's actions. The PMA communicates with a stationary care system via a bidirectional cellular radio connection (GPRS). MEMOS features structured interactive reminding impulses, flexible task scheduling, integration of a heterogeneous group of caregivers, and integration into the patient's everyday life. The bidirectional communication allows critical situations to be reported back to a responsible caregiver, so MEMOS can be used even in a critical context. This paper describes the requirements for a memory aid system, the design and functionality of MEMOS, as well as its application in the practical care of patients.

Cursor control

Eyedraw: a system for drawing pictures with eye movements BIBAFull-Text 86-93
  Anthony Hornof; Anna Cavender; Rob Hoselton
This paper describes the design and development of EyeDraw, a software program that will enable children with severe mobility impairments to use an eye tracker to draw pictures with their eyes so that they can have the same creative developmental experiences as nondisabled children. EyeDraw incorporates computer-control and software application advances that address the special needs of people with motor impairments, with emphasis on the needs of children. The contributions of the project include (a) a new technique for using the eyes to control the computer when accomplishing a spatial task, (b) the crafting of task-relevant functionality to support this new technique in its application to drawing pictures, and (c) a user-tested implementation of the idea within a working computer program. User testing with nondisabled users suggests that we have designed and built an eye-cursor and eye drawing control system that can be used by almost anyone with normal control of their eyes. The core technique will be generally useful for a range of computer control tasks such as selecting a group of icons on the desktop by drawing a box around them.
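A minimal sketch of dwell detection, one plausible ingredient of gaze-based control of this kind; the radius and duration parameters below are assumptions, not EyeDraw's settings.

# Illustrative dwell-detection sketch (parameters assumed, not EyeDraw's):
# a gaze point held within a small radius for long enough becomes a command,
# distinguishing deliberate "draw here" fixations from ordinary looking.
from math import hypot

def detect_dwells(gaze, radius=30.0, min_samples=20):
    """gaze: list of (x, y) samples; returns dwell centres (x, y)."""
    dwells, anchor, count = [], None, 0
    for point in gaze:
        if anchor is not None and hypot(point[0] - anchor[0],
                                        point[1] - anchor[1]) <= radius:
            count += 1
            if count == min_samples:
                dwells.append(anchor)          # fixation long enough to be a command
        else:
            anchor, count = point, 1
    return dwells

# 60 samples near (100, 100), then a quick saccade away.
samples = [(100 + i % 3, 100 - i % 2) for i in range(60)] + [(400, 300)]
print(detect_dwells(samples))    # [(100, 100)]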
Speech-based cursor control: a study of grid-based solutions BIBAFull-Text 94-101
  Liwei Dai; Rich Goldman; Andrew Sears; Jeremy Lozier
Speech recognition can be a powerful tool for use in human-computer interaction. Many researchers are investigating the use of speech recognition systems for dictation-based activities, resulting in dramatic improvements in recent years. However, this same experimentation has confirmed that recognition errors and the delays inherent with speech recognition result in unacceptably long task completion times and error rates for cursor control tasks. This study explores the potential of a speech-controlled grid-based cursor control mechanism. An experiment evaluated two alternative grid-based solutions, both using 3x3 grids. The first provided a single cursor in the middle of the grid; the second allowed users to select a target using any of nine cursors. The results confirm that the nine-cursor solution allowed users to select targets of varying size, distance and direction significantly faster than the one-cursor solution. Overall results are encouraging when compared to earlier evaluations of other speech-based cursor control solutions.
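A rough sketch of recursive 3x3 grid refinement, the general idea behind grid-based cursor control; the spoken-digit command set and the refinement rule here are assumptions for illustration, not the study's exact interface.

# Illustrative sketch (assumed command set, not the study's system): recursive
# 3x3 grid refinement, where each spoken digit 1-9 narrows the active region.
def refine(region, digit):
    """region = (x, y, w, h); digit 1-9 picks a cell of the 3x3 grid inside it."""
    x, y, w, h = region
    col, row = (digit - 1) % 3, (digit - 1) // 3
    return (x + col * w / 3, y + row * h / 3, w / 3, h / 3)

def cursor_position(screen, spoken_digits):
    """Return the centre of the region left after a sequence of digit commands."""
    region = screen
    for d in spoken_digits:
        region = refine(region, d)
    x, y, w, h = region
    return (x + w / 2, y + h / 2)

# Three spoken digits narrow a 1200x900 screen to a small target region.
print(cursor_position((0, 0, 1200, 900), [5, 9, 1]))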
Mouse movements of motion-impaired users: a submovement analysis BIBAFull-Text 102-109
  Faustina Hwang; Simeon Keates; Patrick Langdon; John Clarkson
Understanding human movement is key to improving input devices and interaction techniques. This paper presents a study of the mouse movements of motion-impaired users, with the aim of gaining a better understanding of impaired movement. The cursor trajectories of six motion-impaired users and three able-bodied users are studied according to their submovement structure. Several aspects of the movement are studied, including the frequency and duration of pauses between submovements, verification times, the number of submovements, the peak speed of submovements, and the accuracy of submovements in two dimensions. Results include findings that some motion-impaired users pause more often and for longer than able-bodied users, require up to five times more submovements to complete the same task, and exhibit a correlation between error and peak submovement speed that does not exist for able-bodied users.
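A minimal sketch of segmenting a sampled cursor speed profile into submovements at pauses, in the spirit of the analysis described above; the thresholds are illustrative, not those used in the study.

# Illustrative segmentation sketch (thresholds assumed, not the study's):
# split a sampled cursor speed profile into submovements at pauses.
def submovements(speeds, pause_threshold=0.05, min_samples=3):
    """speeds: cursor speed per sample; returns (start, end) index pairs."""
    segments, start = [], None
    for i, s in enumerate(speeds):
        if s > pause_threshold and start is None:
            start = i                                  # movement begins
        elif s <= pause_threshold and start is not None:
            if i - start >= min_samples:
                segments.append((start, i))            # movement ends at a pause
            start = None
    if start is not None and len(speeds) - start >= min_samples:
        segments.append((start, len(speeds)))
    return segments

profile = [0, 0.3, 0.8, 0.6, 0.02, 0.01, 0.4, 0.9, 0.5, 0.2, 0.0]
print(submovements(profile))   # [(1, 4), (6, 10)]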
Text entry from power wheelchairs: edgewrite for joysticks and touchpads BIBAFull-Text 110-117
  Jacob O. Wobbrock; Brad A. Myers; Htet Htet Aung; Edmund F. LoPresti
Power wheelchair joysticks have been used to control a mouse cursor on desktop computers, but they offer no integrated text entry solution, confining users to point-and-click or point-and-dwell with on-screen keyboards. But on-screen keyboards reduce useful screen real-estate, exacerbate the need for frequent window management, and impose a second focus of attention. By contrast, we present two integrated gestural text entry methods designed for use from power wheelchairs: one for joysticks and the other for touchpads. Both techniques are adaptations of EdgeWrite, originally a stylus-based unistroke method designed for people with tremor. In a preliminary study of 7 power wheelchair users, we found that touchpad EdgeWrite was faster than joystick WiVik, and joystick EdgeWrite was only slightly slower after minimal practice. These findings reflect "walk-up-and-use" ability and warrant further investigation into extended use.

Designing for individuals with visual impairments

Strategic design for users with diabetic retinopathy: factors influencing performance in a menu-selection task BIBAFull-Text 118-125
  Paula J. Edwards; Leon Barnard; V. Kathlene Emery; Ji Soo Yi; Kevin P. Moloney; Thitima Kongnakorn; Julie A. Jacko; Francois Sainfort; Pamela R. Oliver; Joseph Pizzimenti; Annette Bade; Greg Fecho; Josephine Shallo-Hoffmann
This paper examines factors that affect performance of a basic menu selection task by users who are visually healthy and users with Diabetic Retinopathy (DR) in order to inform better interface design. Interface characteristics such as multimodal feedback, Windows accessibility settings, and menu item location were investigated. Analyses of Variance (ANOVA) were employed to examine the effects of interface features on task performance. Linear regression was used to further examine and model various contextual factors that influenced task performance. Results indicated that Windows accessibility settings significantly improved performance of participants with more progressed DR. Additionally, other factors, including age, computer experience, visual acuity, and menu location were significant predictors of the time required for subjects to complete the task.
Image pre-compensation to facilitate computer access for users with refractive errors BIBAFull-Text 126-132
  Miguel Alonso, Jr.; Armando Barreto; J. Gualberto Cremades
The use of computer technology for everyday tasks has become increasingly important in today's world. Frequently, computer technology makes use of Graphical User Interfaces (GUIs), presented through monitors or LCD displays. This type of visual interface is not well suited for users with visual limitations due to refractive errors, particularly when they are severe and not correctable by common means. In order to facilitate computer access for users with refractive deficiencies, an algorithm was developed, using a priori knowledge of the visual aberration, to generate an inverse transformation of the images that are then displayed on-screen, countering the effect of the aberration. The result is that when the user observes the screen displaying the transformed images, the image perceived in the retina will be similar to the original image. The algorithm was tested by artificially introducing a spherical aberration in the field of view of 14 subjects, totaling 28 individual eyes. Results show that when viewing the screen, this method of compensation improves the visual performance of the subjects tested in comparison to viewing uncompensated images.
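A frequency-domain sketch of pre-compensation, under the assumption that the visual aberration can be modelled as a known blur kernel and that NumPy is available; the regularised inverse filter below is a generic stand-in for illustration, not the authors' algorithm.

# Illustrative frequency-domain sketch (not the authors' algorithm): given a
# known blur kernel modelling the eye's aberration, pre-compensate the image
# with a regularised inverse filter so that blur(precompensated) ~ original.
import numpy as np

def precompensate(image, kernel, eps=1e-2):
    """image: 2-D array in [0, 1]; kernel: 2-D point-spread function."""
    H = np.fft.fft2(kernel, s=image.shape)          # transfer function of the blur
    F = np.fft.fft2(image)
    # Regularised inverse: avoids blowing up where |H| is close to zero.
    G = F * np.conj(H) / (np.abs(H) ** 2 + eps)
    out = np.real(np.fft.ifft2(G))
    return np.clip(out, 0.0, 1.0)                   # keep displayable intensities

# Toy example: a flat 5x5 box blur standing in for a measured aberration.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
psf = np.ones((5, 5)) / 25.0
display_img = precompensate(img, psf)
print(display_img.shape, display_img.min() >= 0.0, display_img.max() <= 1.0)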
Nonvisual tool for navigating hierarchical structures BIBAFull-Text 133-139
  Ann C. Smith; Justin S. Cook; Joan M. Francioni; Asif Hossain; Mohd Anwar; M. Fayezur Rahman
The hierarchical structure of a program can be quite complex. As such, many Integrated Development Environments (IDEs) provide graphical representations of program structure at different levels of abstraction. Such representations are not very accessible to non-sighted programmers, as screen readers are not able to portray the underlying hierarchical structure of the information. In this paper, we define a set of requirements for an accessible tree navigation strategy. An implementation of this strategy was developed as a plug-in to the Eclipse IDE and was tested by twelve student programmers. The evaluation of the tool shows the strategy to be an efficient and effective way for a non-sighted programmer to navigate hierarchical structures.
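A generic illustration of cursor-style navigation over a hierarchy with spoken feedback at every step; the command set and feedback wording are assumptions for illustration, not the plug-in described above.

# Generic illustration (not the authors' plug-in): a cursor over a hierarchy
# that returns a spoken-feedback string after every move, so a non-sighted
# user always hears where they are and how deep they have moved.
class TreeCursor:
    def __init__(self, tree, path=()):
        self.tree, self.path = tree, list(path)   # tree: {"name":..., "children":[...]}

    def _node(self):
        node = self.tree
        for i in self.path:
            node = node["children"][i]
        return node

    def down(self):                                # move to first child
        if self._node()["children"]:
            self.path.append(0)
        return self.speak()

    def up(self):                                  # move to parent
        if self.path:
            self.path.pop()
        return self.speak()

    def next(self):                                # move to next sibling, if any
        if self.path:
            parent = TreeCursor(self.tree, self.path[:-1])._node()
            if self.path[-1] + 1 < len(parent["children"]):
                self.path[-1] += 1
        return self.speak()

    def speak(self):
        node = self._node()
        return f"{node['name']}, level {len(self.path)}, {len(node['children'])} children"

program = {"name": "Project", "children": [
    {"name": "ClassA", "children": [{"name": "methodOne", "children": []}]},
    {"name": "ClassB", "children": []},
]}
cur = TreeCursor(program)
print(cur.down())   # ClassA, level 1, 1 children
print(cur.next())   # ClassB, level 1, 0 children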

Designing for accessibility

Designing a cognitive aid for the home: a case-study approach BIBAFull-Text 140-146
  Jessica Paradise; Elizabeth D. Mynatt; Cliff Williams; John Goldthwaite
Cognitive impairments play a large role in the lives of survivors of mild traumatic brain injuries who are unable to return to their prior level of independence in their homes. Computational support has the potential to enable these individuals to regain control over some aspects of their lives. Our research aims to carefully seek out issues that might be appropriate for computational support and to build enabling technologies that increase individuals' functional independence in the home environment. Using a case-study approach, we explored the needs of an individual with a cognitive impairment whose quality of life was negatively affected by her inability to pace herself during her morning routine, and informed the design of a pacing aid for her. The contributions of this research include the insights we gained with our methodology; two sets of design dimensions (user-centered constraints developed from the capabilities and preferences of our users, and system-centered capabilities that could be explored in potential designs); a design concept that illustrates the application of these design dimensions in a potential pacing aid; and evaluations of paper prototypes guided by the design dimensions.
Design and development of an indoor navigation and object identification system for the blind BIBAFull-Text 147-152
  Andreas Hub; Joachim Diepstraten; Thomas Ertl
In this paper we present a new system that assists blind users in orienting themselves in indoor environments. We developed a sensor module that can be handled like a flashlight by a blind user and can be used for search tasks within the three-dimensional environment. By pressing keys, inquiries concerning object characteristics, position, orientation and navigation can be sent to a connected portable computer, or to a federation of data servers providing models of the environment. Finally, these inquiries are answered acoustically via a text-to-speech engine.
The ethnographically informed participatory design of a PDA application to support communication BIBAFull-Text 153-160
  Rhian Davies; Skip Marcella; Joanna McGrenere; Barbara Purves
Aphasia is an acquired communication deficit that impacts the different language modalities. PDAs have a form factor and feature set that suggest they could be effective communication tools for people with aphasia. An ethnographic study was conducted with one participant both to learn about communication strategies used by people with aphasia, and to observe how a PDA is incorporated into those strategies. The most significant usability issues found were file access and organization. A participatory design phase followed, resulting in a paper prototype of a file management system that addressed the key usability issues identified. The participatory approach continued during the implementation of a high-fidelity prototype.
visiBabble for reinforcement of early vocalization BIBAFull-Text 161-168
  Harriet Fell; Cynthia Cress; Joel MacAuslan; Linda Ferrier
The visiBabble system processes infant vocalizations in real-time. It responds to the infant's syllable-like productions with brightly colored animations and records the acoustic-phonetic analysis. The system reinforces the production of syllabic utterances that are associated with later language and cognitive development. We report here on the development of the visiBabble prototype and field-testing of the system.

Web accessibility

A web accessibility service: update and findings BIBAFull-Text 169-176
  Vicki L. Hanson; John T. Richards
We report here on our progress on a project first described at the ASSETS 2002 conference. At that time, we had developed a prototype system in which a proxy server intermediary was used to adapt Web pages to meet the needs of older adults. Since that report, we have field tested the prototype and learned of problems with the proxy approach. We report on the lessons learned from that work and on our new approach to meeting the Web needs of older adults and users with disabilities. This new software makes adaptations on the client machine, with greater accuracy and speed than was possible with the proxy server approach. It transforms Web pages "on the fly", without requiring that all Web content be re-written. The new software has been in use for a year, and we report here on our findings from that usage. We discuss this approach in the context of Web accessibility standards and Web usability.
Accessibility designer: visualizing usability for the blind BIBAFull-Text 177-184
  Hironobu Takagi; Chieko Asakawa; Kentarou Fukuda; Junji Maeda
These days, accessibility-related regulations and guidelines have been accelerating the improvement of Web accessibility. One of the accelerating factors is the development and deployment of accessibility evaluation tools for authoring time and repair time. These tools mainly focus on creating compliant Web sites by analyzing the HTML syntax of pages, and report that pages are compliant when there are no syntactical errors. However, such compliant pages are often not truly usable by blind users, because current evaluation tools merely check whether the HTML tags are used in a way that complies with regulations and guidelines. It would be better if such tools paid more attention to real usability, especially to time-oriented usability factors such as the time needed to reach target content, the ease of understanding the page structure, and the navigability, in order to help Web designers create pages that are not simply compliant but also usable for the blind. Therefore, we decided to develop Accessibility Designer (aDesigner), which can visualize blind users' usability by using colors and gradations. The visualization function allows Web designers to grasp the weak points in their pages, and to recognize at a glance how accessible or inaccessible their pages are. In this paper, after reviewing the related work, we describe our approach to visualizing blind users' usability, followed by an overview of Accessibility Designer. We then report on our evaluations of real Web sites using Accessibility Designer. After discussing the results, we conclude the paper.
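A toy sketch of one time-oriented usability measure of the kind mentioned above: estimating how long a linear screen-reader pass takes to reach each element and mapping that time to a shade. The speech-rate model and colour scale are assumptions for illustration, not aDesigner's.

# Illustrative sketch (assumed model, not aDesigner's): estimate how long a
# linear screen-reader pass takes to reach each element, and map that time to
# a colour so hard-to-reach content can be shaded on the page.
WORDS_PER_SECOND = 3.0   # assumed speech rate

def reaching_times(element_word_counts):
    """Cumulative seconds of speech before each element is reached."""
    times, elapsed = [], 0.0
    for words in element_word_counts:
        times.append(elapsed)
        elapsed += words / WORDS_PER_SECOND
    return times

def shade(seconds, worst=90.0):
    """Map reaching time to a grey level: white = instantly reachable, black = worst."""
    level = max(0, 255 - int(255 * min(seconds, worst) / worst))
    return f"#{level:02x}{level:02x}{level:02x}"

counts = [12, 40, 150, 8, 60]            # words spoken for each page element
for t in reaching_times(counts):
    print(f"{t:5.1f}s -> {shade(t)}")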
Semantic bookmarking for non-visual web access BIBAFull-Text 185-192
  Saikat Mukherjee; I. V. Ramakrishnan; Michael Kifer
Bookmarks are shortcuts that enable quick access to desired Web content. They have become a standard feature in any browser, and recent studies have shown that they can be very useful for non-visual Web access as well. Current bookmarking techniques in assistive Web browsers are rigidly tied to the structure of Web pages. Consequently, they are susceptible to even slight changes in that structure. In this paper we propose semantic bookmarking for non-visual Web access. With the help of an ontology that represents concepts in a domain, content in Web pages can be semantically associated with bookmarks. As long as these associations can be identified, semantic bookmarks are resilient in the face of structural changes to the Web page. The use of ontologies allows semantic bookmarks to span multiple Web sites covered by a common domain, which contributes to the ease of information retrieval and bookmark maintenance. In this paper we describe highly automated techniques for creating and retrieving semantic bookmarks. These techniques have been incorporated into an assistive Web browser. Preliminary experimental evidence suggests the effectiveness of semantic bookmarks for non-visual Web access.
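A minimal sketch of the semantic-bookmark idea, assuming a toy "ontology" of concepts mapped to content patterns; the patterns and the news domain are invented for illustration and are not the authors' system.

# Illustrative sketch (invented patterns, not the authors' system): a semantic
# bookmark names a domain concept, and is resolved by matching page content
# against the concept's patterns instead of a fixed position in the page.
import re

# A toy "ontology": concepts mapped to regular expressions over page markup.
NEWS_CONCEPTS = {
    "headline": re.compile(r"<h1[^>]*>(.*?)</h1>", re.I | re.S),
    "byline":   re.compile(r'<p class="byline">(.*?)</p>', re.I | re.S),
}

def resolve_bookmark(concept, html, ontology=NEWS_CONCEPTS):
    """Return the content a semantic bookmark points at, or None if absent."""
    pattern = ontology.get(concept)
    match = pattern.search(html) if pattern else None
    return match.group(1).strip() if match else None

page = """<html><body>
  <div id="nav">...</div>
  <h1 class="story">Assistive browser ships</h1>
  <p class="byline">By A. Reporter</p>
</body></html>"""
print(resolve_bookmark("headline", page))   # Assistive browser ships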