| Towards a Visual Speech Learning System for the Deaf by Matching Dynamic Lip Shapes | | BIBAK | Full-Text | 1-9 | |
| Shizhi Chen; D. Michael Quintian; YingLi Tian | |||
| In this paper we propose a visual-based speech learning framework to assist
deaf persons by comparing the lip movements between a student and an E-tutor in
an intelligent tutoring system. The framework utilizes lip reading technologies
to determine whether a student has learned the correct pronunciation. Unlike
conventional speech recognition systems, which usually recognize a speaker's
utterance, our speech learning framework focuses on recognizing, from visual
information alone, whether a student's pronunciation matches an instructor's
utterance. We propose a method that extracts dynamic shape difference
features (DSDF) from lip shapes to recognize the pronunciation difference (see the sketch after this entry).
The preliminary experimental results demonstrate the robustness and
effectiveness of our approach on a database we collected, which contains
multiple persons speaking a small number of selected words. Keywords: Lip Reading; Speech Learning; Dynamic Shape Difference Features; Deaf people | |||
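The abstract above does not spell out how the DSDF are computed, so the following is only an illustrative sketch under stated assumptions: lip shapes are given as per-frame landmark arrays, the two utterances are aligned by simple resampling, and the feature is the frame-wise difference between normalized shapes. All function and variable names are hypothetical, not the authors' implementation.

```python
import numpy as np

def resample(shapes, num_frames):
    """Linearly resample a (T, P, 2) sequence of lip-landmark shapes to num_frames frames."""
    shapes = np.asarray(shapes, dtype=float)
    t_old = np.linspace(0.0, 1.0, len(shapes))
    t_new = np.linspace(0.0, 1.0, num_frames)
    flat = shapes.reshape(len(shapes), -1)
    cols = [np.interp(t_new, t_old, flat[:, d]) for d in range(flat.shape[1])]
    return np.stack(cols, axis=1).reshape(num_frames, *shapes.shape[1:])

def normalize(shape):
    """Remove translation and scale so only the lip shape itself is compared."""
    centered = shape - shape.mean(axis=0)
    norm = np.linalg.norm(centered)
    return centered / norm if norm > 0 else centered

def dsdf(student_shapes, tutor_shapes, num_frames=30):
    """Frame-wise differences between two normalized, time-aligned lip-shape sequences."""
    s = resample(student_shapes, num_frames)
    t = resample(tutor_shapes, num_frames)
    diffs = [normalize(a) - normalize(b) for a, b in zip(s, t)]
    return np.concatenate([d.ravel() for d in diffs])  # one feature vector per utterance pair

# Toy example: two synthetic utterances, 20 lip landmarks per frame.
rng = np.random.default_rng(0)
features = dsdf(rng.normal(size=(42, 20, 2)), rng.normal(size=(55, 20, 2)))
print(features.shape)  # (30 * 20 * 2,) -- would feed a classifier such as an SVM
```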
| Teaching Support Software for Hearing Impaired Students Who Study Computer Operation | | BIBAK | Full-Text | 10-17 | |
| Makoto Kobayashi; Takuya Suzuki; Daisuke Wakatsuki | |||
| We developed teaching support software, named SZKIT, for hearing-impaired
students who study computer operation. The software shows icons of modifier
keys when the teacher presses them and shows a mouse icon when the teacher
clicks a mouse button. These icons appear near the mouse cursor, which makes
the difference between simple dragging and dragging with a modifier key
distinguishable. Without voice information it is difficult for hearing-impaired
students to distinguish such differences, because the motions of the mouse
cursor on the screen are almost the same. SZKIT can also show instruction texts
under the mouse cursor; the timing of changing the texts is controlled by a hot
key, keeping the focus on the main application software. Questionnaire results
from the hearing-impaired students show that SZKIT is useful for learning
computer operation. Keywords: hearing impaired student; learning computer operation; modifier keys | |||
| The Hybrid Book -- One Document for All in the Latest Development | | BIBAK | Full-Text | 18-24 | |
| Petr Hladík; Tomáš Gura | |||
| The term "Hybrid Book" stands for a digital document with a synchronized
multimedia content. In the narrower sense, the Hybrid Book is a name of a
technology used at Masaryk University for creation of study materials for users
with a variety of information channel impairments: the blind, the deaf,
dyslectics, and others. A document in this format can include a digital text,
an audio recording of a text read by a human voice, and a video recording of a
translation of a text into a sign language. These records are shown
simultaneously by the given software application when browsing documents. A
user can navigate in documents using a variety of specific navigation
functions. The Hybrid Book does not only compensate for an impaired information
channel; it can also be used, for example, as a unique system for the creation
of foreign language textbooks. Keywords: (e)Accessibility; Assistive Technology; Design for All; eLearning and
Universal Learning Design | |||
| Dealing with Changes in Supporting Students with Disabilities in Higher Education | | BIBAK | Full-Text | 25-32 | |
| Andrea Petz; Klaus Miesenberger | |||
| This paper discusses necessary changes and adaptations faced in supporting
students with disabilities at Linz University within the last 20 years and the
methodology used compared to other support schemes around Europe. The research
is based on three sources: findings from the study "Social Situation of People
with Disabilities in Austria", the only formal Austrian study that also deals
with disability and higher education (information on a possible "disability" is
numbered among "highly sensitive personal data" and is therefore not formally
surveyed during enrollment); findings from our own survey collecting information
from support structures for students with disabilities at universities in
Europe; and experiences from supporting students with the most diverse
(dis-)abilities, skills and knowledge. Keywords: Counseling; Support; Disability; Students; University; Higher Education;
Social Inclusion | |||
| Putting the Disabled Student in Charge: Introduction to the Special Thematic Session | | BIBA | Full-Text | 33-35 | |
| Lisa Featherstone; Simon Ball | |||
| Students with disabilities or impairments have often been passive recipients of 'inclusive practice' or 'assistive technology'. The Special Thematic Session (STS) on Putting the Disabled Student in Charge focusses on topics that have a direct impact upon the education of disabled students and covers all aspects of disabled students' education, from the development of resources to full participation in lectures and collaborative work and the provision of alternative formats. | |||
| Biblus -- A Digital Library to Support Integration of Visually Impaired in Mainstream Education | | BIBAK | Full-Text | 36-42 | |
| Lars Ballieu Christensen; Tanja Stevns | |||
| This paper presents the background, status, challenges and planned future
directions of the Danish Biblus project, which aims to create a digital library
solution that supports the integration of visually impaired pupils and
students in the mainstream educational system. Conceived both as a supplement
to the RoboBraille alternative media conversion system and as a stand-alone
repository for copyrighted educational material in alternate formats, Biblus
was created to allow students, teachers, visual impairment professionals and
relatives to access digital versions of educational material. Subject to proper
access rights, material can either be delivered directly to the user in the
formats stored in the library or indirectly via RoboBraille as mp3 files, Daisy
full text/full audio, e-books or Braille books. Future versions of Biblus will
be available in multiple languages and include digital rights management as
well as support for decentralised contribution of material. Keywords: Digital library; inclusion; integration; mainstreaming; educational
material; alternative media; Braille transcription; Daisy; mp3; e-book; blind;
partially sighted; visually impaired; dyslexic; dyslexia | |||
| Alternative Approaches to Alternative Formats -- Changing Expectations by Challenging Myths | | BIBAK | Full-Text | 43-50 | |
| Alistair McNaught; Lisa Featherstone | |||
| Traditional textbooks can be difficult for print impaired learners to
access. Every organisation has its own approach to providing alternative
formats but there are common myths that need to be dispelled: the myth of
responsibility, that alternative formats should be provided by disability
support staff; the myth of specialism, that disabled students should be dealt
with by a small team of specialist staff; and the myth that e-books are
automatically accessible. This paper suggests a move beyond alternative formats
to looking at alternative approaches to meeting the needs of print impaired
students. Keywords: e-books; alternative formats; dyslexia; libraries | |||
| Access Toolkit for Education | | BIBAK | Full-Text | 51-58 | |
| Mike Wald; E. A. Draffan; Russell Newman; Sebastian Skuse; Chris Phethean | |||
| This paper describes three tools that have been developed to help overcome
accessibility, usability and productivity issues identified by disabled
students. The Web2Access website allows users to test any Web 2.0 site or
software application against a series of checks linked to the WCAG 2.0 and
other guidelines. The Access Tools accessible menu helps with navigation to
portable pen drive applications that can assist with accessibility,
productivity and leisure activities when on the move. The accessible Toolbar
supports the majority of browsers and accessible websites through
magnification, spellchecking, text-to-speech readout, dictionary definitions
and referencing, and modification of text, page style, colour and layout. Keywords: accessibility; tool; learning | |||
| Community-Based Participatory Approach: Students as Partners in Educational Accessible Technology Research | | BIBAK | Full-Text | 59-64 | |
| Poorna Kushalnagar; Benjamin Williams; Raja S. Kushalnagar | |||
| This paper discusses the critical role of bringing together students with
disabilities as research partners using principles of community-based
participatory research (CBPR). Most accessible technology research approaches
include the target population as end-users, not as community partners. This
paper describes how CBPR can enhance designs and increase the likelihood of
effective and efficient end-user designs or prototypes that impact students
in education. We conclude with a discussion on how to empower students as
research partners using CBPR principles. Keywords: Accessible Technology Research; Design and Evaluation; Students with
Disabilities; Participatory Research | |||
| Applying New Interaction Paradigms to the Education of Children with Special Educational Needs | | BIBAK | Full-Text | 65-72 | |
| Paloma Cantón; Ángel L. González; Gonzalo Mariscal; Carlos Ruiz | |||
| The proliferation of new devices over the last decade has introduced new
ways of interaction such as tactile (iPhone [1]) or touchless gesture (Kinect
[2]) user interfaces. This opens up new opportunities for the education of
children with special needs. However, it also raises new issues. On the one
hand, children have to be able to manage different technologies, some of which
do not enable natural ways of interaction. On the other hand, software
developers have to design applications compatible with many different
platforms. This paper offers a state-of-the-art discussion about how new
interaction paradigms are being applied in the field of education. As a
preliminary conclusion, we have detected the need for a standard on
gesture-based interfaces. With this in mind, we propose a roadmap setting out
the essential steps to be followed in order to define this standard based on
natural hand movements. Keywords: SEN; Education; Touch; Touchless; Gesture; User Interface; Kinect;
Interaction Paradigms | |||
| InStep: A Video Database Assessment Tool | | BIBAK | Full-Text | 73-76 | |
| Fern Faux; David Finch; Lisa Featherstone | |||
| InStep is an Open Source video database assessment tool designed to provide
reliable assessment for students with LLDD in areas not covered by traditional
measures. Videos of students undertaking specific activities are shown
side-by-side so that changes in development over time can be seen. InStep has
been trialled to measure how well teachers can take suitable videos, the
reliability of the assessments and whether learners and their parents could
recognise progress using the tool. Keywords: Assessment; LLDD | |||
| SCRIBE: A Model for Implementing Robobraille in a Higher Education Institution | | BIBAK | Full-Text | 77-83 | |
| Lars Ballieu Christensen; Sean J. Keegan; Tanja Stevns | |||
| The provision of alternate formats for students with print-based
disabilities can be challenging. Producing educational material in alternate
formats is often time consuming, expensive and requires special knowledge and
training of staff. Therefore, in most settings, students are dependent on
others, such as disability service personnel or external producers, to obtain
their academic materials in their preferred accessible format. Even with these
resources available, students may still encounter delays in receiving their
alternate formats in a timely manner. For example, a student receiving an
inaccessible version of a hand-out or other academic content from a professor
on a Friday afternoon may be required to wait until the next business week to
receive an accessible version of the document as most institutions or external
providers do not run their alternate format production centres seven days per
week, year-round. The RoboBraille service offers fully automated conversion of
text into a number of alternate formats allowing the individual student to be
independent. This paper describes how the RoboBraille Service was turned into a
self-service solution for students at Stanford University, called the Stanford
Converter into Braille and E-Text -- or SCRIBE. The overall purpose of SCRIBE
is to encourage students to become self-sufficient by simplifying the
production of accessible formats. Keywords: Alternate formats; accessibility; self-sufficiency; conversion; educational
material; print-based disability; Braille; MP3; DAISY; e-books; student
independence | |||
| Identifying Barriers to Collaborative Learning for the Blind | | BIBAK | Full-Text | 84-91 | |
| Wiebke Köhlmann | |||
| Digital materials can help blind and visually impaired students to
participate in e-learning and collaborative settings. The use of multimedia
content enhances the learning experience of sighted students, but new barriers
arise for the visually impaired. This paper describes surveys on e-learning and
collaborative settings, defines existing barriers and presents a survey on
computer usage, e-learning and collaborative learning amongst 42 blind
and visually impaired users in educational and professional life. Keywords: Collaborative learning; CSCL; virtual classroom; e-learning; accessibility;
visually impaired; survey | |||
| Deaf and Hearing Students' Eye Gaze Collaboration | | BIBAK | Full-Text | 92-99 | |
| Raja S. Kushalnagar; Poorna Kushalnagar; Jeffrey B. Pelz | |||
| In mainstreamed lectures, deaf students face decision-making challenges in
shifting attention between the visual representation of the lecture audio,
i.e., a sign language interpreter or captions, and the simultaneous lecture
visual source, i.e., slides, whiteboard or demonstration. To reduce this
decision-making challenge for deaf student
subjects, we analyze the efficacy of using hearing students' eye gaze and
target as reference cues in lectures. When deaf students view the same lectures
with reference cues, they show less delay in switching to the active visual
information source and report high satisfaction with the reference cues. The
students who liked the cued notifications were more likely to demonstrate
reduction in delay time associated with shifting visual attention. Keywords: deaf; hearing; attention switching; cues | |||
| The Musibraille Project -- Enabling the Inclusion of Blind Students in Music Courses | | BIBAK | Full-Text | 100-107 | |
| José Antonio Borges; Dolores Tomé | |||
| The Musibraille Project was created to address the difficulties of including
blind students in music courses in Brazil. The strategy of this project
involves the development of powerful software for Braille music editing, the
building of an online library of Braille music and the delivery of intensive
courses on music transcription, both for blind and non-blind people. The
project is having an extraordinary effect on revitalizing Braille music in
the country, with hundreds of teachers and students already trained. Keywords: Assistive technology; Education of blind; Braille Music | |||
| Important New Enhancements to Inclusive Learning Using Recorded Lectures | | BIBAK | Full-Text | 108-115 | |
| Mike Wald | |||
| This paper explains three important new enhancements to Synote, the freely
available, award-winning, open-source, web-based application that makes
web-hosted recordings easier to access, search, manage, and exploit for learners,
teachers and other users. The facility to convert and import narrated
PowerPoint PPTX files means that teachers can capture and caption their
lectures without requiring expensive institution-wide lecture capture or
captioning systems. Crowdsourced correction of speech recognition errors
allows for sustainable captioning of any originally uncaptioned lecture while
the development of an integrated mobile speech recognition application enables
synchronized live verbal contributions from the class to also be captured
through captions. Keywords: speech recognition; recorded lectures; learning | |||
| Development of New Auditory Testing Media with Invisible 2-Dimensional Codes for Test-Takers with Print Disabilities | | BIBA | Full-Text | 116-123 | |
| Mamoru Fujiyoshi; Akio Fujiyoshi; Akiko Ohsawa; Toshiaki Aomatsu; Haruhiko Sawazaki | |||
| Utilizing invisible 2-dimensional codes and digital audio players with a 2-dimensional code scanner, we developed two types of new auditory testing media. The result of experimental evaluation of the new testing media shows that, in addition to existing special accommodations such as large-print-format test and braille-format test, the introduction of the new auditory testing media enables all test-takers with print disabilities, including the newly blind, the severely partially sighted and the dyslexic, to take the National Center Test for University Admissions. | |||
| More Accessible Math | | BIBAK | Full-Text | 124-129 | |
| John Gardner; Courtney Christensen | |||
| Blind people generally access written information linearly -- through
Braille or speech/audio. Math can be written in linear form, e.g. LaTeX,
MathML, computer programming languages, or word descriptions. These forms are
too verbose to be practical for reading any but the simplest math equations.
They are even worse for authoring or "doing pencil and paper math". Braille is
more useful, but relatively few blind people are fluent in any of the many
special Braille math codes, none of which is robust enough for back-translation
to be useful for authoring math. The authors of this paper have developed a
very compact notation, which could be the basis of a new math Braille font, but
which is useful today for reading / writing using computers with all common
speech screen readers. Translators to/from MathML have been written and
integrated with Microsoft Word / MathType. Preliminary usability data will be
reported. Keywords: linear math notation; Braille math codes; audio math | |||
| Accessible Authoring Tool for DAISY Ranging from Mathematics to Others | | BIBAK | Full-Text | 130-137 | |
| Katsuhito Yamaguchi; Masakazu Suzuki | |||
| Although DAISY is an excellent solution for various print-disabled people,
producing DAISY content is not necessarily an accessible task for them. In
particular, it is almost impossible for them to edit technical DAISY content
such as mathematics. Here, a new accessible authoring tool is presented that
enables both sighted people and the print disabled to easily produce and edit
DAISY books, from mathematical content to other material. Since a new function
to control speech output is implemented, all content is read out correctly with
speech synthesis. This approach can also be applied to DAISY content in many
languages other than English or Japanese. Keywords: DAISY; mathematics; authoring tool; speech control | |||
| Blind Friendly LaTeX | | BIBAK | Full-Text | 138-141 | |
| Wanda Gonzúrová; Pavel Hrabák | |||
| This article focuses on the accessibility of study materials containing
mathematics to visually impaired students and students with learning
disabilities. The electronic editable document (EED) is introduced within the
legislative frame of the "Rules for providing support to the public
universities" in the Czech Republic. An idea how to fulfil the requirements of
EED by creating a document combining structured text in MS Word with
mathematics in LaTeX code is presented. For this purposes it is necessary to
define strict and simple rules for LaTeX keeping the code translatable. Basic
principles of Czech standard for mathematics in Braille are presented as an
inspiration. Keywords: Czech standards for mathematics in Braille; electronic editable document
(EED); blind friendly LaTeX | |||
| A System for Matching Mathematical Formulas Spoken during a Lecture with Those Displayed on the Screen for Use in Remote Transcription | | BIBA | Full-Text | 142-149 | |
| Yoshinori Takeuchi; Hironori Kawaguchi; Noboru Ohnishi; Daisuke Wakatsuki; et al | |||
| A system is described for extracting and matching mathematical formulas presented orally during a lecture with those simultaneously displayed on the lecture room screen. Each mathematical formula spoken by the lecturer and displayed on the screen is extracted and shown to the transcriber. Investigation showed that, in a lecture in which many mathematical formulas were presented, about 80% of them were both spoken and pointed to on the screen, meaning that the system can help a transcriber correctly transcribe up to 80% of the formulas presented. A speech recognition system is used to extract the formulas from the lecturer's speech, and a system that analyzes the trajectory of the end of the stick pointer is used to extract the formulas from the projected images. This information is combined and used to match the pointed-to formulas with the spoken ones. In testing using actual lectures, this system extracted and matched 71.4% of the mathematical formulas both spoken and displayed and presented them for transcription with a precision of 89.4%. | |||
| Supporting Braille Learning and Uses by Adapting Transcription to User's Needs | | BIBAK | Full-Text | 150-157 | |
| Bruno Mascret; Alain Mille; Vivien Guillet | |||
| This paper focuses on how to improve accessibility for Braille readers on
the Internet. We criticize current technologies and show their limits for
scientific Braille and Braille personalization, especially in pedagogical
situations. We present NAT Braille, a free software solution designed to
respond to specific pedagogical needs. The transcription process uses a set of
customizable XSLT transformations and several XML formats (see the sketch after
this entry). We detail the design of NAT Braille and the technologies used for
transcription. We then explain why NAT Braille improves the personalization of
Braille rendering on the Internet. We give the example of our Mozilla
extension, which is able to transcribe web pages including MathML markup and is
set up with adapted transcription rules that take into account the user's
preferences. We conclude by raising issues related to
our work. Keywords: Accessibility; Braille; Pedagogy; Web based education | |||
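The abstract above describes the transcription pipeline as customizable XSLT transformations over XML formats. The minimal sketch below shows how such an XSLT-driven step can be scripted with lxml; the stylesheet name "braille_rules.xsl" and the "grade" parameter are hypothetical placeholders, not NAT Braille's actual artifacts.

```python
# Sketch of an XSLT-driven transcription step with lxml. "braille_rules.xsl"
# and the "grade" parameter are hypothetical placeholders, not NAT Braille's
# actual stylesheet or options.
from lxml import etree

def transcribe(document_path, stylesheet_path, **params):
    """Apply a customizable XSLT stylesheet to an XHTML/MathML document."""
    source = etree.parse(document_path)
    transform = etree.XSLT(etree.parse(stylesheet_path))
    # Stylesheet parameters let the same rules honour per-user preferences.
    xslt_params = {name: etree.XSLT.strparam(value) for name, value in params.items()}
    return transform(source, **xslt_params)

if __name__ == "__main__":
    result = transcribe("page.xhtml", "braille_rules.xsl", grade="2")
    print(str(result))
```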
| A Non-visual Electronic Workspace for Learning Algebra | | BIBA | Full-Text | 158-165 | |
| Nancy Alajarmeh; Enrico Pontelli | |||
| In this paper we describe a multi-layer system that is designed to help students who have moderate to severe visual impairments learn algebra by manipulating algebraic equations through an interactive non-visual web-based workspace. The functional algebraic transformation options provided through the system's various layers, together with the carefully designed help associated with each of these domain-specific manipulation functions, enhance the overall process by which students who are visually impaired learn to solve equations in the developed non-visual workspace. | |||
| Interaction Design for the Resolution of Linear Equations in a Multimodal Interface | | BIBAK | Full-Text | 166-173 | |
| Silvia Fajardo-Flores; Dominique Archambault | |||
| This article belongs to the field of Human-Computer Interaction, in the
context of access to mathematics for people with visual disabilities. In a
school scenario, blind students who learn algebra need to work on mathematical
expressions and to collaborate and communicate with their classmates and
teacher. This interaction is not straightforward between students with and
without sight, due to the different modalities they use to represent
mathematical content and to work with it. The computer presents a great
opportunity to promote this type of interaction, because it allows the
multimodal representation of mathematical content. After conducting experiments
on linear equation solving with students with and without sight, we have
modelled their intentions and actions, and we present a
proposal for the interactions required in a multimodal interface serving this
purpose. Lastly, we consider the possibilities and limitations for
implementation. Keywords: visual disability; accessibility; mathematics; HCI | |||
| Development of Software for Automatic Creation of Embossed Graphs | | BIBAK | Full-Text | 174-181 | |
| Tetsuya Watanabe; Toshimitsu Yamaguchi; Masaki Nakagawa | |||
| To investigate the appropriate representation of numerical data for blind
people, a user experiment was conducted. Its results show that embossed graphs
give quicker and more accurate access to the data than braille and electronic
tables. Based on this observation, we started developing software for the
creation of embossed graphs that can be operated by blind people. So far, line
graphs can be created with this software. Keywords: Blind People; Tactile Graphs; Tabular Forms; Braille; Mathematics and
Science | |||
| Expression Rules of Directed Graphs for Non-visual Communication | | BIBA | Full-Text | 182-185 | |
| Ryoji Fukuda | |||
| This paper proposes expression rules for describing directed graphs, and corresponding explanation documents, for communication without visual information. The structures of directed graphs are often complicated, especially when they describe visual content. An evaluation method for the importance of the nodes and edges in these directed graphs is proposed, which simplifies the structures of the graphs. | |||
| How to Make Unified Modeling Language Diagrams Accessible for Blind Students | | BIBAK | Full-Text | 186-190 | |
| Karin Müller | |||
| In this paper, we present a survey of the material used in the computer
science lectures of two blind students, showing that they have to deal with a
large number and various types of UML diagrams. We also report on different
textual representations of UML and present our own solutions. Moreover, we
point to a current initiative, BLINDUML, which works on solutions for making
UML diagrams accessible. Keywords: accessible UML | |||
| AutOMathic Blocks Usability Testing Phase One | | BIBAK | Full-Text | 191-195 | |
| Yonatan Breiter; Arthur Karshmer; Judith Karshmer | |||
| The AutOMathic Blocks [1] system has been designed to help young blind
students learn arithmetic and beginning algebra through the use of tactile [2,
3] blocks that display their work in two-dimensional space. The traditional
method of presenting math problems uses special Braille-like codes that present
information in a linear form. Our hypothesis is that learning math via a
two-dimensional method will expedite and improve the learning experience for
young children. Before upgrading our prototype system, we have chosen to first
carry out usability experiments testing the advantage
of using tactile two-dimensional presentation methods. Keywords: AutOMathic; Blind; Math | |||
| MathInBraille Online Converter | | BIBA | Full-Text | 196-203 | |
| Klaus Miesenberger; Mario Batusic; Peter Heumader; Bernhard Stöger | |||
| MathInBraille offers an online portal for converting mathematical formulae and e-documents with mathematical content into Braille and spoken formats. MathInBraille provides an open conversion service that can be used for free by anybody, which should help increase the access, use and availability of math content for blind people. | |||
| The Effects of Teaching Mathematics to Students with Disabilities Using Multimedia Computer-Assisted Instruction Coupled with ARCS Model | | BIBAK | Full-Text | 204-206 | |
| Chen-Tang Hou; Chu-Lung Wu | |||
| This study aims to design Multimedia Computer Assisted Instruction (MCAI)
coupled with ARCS (Attention, Relevance, Confidence and Satisfaction) model of
learning motivation and to investigate the effects of teaching mathematics
using MCAI coupled with ARCS model for elementary school students with
disabilities. The participants are recruited from the resource room and general
classes. A multiple-probe-across-behaviors design is utilized in the study.
The independent variable is the MCAI strategy coupled with the ARCS model, and
the dependent variables are the performances in learning mathematics. The
results indicate that the MCAI program coupled with the ARCS model of learning
motivation promotes the participants' mathematics performance. Keywords: Multimedia Computer Assisted Instruction (MCAI); Students with Disabilities;
ARCS Model; Teaching Mathematics | |||
| Information Needs Related to ICT-Based Assistive Solutions | | BIBAK | Full-Text | 207-214 | |
| Renzo Andrich; Valerio Gower; Sabrina Vincenti | |||
| Within the ETNA project -- a European Thematic Network aimed at implementing
an EU-wide Portal devoted to ICT-based assistive technologies and
e-accessibility solutions -- a study was carried out to detect the information
needs of the various stakeholders involved, such as end-users of assistive
technologies, professionals in health, social services and education,
manufacturers and developers, policy makers and academics/researchers. Thirty
"search profiles" were identified, each related to a specific reason why
information may be sought in response to a specific information need that
people may encounter at given times. In turn, each profile involves a specific
body of information. The study provides detailed insight into the audience's
expectations, which is guiding the design of the future Portal. The Portal will
stem from the existing Portal of the European Assistive Technology Information
Network (EASTIN), enriched by the contributions of the ETNA project and
its "sister" ATIS4All Thematic Network. Keywords: Information needs; Information systems; Assistive solutions; eAccessibility
solutions | |||
| The European Assistive Technology Information Portal (EASTIN): Improving Usability through Language Technologies | | BIBAK | Full-Text | 215-222 | |
| Valerio Gower; Renzo Andrich; Andrea Agnoletto; Petra Winkelmann; Thomas Lyhne; et al | |||
| The EASTIN Portal -- which aggregates the contents of six national databases
and makes them searchable in 22 European languages -- is currently the major
information system on assistive technology available in Europe. Its usability
has recently been improved through the use of advanced language technologies,
thanks to the EU-funded project EASTIN-CL. The project developed three main
components (query processing, machine translation, and speech output) that have
been implemented and plugged into the existing EASTIN website. Keywords: Language technology; AT information; Search query processing | |||
| Use of Assistive Technology in Workplaces of Employees with Physical and Cognitive Disabilities | | BIBAK | Full-Text | 223-226 | |
| Kirsi Jääskeläinen; Nina Nevala | |||
| Information technology (IT), especially assistive devices and programs,
enables people with disabilities to work. The aim of this study was to determine
the knowledge and use of this IT among workers with disabilities in the open
labor market. The focus was on the IT accommodation solutions used in
workplaces and how these improved the working skills of disabled people. One
fourth (27%) of the participants considered their knowledge regarding assistive
technology to be very good or good, whereas 39% considered their knowledge to
be very poor or poor. Workers with visual disorders were the most aware of
assistive technology in computer work. Over half of the respondents indicated
that the user interface, display screen, and mouse settings of their computers
were not accommodated. Keywords: Assistive technology; Workplace Accommodation; Disability; Disabled workers;
Computer work; Information technology; Employment | |||
| Multimodal Guidance System for Improving Manual Skills in Disabled People | | BIBAK | Full-Text | 227-234 | |
| Mario Covarrubias; Elia Gatti; Alessandro Mansutti; Monica Bordegoni; Umberto Cugini | |||
| The paper describes a multimodal guidance system whose aim is to improve the
manual skills of people with specific disorders, such as Down syndrome, mental
retardation, blindness or autism. The system provides assistance in the
execution of 2D tasks, for example sketching, hatching and cutting operations,
through haptic and sound interaction. The haptic technology provides the
virtual path of 2D shapes through a point-based approach, while the sound
technology provides audio feedback on the user's actions while performing a
manual task, for example when starting or finishing a sketch, or alarms related
to the hand's velocity during sketching, filling or cutting operations.
Unskilled people use these interfaces in their educational
environment. Keywords: Haptic Guidance; Unskilled People; Sound Interaction | |||
| Identifying Barriers to Accessibility in Qatar | | BIBAK | Full-Text | 235-242 | |
| Erik Zetterström | |||
| To identify barriers to accessibility in Qatar, a study was conducted by
distributing a survey to 211 persons with disabilities and by conducting
interviews. Lack of awareness, lack of Assistive Technology in Arabic,
inaccessible ATMs and absence of assistive communication services are the
largest barriers. Keywords: statistics; Qatar; accessibility; disabilities | |||
| NCBI and Digital Literacy: A Case Study | | BIBAK | Full-Text | 243-250 | |
| Denise Leahy; Stuart Lawler | |||
| The European Commissioner with responsibility for the Digital Agenda has
declared that she wants to make "Every European Digital" [1] and it is accepted
that knowledge of computing is necessary for everyone in the Information
Society [2]. The knowledge and skills which are needed are often called "digital
literacy". The National Council for the Blind of Ireland (NCBI) has provided
training in the use of computers for over 15 years and, in 2010, decided to
take part in the European Computer Driving Licence (ECDL) programme and become
an authorised ECDL test centre. ECDL is a standard of digital literacy which is
accepted in 146 countries and has been taken by over 12 million people. This
paper is a case study of the implementation of the ECDL programme in NCBI. Keywords: Digital literacy; accessibility; ECDL; vision impairment | |||
| A User-Friendly Virtual Guide for Post-Rehabilitation Support Following Stroke | | BIBAK | Full-Text | 251-253 | |
| Sascha Sommer; Matthias Bartels; Martina Frießem; Joachim Zülch | |||
| Post-rehabilitation support aids socio-professional reintegration.
Information about options for post-rehabilitation support following stroke is
provided by an application based on Wiki-principles and semantic technologies
(Virtual Guide). Its core feature is a knowledge-management system. Regional health
care professionals contribute initial content for the database. User
involvement is facilitated by an interface based on internet blog posts
describing prototypical situations stroke patients face during
post-rehabilitation. Provided that sufficient users proactively make regular
contributions, the platform will ideally develop into a living system that
represents regional infrastructures for post-rehabilitation support accurately
and keeps this information up to date. Keywords: Stroke; semantic technologies; service delivery; socio-professional
reintegration; social innovation | |||
| Musicking Tangibles for Empowerment | | BIBAK | Full-Text | 254-261 | |
| Birgitta Cappelen; Anders-Petter Andersson | |||
| We present a novel approach towards understanding and design of interactive
music technology for people with special needs. The health effects of music are
well documented, but little research has been done and little interactive music
technology has been developed for Music Therapy and health improvement in
everyday situations. Further, the music technology that has been used exploits
little of the potential that current computer technology has to offer the Music
and Health and Music Therapy fields, because it is designed and used based on a
narrow perspective on technology and its potential. We present and argue for a broader
understanding of music technology for empowerment and health improvement,
building on a multidisciplinary approach with perspectives from Tangible
interaction design, empowerment and resource oriented Music Therapy. We call
this approach Musicking Tangibles, inspired by Christopher Small's term
"musicking". We also present two designed Musicking Tangibles, and argue for
their empowering qualities based on user observations. Keywords: Interaction Design; Empowerment; Tangibles; Music; Health | |||
| RHYME: Musicking for All | | BIBA | Full-Text | 262-269 | |
| Harald Holone; Jo Herstad | |||
| This paper describes the RHYME project, aimed at children with multiple disabilities, their families and caregivers. The goal in this cross-disciplinary project is to create and evaluate platforms for co-creation through music and physical interaction in order to improve health and well-being for the participants. The paper has two main contributions: 1) a review and discussion of Participatory Design in Design for All, and 2) Tangible Interaction and familiarity as a basis for the possibility of musicking for all, for children, their families and caregivers, on individual terms. | |||
| Enhancing Audio Description: A Value Added Approach | | BIBAK | Full-Text | 270-277 | |
| Jack Sade; Komal Naz; Malgorzata Plaza | |||
| Audio Description (AD) makes films, shows and TV programs accessible to
visually impaired audiences. It is expensive, so wide adoption of this
technology is not practical. The Canadian Radio-television and
Telecommunications Commission requires broadcasters to describe a minimum of
four hours of primetime programming a week, and production companies see no
incentive to move beyond this required minimum. This paper investigates the
possibility of making AD profitable by making a described movie, show or
program attractive to all kinds of audiences, including the visually impaired.
We argue that AD can become a revenue-generating product widely adopted by
production companies. Keywords: AD; Business Analysis | |||
| Triple Helix -- In Action? | | BIBAK | Full-Text | 278-283 | |
| Niels Henrik Helms; Susanne Tellerup | |||
| This paper presents i-Space, a project about learning and playful
applications that can also document performance. The target group is mentally
impaired citizens. The project is used as a reference for a discussion of
structures within innovation processes, which in turn leads to a discussion
of the user as a sense-making category in multi-disciplinary settings. Keywords: Innovation; Triple Helix; quadrant model; user categories | |||
| Virtual User Models for Designing and Using of Inclusive Products: Introduction to the Special Thematic Session | | BIBA | Full-Text | 284-287 | |
| Yehya Mohamad; Manfred Dangelmaier; Matthias Peissner; Pradipta Biswas; et al | |||
| This STS on Virtual User Models for designing and using inclusive products is targeted towards generic interoperable user models that describe the relevant characteristics of users who will interact with products and user interfaces. A user profile is an instantiation of a user model representing either a specific user or a representative of a group of users [1]. With such a model, designers can define as many user profiles as needed to address the whole range of requirements of a target population, in order to maximize the level of accessibility of products and services according to the selected user profile. The papers in this STS address many of the issues addressed by the VUMS cluster of projects. The cluster is formed by four projects funded by the European Commission under the Theme "FP7-ICT-2009.7.2 Accessible and Assistive ICT"; the projects are VICON, MyUI, GUIDE and VERITAS (http://www.veritas-project.eu/vums/). | |||
| Creative Design for Inclusion Using Virtual User Models | | BIBA | Full-Text | 288-294 | |
| Markus Modzelewski; Michael Lawo; Pierre Kirisci; Joshue O. Connor; Antoinette Fennell; et al | |||
| The development of products that are accessible to the largest possible group of users can be regarded as a major challenge for manufacturers of consumer products. It is therefore crucial that the product development process is supported by practical methods and tools that can help incorporate these essential human factors in the early phases of the development process. Ergonomics evaluation and user testing with real users are user-centred design methodologies often conducted by companies, but they are not only complex, they can also be very time- and cost-intensive. As an alternative approach, virtual user models (VUM) have been proposed for supporting the early phases of the product development process. In this paper we present the model-based design approach of the European research project VICON, supporting inclusive design of consumer products particularly at the early stages of product development. | |||
| A Methodology for Generating Virtual User Models of Elderly and Disabled for the Accessibility Assessment of New Products | | BIBAK | Full-Text | 295-302 | |
| Nikolaos Kaklanis; Konstantinos Moustakas; Dimitrios Tzovaras | |||
| The paper presents a highly novel user modeling framework for the detailed
description of geometric, kinematic, physical, behavioral and cognitive aspects
of elderly users and users affected by disabilities. Several aspects of the
user's interaction behavior are examined, and user models are quantified, in
terms of their kinematic and dynamic parameters, in tests with disabled users
on a multisensorial platform, in order to develop accurate and realistic
virtual user models. Hierarchical Task and Interaction Models are introduced to
describe the user's capabilities at multiple scales of abstraction.
Alternative ways of executing a user task, using different
modalities and assistive devices, are also supported by the proposed task
analysis. Keywords: User modeling; UsiXML; virtual user; elderly; disabled; simulation;
accessibility evaluation; ergonomy evaluation | |||
| VERITAS Approach for Parameterization of Psychological and Behavioral Models | | BIBAK | Full-Text | 303-310 | |
| Ana María Navarro; Juan Bautista Mocholí; Juan Carlos Naranjo | |||
| This paper describes the approach used to parameterize the psychological and
behavioural user models developed under the EU-funded FP7 project VERITAS
(Virtual and Augmented Environments and Realistic User Interactions To achieve
Embedded Accessibility DesignS). It focuses on the methodology used to define
the relevant psychological and behavioural parameters within the context of
VERITAS. Two complementary
approaches have been selected: on one hand, the use of existing models of the
cognitive architecture Adaptive Control of Thought-Rational (ACT-R) for
cognitive simulation purposes; on the other hand, a second approach based on
existing metrics coming from medical and human behavior studies and biomedical
models. Keywords: Psychological; cognitive models; ACT-R; VERITAS; accessibility; cognitive
architectures; cognitive simulation | |||
| Integration of a Regular Application into a User Interface Adaptation Engine in the MyUI Project | | BIBAK | Full-Text | 311-314 | |
| Alejandro García; Jesús Sánchez; Víctor Sánchez; José Alberto Hernández | |||
| Software development is increasingly focusing its design on users with
different kinds of disabilities or impairments, as is the case for elderly or
handicapped people, for instance. Real-time adaptable graphical user interfaces
are a promising solution for designing accessible applications for users with
special needs. Essentially, collecting context information and combining it
with information about the user can be used to customize the content of the
interface itself, thereby improving the user's experience when interacting with
the application.
The EU-funded FP7 MyUI project has emerged in the adaptive graphical interfaces domain, addressing important barriers which include developers' lack of awareness and expertise, the time and cost requirements of incorporating accessibility, and missing validated approaches and infrastructures for accessible software design. This paper presents the technology used and the experiences collected in the integration of a regular application into such a framework. Keywords: Adaptation; user interfaces; accessibility; elderly; disabilities; user
profiling | |||
| Using Annotated Task Models for Accessibility Evaluation | | BIBAK | Full-Text | 315-322 | |
| Ivo Malý; Jiri Bittner; Pavel Slavík | |||
| Evaluation of application accessibility is a challenging task that requires
intensive testing with potential application users. An alternative to user
tests is model-based testing using simulations. Simulations provide important
feedback about application accessibility, particularly when it is hard to
involve the target users in the tests, which is often the case for users with
disabilities. In this paper we propose a methodology for quickly and easily
providing the data necessary for the simulations. In particular, we show how to
annotate task models using application walkthrough logs, i.e., data obtained by
recording application usage. We create annotated task models which, together
with the user models, are suitable for simulating application usage by virtual
users with various disabilities. We present tools for recording and processing
the application walkthrough logs and tools for interactive task model
annotation. Finally, we provide actual examples of task model
annotation on three scenarios involving the Second Life metaverse. Keywords: Task Models; Accessibility Evaluation; User Centred Design and User
Involvement | |||
| Web Accessibility in Advanced Technologies | | BIBAK | Full-Text | 323-324 | |
| Shadi Abou-Zahra; Konstantinos Votis; Karel Van Isacker | |||
| The Web is rapidly evolving and converging with other media and
technologies. Today the Web is on mobile devices, televisions, self-service
terminals, and computer desktops. It continues to become increasingly
ubiquitous and indistinguishable from other interfaces and has become an
ambient part of our daily lives, particularly with the advancement of "the
cloud". Thus, there is a need for developers and designers to better understand
the relationship and overlap of the existing accessibility methodologies, and
to introduce Web accessibility in advanced and mainstream technologies so as to
provide accessible products that work better for people who experience
difficulties and changes in their abilities due to aging. Keywords: Web; Accessibility; Ubiquitous Web; Cloud Computing; Digital TV; People with
Disabilities; Aging Population | |||
| The eAccess+ Network: Enhancing the Take-Up of eAccessibility in Europe | | BIBA | Full-Text | 325-328 | |
| Klaus Miesenberger; Eric Velleman; David Crombie; Helen Petrie; Jenny S. Darzentas; et al | |||
| This short paper introduces the idea, the main tool and the work of the EU-supported eAccess+ network (www.eaccessplus.eu) for fostering the uptake of eAccessibility in Europe. The rationale for the network starts from the fact that a considerable and elaborated body of knowledge, established in the eAccessibility and Assistive Technology domain, exists but is rarely implemented in mainstream design. There are many reasons for this situation and the network is working to identify and address them and to start processes to remedy the situation. | |||
| A Method for Generating CSS to Improve Web Accessibility for Old Users | | BIBAK | Full-Text | 329-336 | |
| Jesia Zakraoui; Wolfgang Zagler | |||
| We propose a method to improve Web accessibility. First, we generate a list
of Cascading Style Sheets (CSS) for websites depending on the user's needs and
meaningful contextual information. Second, we rank this list in order to best
fit the current user. To provide the means for that, formally connected
knowledge about user interaction processes is used to support a reasoning unit
based on Answer Set Programming (ASP). Finally, visual aspects of user
interfaces such as the sizes of user interface elements, colours, the relative
position of elements or navigation devices are specified (a minimal sketch of
the final CSS-generation step follows this entry). In Web environments, user
interface adaptation is needed to tailor user interfaces to older people's
needs and impairments while preserving their independence. Keywords: Ontology; Answer Set Programming; Default knowledge; Web Accessibility;
Cascading style sheet; Context; User interaction | |||
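The abstract above describes generating and ranking CSS via ASP reasoning over user needs and context. The sketch below illustrates only the final step, emitting a CSS override from an already-chosen profile; the profile fields and default values are assumptions, and the ASP reasoning itself is not reproduced.

```python
# Illustrative only: emit a CSS override from a (hypothetical) user profile,
# i.e. the last step after the ASP-based reasoning has ranked candidate profiles.
def profile_to_css(profile):
    rules = ["body {{ font-size: {}%; line-height: {}; }}".format(
        profile.get("font_scale", 130), profile.get("line_height", 1.6))]
    if profile.get("high_contrast"):
        rules.append("body, p, li { color: #000000; background-color: #ffffff; }")
        rules.append("a { color: #0000a0; text-decoration: underline; }")
    if profile.get("enlarge_targets"):
        rules.append("a, button, input { padding: 0.4em; min-height: 2.2em; }")
    return "\n".join(rules)

elderly_user = {"font_scale": 160, "line_height": 1.8,
                "high_contrast": True, "enlarge_targets": True}
print(profile_to_css(elderly_user))
```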
| Implementing Web Accessibility: The MIPAW Approach | | BIBAK | Full-Text | 337-342 | |
| Jean-Pierre Villain; Olivier Nourry; Dominique Burger; Denis Boulay | |||
| This paper presents the elaboration of a model for the progressive
implementation of WCAG, centered on the notions of access to information and
essential user needs. MIPAW's main goal is to serve as a framework for the
elaboration of gradual implementation methodologies, of systems measuring the
real level of accessibility, and of efficient quality assurance management
systems. It is based on the state of the art, real-world experience, and
expertise in accessibility as well as quality assurance. The project aims at
providing methodological tools better suited to the constraints of web
industrialization, while preserving the deployment of real user-centric
accessibility. MIPAW is a project led as part of the activities of the
AccessiWeb GTA (Workgroup on Accessibility), and has received active support
from 16 of the most prominent French companies in the area of expertise in
digital accessibility. Keywords: WCAG; AccessiWeb; Accessibility; Progressive Enhancement; User centric;
Design for All; Quality Assessments; Access to information; Accessibility
Barrier | |||
| Accessibility of Dynamic Adaptive Web TV Applications | | BIBAK | Full-Text | 343-350 | |
| Daniel Costa; Nádia Fernandes; Carlos Duarte; Luís Carriço | |||
| In recent years, TVs have become platforms providing content and
entertainment services, such as video on demand, interactive advertising or
social networking. Often, these services are Web-based applications that run on
connected TVs or set-top boxes. Given TV's wide reach, it is paramount that TV
applications are designed so that information can be perceived by everyone,
i.e. that they are accessible. These applications increasingly present dynamic
aspects, which have been rendering traditional Web evaluation approaches
obsolete. Additionally, TV-based interaction has specificities that Web-based
evaluation is unable to cope with. In this paper, we present an automated
accessibility evaluation framework to address these challenges. It is based on
WCAG 2.0 and Digital TV guidelines. It supports evaluation of the code after
browser processing and scanning the whole set of application states. It is
capable of evaluating user interface adaptation based on selected user
profiles. The paper also presents the evaluation results of three TV based
applications according to the proposed framework, which allow a comparison of
results of pre and post browser processing as well as pre and post adaptation. Keywords: Web Accessibility; Web TV applications; Automated Evaluation; Rich Internet
Applications | |||
| Ontology Based Middleware for Ranking and Retrieving Information on Locations Adapted for People with Special Needs | | BIBAK | Full-Text | 351-354 | |
| Kevin Alonso; Naiara Aginako; Javier Lozano; Igor G. Olaizola | |||
| Current search tools for leisure or tourist services do not take into
account the special needs of a large number of people with functional
diversities. However, the combination of different semantic, web and storage
technologies makes it possible to enhance such search tools, allowing more
personalized searches. This contributes to the provision of better and more
suitable results. In this paper we propose an innovative ontology-driven
solution for personalized tourism directed to people with special needs. Keywords: information retrieval; ontology; special needs | |||
| Automatic Color Improvement of Web Pages with Time Limited Operators | | BIBAK | Full-Text | 355-362 | |
| Sébastien Aupetit; Alina Mereuta; Mohamed Slimane | |||
| Accessibility is unfortunately not among the main concerns when developing
web sites. Webmasters, mostly involuntarily, create numerous obstacles for
people with visual impairments. That is why it becomes fundamental to identify
the existing barriers and to propose solutions that at least diminish their
impact on the user. Accessibility guidelines such as WCAG 2.0 indicate that a
minimum difference of brightness, tonality and contrast is necessary to reach a
minimum level of accessibility. In numerous cases, web designers ignore
accessibility or limit their choices to a low level of it. For a user needing a
higher level of accessibility than the one offered by the web page, access to
information may be difficult. In this context, we propose to transform the
colors of web pages according to the user's needs with the help of a
client-side HTTP proxy. The requirements for the colors can be expressed as a
fitness function; to recolor the page and increase accessibility, it is enough
to minimize this fitness function (see the illustrative sketch after this entry).
Finding a minimum can be a time-consuming task not appropriate for real-time recoloring, so it can be treated as a search with varying time limits. In this article, our objective is to compare different search methods and their performance under a time limit: the search can be interrupted at any time. The studied methods are a random search, different types of pseudo gradient descent and an adaptation of the API metaheuristic. Finally, the different methods are compared. Keywords: accessibility; assistive technology; recoloring; web; optimization | |||
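As a concrete illustration of the fitness-function idea above, the sketch below scores foreground/background pairs with the WCAG 2.0 contrast-ratio formula and minimizes the shortfall with a trivial budget-limited random search, one of the strategy families the paper compares. The penalty form and the 4.5:1 target are assumptions, not the authors' exact objective.

```python
import random

def relative_luminance(rgb):
    """WCAG 2.0 relative luminance of an sRGB colour given as (r, g, b) in 0..255."""
    def channel(c):
        c = c / 255.0
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    lighter, darker = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

def fitness(pairs, target=4.5):
    """Sum of contrast shortfalls over all (foreground, background) text pairs."""
    return sum(max(0.0, target - contrast_ratio(fg, bg)) for fg, bg in pairs)

def random_search(pairs, budget=2000):
    """One of the simplest time-limited strategies: keep the best random recolouring."""
    best, best_score = list(pairs), fitness(pairs)
    for _ in range(budget):
        candidate = [(tuple(random.randrange(256) for _ in range(3)), bg) for _, bg in pairs]
        score = fitness(candidate)
        if score < best_score:
            best, best_score = candidate, score
    return best, best_score

page_pairs = [((120, 120, 120), (255, 255, 255)), ((200, 40, 40), (230, 230, 230))]
print(fitness(page_pairs))          # shortfall before recoloring
print(random_search(page_pairs)[1]) # shortfall of the best candidate found in the budget
```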
| Improving Web Accessibility for Dichromat Users through Contrast Preservation | | BIBAK | Full-Text | 363-370 | |
| Alina Mereuta; Sébastien Aupetit; Mohamed Slimane | |||
| Unfortunately, accessibility is not one of designers' priorities when
developing web sites, resulting in barriers for numerous disabled users. In
this context, it is fundamental to identify the difficulties they may
experience while surfing the web and to propose solutions that remove them
or diminish their impact. The choice of colors is far from a random process;
it is often a way to transmit or emphasize information. This is particularly
true for textual information contained in a web page. The perception of colors
by a dichromat user is different, which results in a loss of the information
conveyed by color. In our study, we show that there is a significant loss of
contrast for a dichromat user, resulting in information loss. We propose a
method based on a mass-spring simulation to modify the colors with the aim of
enforcing similar contrast for dichromat users (a simplified sketch of the idea
follows this entry). Tests on several websites allow us to conclude that our
method significantly reduces the loss of contrast for both protanope and
deuteranope users. Keywords: assistive technology; accessibility; dichromacy; web sites; contrast
preservation | |||
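The following is a deliberately simplified, one-dimensional reading of the mass-spring idea above, assuming colours are reduced to luminances and springs try to restore the original pairwise luminance differences; the authors' actual model, including the dichromat colour-vision simulation, is not reproduced here.

```python
# Simplified 1-D sketch: each text/background colour is reduced to a luminance,
# springs pull pairs of luminances toward their original differences (the
# "contrast" to preserve), and a few relaxation steps move the values.
def relax_luminances(lums, rest_lengths, steps=200, k=0.1):
    lums = list(lums)
    n = len(lums)
    for _ in range(steps):
        forces = [0.0] * n
        for i in range(n):
            for j in range(i + 1, n):
                current = lums[j] - lums[i]
                stretch = current - rest_lengths[(i, j)]  # deviation from desired difference
                forces[i] += k * stretch
                forces[j] -= k * stretch
        lums = [min(1.0, max(0.0, l + f)) for l, f in zip(lums, forces)]
    return lums

# Desired differences come from the original page; the values below have
# drifted closer together, mimicking contrast lost after a colour-vision simulation.
rest = {(0, 1): 0.5, (0, 2): 0.9, (1, 2): 0.4}
perceived = [0.30, 0.45, 0.60]
print(relax_luminances(perceived, rest))  # spreads the values back apart
```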
| Sociological Issues of Inclusive Web Design | | BIBAK | Full-Text | 371-377 | |
| Michael Pieper | |||
| The German BIENE award (Barrierefreies Internet Eröffnet Neue
Einsichten / Accessible Internet Provides New Insights), a best-practice
competition for accessible websites organized by the social association "Aktion
Mensch" and the foundation "Digitale Chancen", is entering a new competitive
phase. For the 2010 competition, 224 web pages were checked for barrier-free
accessibility. Web applications that facilitate the interactive sharing of
user-generated content are of particular importance when it comes to Web 2.0
technologies. In this respect it soon turned out that Web 2.0 services cannot
be made accessible by applying common design guidelines and ad-hoc adaptations
alone. In addition to conventional software-ergonomic verification procedures,
accessibility validation has to rely on sociological reasoning about unique
Web 2.0 entities and the corresponding usage obstacles.
Empirically these considerations have been conceptualized by an online survey
amongst 671 respondents with all kinds of different disabilities, carried out
by "Aktion Mensch". Keywords: Accessibility; Usability; Human-Computer Interaction; Web 2.0. | |||
| Online Shopping Involving Consumers with Visual Impairments -- A Qualitative Study | | BIBAK | Full-Text | 378-385 | |
| Elisabeth Fuchs; Christine Strauss | |||
| Despite the general popularity of online shopping, its use is not equally
open to all user groups. In this context especially, consumers with visual
impairments are often faced with challenging barriers. To provide a better
understanding of their actual needs and to identify experienced difficulties,
personal in-depth interviews were conducted with visually impaired users. The
obtained results of this empirical qualitative study form a knowledge base of
consumer insights, which can be further used as a source for target-group
specific improvements and innovations. Keywords: visual impairment; online shopping; web accessibility; e-inclusion consumer
research; qualitative study | |||
| Website Accessibility Metrics: Introduction to the Special Thematic Session | | BIBAK | Full-Text | 386-387 | |
| Shadi Abou-Zahra | |||
| In many situations it is useful to measure the level of accessibility of
websites on a more continuous scale rather than the limited set of
four ordinal values (none, A, AA, and AAA) proposed by the W3C/WAI Web Content
Accessibility Guidelines (WCAG). For example, a continuous scale allows more
granular benchmarking of websites to compare them or to help assess
improvements made over time. However, finding reliable metrics is a non-trivial
challenge for a variety of reasons. This paper introduces a Special Thematic
Session to explore this challenge, further to a previously held online
symposium of the W3C/WAI Research and Development Working Group (RDWG). Keywords: Web Accessibility; Accessibility Metrics; Benchmarking; Quality Assurance;
Web Content Accessibility Guidelines (WCAG) | |||
| Integrating Manual and Automatic Evaluations to Measure Accessibility Barriers | | BIBAK | Full-Text | 388-395 | |
| Paola Salomoni; Silvia Mirri; Ludovico A. Muratori; Matteo Battistelli | |||
| Explicit syntax and implicit semantics of Web coding are typically addressed
as distinct domains when providing metrics for content accessibility. A more
realistic picture of barriers and their impact on users with disabilities could
be obtained if quantitative syntheses of the number and size of barriers
integrated measurements from automatic checks with human assessments. In this
work, we present a metric that evaluates accessibility as a single measure of
both syntactic correctness and semantic consistency, according to some general
assumptions about the relationships and dependencies between them. WCAG 2.0
guidelines are used to define the boundaries of each single barrier evaluation,
either from a syntactic point of view or from a subjective/human one. To assess
our metric, data gathered from a large-scale accessibility monitor have been
utilized. Keywords: Web accessibility metrics; Web accessibility barriers; accessibility
evaluation | |||
| Assessing the Effort of Repairing the Accessibility of Web Sites | | BIBAK | Full-Text | 396-403 | |
| Nádia Fernandes; Luís Carriço | |||
| The paper presents a new metric and a framework to assess the effort of
repairing the accessibility of a Web site. To that end, all the HTML elements of
all the pages of a site are considered, excluding those that are duplicated.
The rationale is that duplicated elements originate in a reusable construct,
such as a template, and therefore need to be corrected only once. The
evaluation then applies the accessibility evaluation techniques to those
elements instead of to all the instances presented to the user. The reported
fails and warnings are then combined in a simple sum metric.
The paper also describes a validation experiment of both the metric and the framework, providing important results. These may well give managers and development team leaders a different perspective on the effort required to revamp the accessibility of a site. Keywords: Web Accessibility; Templates; Automated Evaluation; Metrics | |||
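As a rough illustration of the metric described above (deduplicate elements assumed to come from templates, evaluate only the remaining ones, then sum fails and warnings), a sketch with a hypothetical data model and check function might be:

```python
# Minimal sketch of the deduplicate-then-sum idea; the element representation
# and the check() callback are placeholders, not the authors' framework.
def repair_effort(pages, check):
    seen = set()
    fails, warnings = 0, 0
    for page in pages:
        for element in page:                 # element: any hashable representation
            if element in seen:              # duplicate -> assumed to come from a template
                continue
            seen.add(element)
            result = check(element)          # returns "fail", "warn" or "pass"
            if result == "fail":
                fails += 1
            elif result == "warn":
                warnings += 1
    return fails + warnings                  # simple sum metric from the abstract
```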
| Lexical Quality as a Measure for Textual Web Accessibility | | BIBA | Full-Text | 404-408 | |
| Luz Rello; Ricardo Baeza-Yates | |||
| We show that a recently introduced lexical quality measure is also valid for measuring textual Web accessibility. Our measure estimates the lexical quality of a site based on the occurrence, in its English Web pages, of a set of more than 1,345 words with errors. We then compute the correlation of our measure with Web popularity measures to show that it provides independent information. This, together with our previous results, implies that this measure maps to some of the WCAG accessibility principles. | |||
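A hedged sketch of the counting step behind such a lexical quality estimate (the misspelling list and the exact formula are illustrative, not the authors' 1,345-word resource):

```python
import re

# Rate at which known misspellings occur in a page's text.
MISSPELLINGS = {"recieve", "occured", "accomodate"}   # stand-in for the full error list

def lexical_error_rate(text):
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    errors = sum(1 for w in words if w in MISSPELLINGS)
    return errors / len(words)
```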
| Accessibility Testing of a Healthy Lifestyles Social Network | | BIBAK | Full-Text | 409-416 | |
| Cecília Sík Lányi; Eszter Nagy; Gergely Sik | |||
| The ongoing development of the Internet and its growing use make it
necessary to satisfy the needs of all users, including those with disabilities
who face accessibility problems. A healthy lifestyle is increasingly important
to people, and the number of webpages dealing with healthy lifestyles is
growing. The "Webstar" healthy lifestyle social network was tested with the
Wave Toolbar, HTML Validator, Web Developer Toolbar and WCAG Contrast Checker. Keywords: social network; validator; WCAG 2.0. | |||
| Following the WCAG 2.0 Techniques: Experiences from Designing a WCAG 2.0 Checking Tool | | BIBA | Full-Text | 417-424 | |
| Annika Nietzio; Mandana Eibegger; Morten Goodwin; Mikael Snaprud | |||
| This paper presents a conceptual analysis of how the Web Content
Accessibility Guidelines (WCAG) 2.0 and its accompanying documents can be used
as a basis for the implementation of an automatic checking tool and the
definition of a web accessibility metric. There are two major issues that need
to be resolved to derive valid and reliable conclusions from the output of
individual tests. First, the relationship of Sufficient Techniques and Common
Failures has to be taken into account. Second, the logical combination of the
techniques related to a Success Criterion must be represented in the results.
The eGovMon project has a lot of experience in specifying and implementing tools for automatic checking of web accessibility. The project is based on the belief that web accessibility evaluation is not an end in itself. Its purpose is to promote web accessibility and initiate improvements. | |||
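The two issues raised above (Common Failures versus Sufficient Techniques, and their logical combination per Success Criterion) can be illustrated with a small, hypothetical decision rule; the predicate structure below is an assumption, not the eGovMon implementation:

```python
# A Success Criterion is judged failed if any applicable Common Failure is present,
# passed if at least one Sufficient Technique is fully satisfied, and otherwise
# left undecided for manual review.
def evaluate_success_criterion(common_failures, sufficient_techniques):
    """common_failures: list of booleans (True = failure detected).
    sufficient_techniques: list of lists of booleans; the tests within one
    technique are AND-combined, alternative techniques are OR-combined."""
    if any(common_failures):
        return "fail"
    if any(all(tests) for tests in sufficient_techniques if tests):
        return "pass"
    return "cannot tell"
```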
| Entertainment Software Accessibility: Introduction to the Special Thematic Session | | BIBA | Full-Text | 425-427 | |
| Dominique Archambault; Roland Ossmann | |||
| The kids of the first generation who grew up with computer games are now in their forties, and younger people have been surrounded by more and more devices that allow them to play such games. The descendants of our old game stations, which displayed two bars on a black-and-white TV set to play tennis, are now very close to very powerful computers. Games have also appeared on websites and mobile phones, while portable game stations offer some amazing visual features. The budgets of some major games have reached the level of motion pictures, and a huge number of small games are developed every year. Computer games are now at the heart of youth culture. At the same time, a growing part of the population in other age groups is also playing computer games. Indeed, many software applications implementing the games these older groups want to play have been designed and have become simpler and simpler to use, while people in these groups have become familiar with computers at work. It is therefore not rare to see retired people playing Scrabble or card games online. | |||
| Assessment of Universal Design Principles for Analyzing Computer Games' Accessibility | | BIBAK | Full-Text | 428-435 | |
| Moyen Mohammad Mustaquim | |||
| Universal design is a significant topic of interest in accessibility
research. However, to date there has been no real verification of these
principles with respect to accessibility issues in computer games. In this
paper the existing universal design principles were examined to assess
accessibility in computer games. Quantitative analysis of the collected data
showed that some design principles are not really optimal for assessing
computer games' accessibility, while other design principles were overlooked.
The findings from this study advance the argument for revising the existing
universal design principles and open up the possibility of developing
accessible game design principles. Keywords: Accessibility in Games; Universal Design; Design Principles for Accessible
Games Design; Inclusive Games Design | |||
| One Way of Bringing Final Year Computer Science Student World to the World of Children with Cerebral Palsy: A Case Study | | BIBAK | Full-Text | 436-442 | |
| Isabel M. Gómez; Rafael Cabrera; Juan Ojeda; Pablo García; Alberto J. Molina; et al | |||
| In this paper, a learning project is explained which is being carried out at
the school of computer science at the University of Seville. The aim is for
students to gain knowledge of assistive technologies, even though this
discipline does not appear in our curricula. The best way to do so is to set
final-year projects in this field. We want the projects to have a real
application and to solve difficulties that children with Cerebral Palsy face in
their daily activities at school. Keywords: serious games; training in assistive technologies; access device | |||
| Making the PlayStation 3 Accessible with AsTeRICS | | BIBAK | Full-Text | 443-450 | |
| Roland Ossmann; David Thaller; Gerhard Nussbaum; Christoph Veigl; Christoph Weiß | |||
| People with mobility disabilities can hardly play any mainstream computer
and video games. For most of them, specially developed games are the only
chance to play at all, so playing together with friends or family is possible
only in a very limited way.
Within the AsTeRICS project, a flexible and affordable construction set for the implementation of user-driven assistive technology solutions is being developed. It allows different sensors to be combined, and the sensor data to be processed and manipulated, in order to control any supported device. This paper shows how a Sony PlayStation 3 can become the supported device, and how the requirements of a mainstream game can be tailored to the possibilities of a disabled person. Furthermore, possible limitations of this solution are discussed. Keywords: Assistive Technology; Games Accessibility; Alternative Game Control | |||
| Creating an Entertaining and Informative Music Visualization | | BIBA | Full-Text | 451-458 | |
| Michael Pouris; Deborah I. Fels | |||
| Auditory music is a universal art form that has spanned millennia. Music provides an insight into the collective culture of a society and acts as a vehicle to transmit shared knowledge that is common to all members of society. People who are deaf, deafened or hard of hearing tend to have a limited access to music and as a result can be excluded from this shared knowledge and cultural experience. A music visualization system, MusicViz, was developed based on a model of audio-visual sensory substitution. An evaluation of six different music genres showed that the visualizations were enjoyable and able to convey some information and emotions to the participants. | |||
| Music at Your Fingertips: Stimulating Braille Reading by Association with Sound | | BIBAK | Full-Text | 459-462 | |
| Felix Grützmacher | |||
| Driven by the ongoing integration of computers into the daily lives of blind
people, the reading experience has been undergoing a significant shift from
Braille to synthetic speech. While it is true that speech involves less effort
on the part of the reader, the downside is that it creates the illusion of
completeness of information while in truth many important elements of layout,
punctuation, and spelling are lost. The presentation introduces an application
of Active Tactile Control which revolves around the medium of music and is
designed in such a way that students can only succeed if they mentally
translate auditive impressions into Braille characters. Keywords: MusikBraille; Learning; Didactic; Active Tactile Control; Braille music
notation; software; synaesthesia; auditive feedback; editor | |||
| Improving Game Accessibility with Vibrotactile-Enhanced Hearing Instruments | | BIBA | Full-Text | 463-470 | |
| Bernd Tessendorf; Peter Derleth; Manuela Feilner; Daniel Roggen; Thomas Stiefmeier; et al | |||
| In this work we present enhanced hearing instruments (HIs) that provide vibrotactile feedback behind the user's ears in parallel to sound. Using an additional feedback modality we display dedicated vibrotactile patterns to support the user in localizing sound sources. In a study with 4 HI users and 5 normal hearing participants we deploy the system in a gaming scenario. The open source availability of the mainstream 3D first person shooter game used in the study allowed us to add code for accessibility. We evaluate the system qualitatively with user questionnaires and quantitatively with performance metrics calculated from statistics within the game. The system was perceived as beneficial and allowed the HI users to achieve gaming performance closer to that of normal hearing participants. | |||
| An OCR-Enabled Digital Comic Books Viewer | | BIBAK | Full-Text | 471-478 | |
| Christophe Ponsard; Ravi Ramdoyal; Daniel Dziamski | |||
| The generalisation of user-friendly and mobile interfaces like smart phones,
eBook readers and tablets has accelerated the transition of comic books to the
digital format. Although such user interfaces are not always fit for use by
people with special needs, the underlying platform offers a large number of
innovative services which open up a wide spectrum of new possibilities for
enhancing accessibility.
This paper explores how these new technologies can improve digital access to comic books. Our main contribution is the inclusion of optical character recognition within the text bubbles associated with comic characters. The recognised text can then be fed into a text-to-speech engine for an improved experience. We also detail performance improvements in other functionalities, such as panel order detection and the handling of special backgrounds. Finally, we discuss how these application-specific adaptations can be applied to other contexts and what kind of future deployment can be anticipated. Keywords: comics; accessibility; motor-impaired; low-sighted; mobile users; image
processing; cloud; OCR; text-to-speech | |||
| Spe-Ler: Serious Gaming for Youngsters with Intellectual Disabilities | | BIBA | Full-Text | 479-483 | |
| Joan De Boeck; Jo Daems; Jan Dekelver | |||
| When working with youngsters with intellectual disabilities, it is often a challenge to teach them 'boring' content (e.g. the 'rules of daily living' in their school or care-center). In this paper we propose a serious gaming approach to facilitate the learning process. The novelty in our concept is that we decouple the game and the didactical content, which allows us to transfer the learning to the youngsters' leisure time. In our research, we built a framework containing several (fun) games and an administration environment that facilitates the creation of learning content. In a user experiment measuring the users' joy and motivation, we found that the subjects enjoyed playing the games and were very attentive when the didactical content appeared. | |||
| An Accessibility Checker for LibreOffice and OpenOffice.org Writer | | BIBAK | Full-Text | 484-491 | |
| Christophe Strobbe; Bert Frees; Jan Engelen | |||
| OpenOffice.org Writer and LibreOffice Writer both implement the OpenDocument
Format (ODF) and support output formats such as PDF and XHTML. Through the
extensions odt2daisy and odt2braille (developed in the context of the AEGIS
project) Writer can also export to DAISY (audio books) and Braille. In order to
output usable DAISY or Braille, authors first need to create an accessible
source document. The objective of AccessODF, the accessibility checker
developed in the context of the European AEGIS project, is to support authors
in creating accessible ODF documents and to prepare these documents for
conversion to DAISY and/or Braille. The paper discusses the user interface
options that were explored, describes how authors can repair errors and
warnings, gives examples of automatic and semi-automatic repairs supported by
the checker, and describes which errors and warnings are implemented. Keywords: Accessibility; accessibility evaluation; office documents; OpenOffice.org;
LibreOffice; Evaluation and Report Language (EARL) | |||
| Visualization of Non-verbal Expressions in Voice for Hearing Impaired | | BIBAK | Full-Text | 492-499 | |
| Hidetaka Nambo; Shuichi Seto; Hiroshi Arai; Kimikazu Sugimori; Yuko Shimomura; et al | |||
| Generally, a hearing impaired person is supported by staff who take notes
during a lecture. However, lecture notes cannot express the tone of the
teacher's voice. Further, non-verbal information such as chatting voices in the
classroom and the speed, loudness and tone of the speaker's voice is also
difficult to express. As a result, it is difficult for a hearing impaired
person to feel the atmosphere in the classroom. In this study, we develop a
system to convey the classroom atmosphere to a hearing impaired person. The
system utilizes expression techniques used in Japanese cartoons: "Ambient
Font", "Balloon & Symbols" and "Onomatopoeic Word". These techniques enable us
to convey to the hearing impaired person not only the textual information but
also the non-verbal information. Keywords: Hearing impaired; non-verbal expressions; Onomatopoeia | |||
| XML-Based Formats and Tools to Produce Braille Documents | | BIBAK | Full-Text | 500-506 | |
| Alex Bernier; Dominique Burger | |||
| The production of high quality Braille documents is time consuming because
it often involves a lot of manual work on the text. To increase the overall
number of Braille documents available to end-users, special efforts have to be
made to automate the production processes as much as possible. At the same
time, document quality should not decrease, because Braille is often used in
learning situations where errors are harmful for the users. This paper presents
recent advances and current developments in the field of Braille production. In
particular, XML-based formats useful for creating complex Braille documents are
introduced. Next, some tools operating on these formats are described, and
finally, we underline the need for and the possibility of creating fully
integrated production workflows based on these tools and formats. Keywords: accessible publishing; Braille; DAISY; ebooks; EPUB; PEF; print-disabled
persons; scientific documents; workflow; XML | |||
| Japanese Text Presentation System for Pupils with Reading Difficulties | | BIBAK | Full-Text | 507-514 | |
| Shinjiro Murayama; Kyota Aoki | |||
| There are many pupils with reading difficulties in Japanese schools.
Dyslexia is a disability affecting the reading and writing of texts. Japanese
sentences use Kanji, Hiragana and Katakana characters. We propose a Japanese
text presentation system that eases the difficulties of reading Japanese texts,
with or without dyslexia. Kanji are ideographs; Hiragana and Katakana are
phonograms. Reading difficulties are of two types: difficulty reading Kanji,
and difficulty tracing the reading sequence. This paper proposes a system that
presents Japanese sentences with a presentation method suited to each pupil
with reading difficulties. The main function of the proposed system is three
levels of highlighting/masking that are controlled independently. Highlighting
alone is not enough to prevent errors in the reading sequence of character
chunks; the three-level highlighting/masking makes it possible to adapt the
presentation to a wide variety of reading difficulties. This paper presents the
design of the Japanese text presentation system and experiments on students
without reading difficulty. Keywords: Reading difficulty; Text presentation; Highlighting/Masking; Dyslexia | |||
| Development of a DAISY Player That Utilizes a Braille Display for Document Structure Presentation and Navigation | | BIBAK | Full-Text | 515-522 | |
| Kazunori Minatani | |||
| From the perspective of assistive technology, the hierarchical document tree
structure is particularly relevant for representing a document's logical
structure. This research proposes a way of realizing the advantages of
exploiting the logical structure of documents by developing a method of
presenting a document's tree structure information on a braille display. The
document browser software developed for this research operates as a DAISY
player. Experimentation found that the user interface of this document browser
improves the efficiency of understanding the document's general structure and
of finding headings when compared to the user interface of a conventional DAISY
player with numeric keypad cursor navigation. The proposed user interface can
be used not only for DAISY content but also for general-purpose
applications. Keywords: Blind person; Multi Modal Interface; Document Structure Presentation and
Navigation; Braille Display; DAISY Player | |||
| Acce-Play: Accessibility in Cinemas | | BIBAK | Full-Text | 523-526 | |
| Alexandre Paz; Mari Luz Guenaga; Andoni Eguíluz | |||
| In this paper we present Acce-Play: a system that aims to provide accessible
content throughout the whole life cycle of a film. It is designed to be
platform independent and currently allows accessible content to be played in
cinemas. The content is synchronized with the film being shown by applying
audio fingerprinting techniques to
the projector audio stream. Keywords: Accessibility; Cinema; Audio Fingerprinting; Audiovisual Accessibility | |||
| Automatic Simplification of Spanish Text for e-Accessibility | | BIBAK | Full-Text | 527-534 | |
| Stefan Bott; Horacio Saggion | |||
| In this paper we present an automatic text simplification system for Spanish
which intends to make texts more accessible for users with cognitive
disabilities. The system aims at reducing the structural complexity of Spanish
sentences by converting complex sentences into two or more simple sentences,
and thereby reduces reading difficulty. Keywords: Automatic Text Simplification; Natural Language Processing; e-Accessibility | |||
| Can Computer Representations of Music Enhance Enjoyment for Individuals Who Are Hard of Hearing? | | BIBAK | Full-Text | 535-542 | |
| David Fourney | |||
| Music is an art form present in all cultures and a shared experience. People
who are Deaf, Deafened, or Hard of Hearing (D/HOH) do not have full access to
the music of the larger hearing cultures in which they live. As a consequence,
access to this shared experience and the cultural knowledge it contains is
lost. As a result of an increasingly aging global population, the number of
D/HOH people is growing, creating a consumer need for improved access to music
information. Challenging the notion that music is only something that can be
heard, this paper reviews the state of the art for supporting D/HOH music
consumers and describes a study conducted with HOH music consumers to determine
how best to support their needs. Results show that HOH people have several
difficulties accessing music. Keywords: Music; Deaf; Hard of Hearing; visualisation | |||
| Assistive Photography | | BIBAK | Full-Text | 543-549 | |
| Ludek Bártek; Ondrek Lapácek | |||
| Many people take photographs of places they have visited, and when they
browse the collections afterwards they often cannot remember the names of the
buildings in the pictures. There are also people with visual impairments who
are interested in photography [1].
This paper deals with algorithms and methods that can allow people with visual impairments to take photographs. They make it possible to automatically add a semantic description of the buildings in a photograph and to browse the collection of photographs taken this way, even by visually impaired users, using the semantic description. Keywords: visual impairment; photography; geolocation; semantic description | |||
| The LIA Project -- Libri Italiani Accessibili | | BIBAK | Full-Text | 550-553 | |
| Cristina Mussinelli | |||
| The LIA Project -- Libri Italiani Accessibili is a biennial project started
in 2011. It aims at providing a service to increase the availability on the
market of digital publications accessible to blind and visually impaired
people, in full respect of the rights of authors and publishers. Keywords: Digital publications; e-book; accessible; accessibility; EPUB; mainstream;
blind; visually impaired | |||
| Inclusion by Accessible Social Media | | BIBA | Full-Text | 554-556 | |
| Harald Holone | |||
| Social media hold great promise for facilitating inclusion and participation for all. With this Special Topic Session, we wanted to address two perspectives on social media and inclusion: accessibility of social media on various device configurations, and inclusion through the use of and engagement in social media. The papers in this STS fall into two broad categories. Three of the four papers mostly look at the accessibility of social media, with design guidelines, methodological considerations or surveys as their central contributions. The fourth paper looks more closely at a case where computers and multimedia are used in rehabilitation studies. This introduction provides a short introduction to social media and technology development, the scope of the STS, and a summary of the included papers. | |||
| The Use of Multimedia to Rehabilitate Students and Release Talents | | BIBAK | Full-Text | 557-564 | |
| Luciana Maria Depieri Branco Freire | |||
| The use of new information and communication technologies can improve
learning through dynamic, creative strategies. New knowledge is obtained by
exercising the mind, i.e., by using both cerebral hemispheres through
neuroplasticity in a dynamic, intense and active way. It is necessary to show
that the computer can be used as a means of exercising the mind through
different activities, and that it can also be used in pedagogical practices
with the purpose of making learning easier. It offers students, with or without
special needs, ways to develop their capacities and potential that are
alternative to those offered by school. The computer can be used for several
complex activities that allow the development of many abilities which help in
solving problems and make students learn more from their mistakes. These
activities help students develop self-confidence, improve their creative
actions and become independent. Keywords: Education; Inclusive Education; Cerebral Exercise; Multimedia | |||
| Use of Social Media by People with Visual Impairments: Usage Levels, Attitudes and Barriers | | BIBAK | Full-Text | 565-572 | |
| Kristin Skeide Fuglerud; Ingvar Tjøstheim; Birkir Rúnar Gunnarsson; Morten Tollefsen | |||
| Social media are a central arena for participation in social life,
politics, business and working life. This paper aims to document the social
media use among people with visual impairments (VI) in Norway, and to explore
some barriers and motivational factors to the use of social media for this
group. We present results from two surveys about social media usage among
people with VI. One telephone survey was conducted among 150 members of the
Norwegian Association of the Blind and Partially Sighted (NABP). This survey
contained questions about social media usage. The results from this
quantitative survey are discussed in light of results from a web survey with
more open-ended questions. The web survey was about how disabled people in
Norway use social media, and what accessibility and usability challenges they
experience. Through the web survey, informants bring to the surface some
important accessibility issues and add nuances to the overall picture. While
the telephone survey shows that a high percentage of people with VI participate
in social media, the web-based survey indicates that they face a variety of problems
and typically use the core functionality only. Together, these two surveys give
a broad picture of social media usage among people with visual impairments in
Norway. Keywords: universal design; accessibility; visually impaired; social media; social
networking sites; assistive technology; security barriers; Captcha; surveys | |||
| User Testing of Social Media -- Methodological Considerations | | BIBAK | Full-Text | 573-580 | |
| Oystein Dale; Therese Drivenes; Morten Tollefsen; Arthur Reinertsen | |||
| The use of social media has in recent years increased dramatically. It is
imperative that social media are accessible to all. To ensure this, it is
important to conduct user testing as part of an accessibility and usability
assessment of social media services. This paper focuses on the methodology
applied in such undertakings, and its purpose is to draw attention to important
aspects that should guide user testing and user studies of social media
services. This is done by sharing the experiences gained in the project Net
Citizen. The main target groups for the paper are those planning the
implementation of social media services and those who conduct accessibility and
usability user testing. Key findings are that cumulative usability issues can
be likened to poor accessibility, and that web services that are
accessible in a strict technical sense may not necessarily be perceived as
accessible by real users. Keywords: Social media; accessibility; usability; user testing; methodology | |||
| Designing User Interfaces for Social Media Driven Digital Preservation and Information Retrieval | | BIBAK | Full-Text | 581-584 | |
| Dimitris Spiliotopoulos; Efstratios Tzoannos; Pepi Stavropoulou; et al | |||
| Social media provide a vast amount of information identifying the stories,
events and entities that shape a community through heavy everyday user
involvement. This work studies social media information in terms of type
(multimodal: text, video, sound, picture) and role players (agents, users,
opinion leaders), and the potential of designing accessible, usable interfaces
that integrate that information. This case examines the design of a user
interface that uses an underlying engine for modality component (plain text,
sound, image, video) analysis, social media crawling, contextual search fusion
and semantic analysis. The interface is the only point of user interaction with
the world of knowledge. This work reports on the usability and accessibility
methods and concerns for the user requirements phase and for design control and
testing. The findings of the pilot user testing and evaluation provide
indications of how the semantic analysis of social media information can be
integrated into design methodologies for user interfaces, maximizing user
experience in terms of social information involvement. Keywords: social media; user interface design; user enablement | |||
| PDF/UA -- A New Era for Document Accessibility: Understanding, Managing and Implementing the ISO Standard PDF/UA (Universal Accessibility): Introduction to the Special Thematic Session | | BIBAK | Full-Text | 585-586 | |
| Olaf Drümmer; Markus Erle | |||
| Short introduction to the Special Thematic Session about the new ISO
standard for PDF accessibility and how PDF/UA changes the game for document
software developers, assistive technology vendors, decision-makers,
organizations in the public and private sector, accessibility experts,
publishers, authors and last but not least the end-users. Keywords: PDF; WCAG 2.0; PDF/UA; document accessibility; ISO standard; PDF/UA
Competence Center | |||
| PDF/UA (ISO 14289-1) -- Applying WCAG 2.0 Principles to the World of PDF Documents | | BIBAK | Full-Text | 587-594 | |
| Olaf Drümmer | |||
| PDF/UA-1 is an upcoming ISO standard defining accessible PDF. It claims to
apply principles established by W3C's WCAG 2.0 to the world of PDF documents.
This paper discusses a mapping table between WCAG 2.0 Success Criteria and
clauses in the PDF/UA-1 standard to point out how and why PDF/UA-1 can indeed
be described as an application of WCAG 2.0 principles to PDF. It is to be
expected that this will, as a consequence, speed up adoption of PDF/UA-1 in the
field of accessible electronic content. Keywords: accessible PDF; tagged PDF; PDF/UA; ISO 14289-1; WCAG 2.0 | |||
| Mainstreaming the Creation of Accessible PDF Documents by a Rule-Based Transformation from Word to PDF | | BIBAK | Full-Text | 595-601 | |
| Roberto Bianchetti; Markus Erle; Samuel Hofer | |||
| axesPDF for Word is an add-in for Microsoft Word 2007 and Word 2010 that
allows the creation of high quality accessible PDF documents according to
guidelines like WCAG 2.0 and standards like PDF/UA. It is characterized by a
specific role model, a rule-based transformation instead of a static
conversion, and the possibility of n:m mapping. Even complex documents with
elements like footnotes, side notes, captions, references, indices and
glossaries can be made accessible without post-processing. Keywords: PDF; Microsoft Word; WCAG 2.0; PDF/UA; document accessibility | |||
| Developing Text Customisation Functionality Requirements of PDF Reader and Other User Agents | | BIBAK | Full-Text | 602-609 | |
| Shawn Lawton Henry | |||
| This paper addresses the text customisation needs of people with low vision,
dyslexia, and related conditions that impact reading, including people with
declining eyesight due to ageing. It reports on a literature review and an
initial study that explores the aspects of text that users customize (e.g.,
size, colour, leading, linearization/reflow, and more) for reading RTF and PDF
documents, in operating system settings, and in web browser settings. It
presents the gap between users' needs and PDF user agent (primarily Adobe
Reader) functionality. The existing literature and this exploratory study
indicate that with the technology currently available, PDF is not sufficiently
accessible to many people with low vision, dyslexia, and related conditions
that impact reading. This paper aims to encourage additional text customisation
functionality in Adobe Reader; and to encourage more rigorous studies to
understand, document, and communicate how to better meet users' text
customisation needs through mainstream user agents. Keywords: low vision; dyslexia; readability; adaptability; PDF; Adobe Reader; text
customisation; accessibility guidelines; accessibility standards; user agents | |||
| Using Layout Applications for Creation of Accessible PDF: Technical and Mental Obstacles When Creating PDF/UA from Adobe Indesign CS 5.5 | | BIBAK | Full-Text | 610-616 | |
| Olaf Drümmer | |||
| While substantial progress has been made in widely used applications like
Microsoft Word or Adobe Indesign, when it comes to creating accessible PDF
documents, a number of problems still exist that make it difficult even for
motivated users in a real world production situation to invest additional
effort to create decently tagged PDF. Improved features and enhanced user
interfaces in these applications could contribute substantially to increasing
the likelihood that creators of print-oriented PDF files take on the extra work
to also make these PDF files accessible. Keywords: tagged PDF; accessible PDF; accessibility; PDF/UA | |||
| Validity and Semantics -- Two Essential Parts of a Backbone for an Automated PDF/UA Compliance Check for PDF Documents | | BIBAK | Full-Text | 617-620 | |
| Markus Erle; Samuel Hofer | |||
| The paper shows why validity and semantics matter for a PDF/UA evaluation
concept and how an automated checking tool can address this. In order to
translate machine-testable requirements into checking criteria, a special query
language called PQL (PDF Query Language) is developed. PQL will be implemented
in PDF Accessibility Checker PAC 2, the first free PDF/UA compliance
checker, crowd-funded by the foundation "Access for all". Keywords: PDF; WCAG 2.0; PDF/UA; document accessibility; validity; semantics;
checking; PAC2; PDF accessibility checker; foundation "Access for all" | |||
| Two Software Plugins for the Creation of Fully Accessible PDF Documents Based on a Flexible Software Architecture | | BIBAK | Full-Text | 621-624 | |
| Alireza Darvishy; Thomas Leemann; Hans-Peter Hutter | |||
| This paper presents one of two new software plugins for MS PowerPoint and
Word documents which allow the analysis of accessibility issues and
consequently the generation of fully accessible PDF documents. The document
authors using these plugins require no specific accessibility knowledge. This
paper introduces the user interface of the Microsoft PowerPoint accessibility
plugin. The plugins are based on a flexible software architecture concept that
allows the automatic generation of fully accessible PDF documents originating
from various authoring tools, such as Adobe InDesign [1], Word and PowerPoint
[2], [3]. The accessibility plugin software implemented allows authors to check
for accessibility issues while creating their documents and add the additional
semantic information needed to generate a fully accessible PDF document. Keywords: Document accessibility; automatic generation of accessible PDF; screen
reader; visual impairment; accessibility; tagged PDF; software architecture;
PowerPoint and Word documents | |||
| Privacy Preserving Automatic Fall Detection for Elderly Using RGBD Cameras | | BIBAK | Full-Text | 625-633 | |
| Chenyang Zhang; Yingli Tian; Elizabeth Capezuti | |||
| In this paper, we propose a new privacy preserving automatic fall detection
method, based on RGBD cameras, to facilitate the independence of older adults
living in the community, reduce risks, and enhance the quality of life in
activities of daily living (ADLs) at home. Our method can recognize 5
activities: standing, falling from standing, falling from a chair, sitting on a
chair, and sitting on the floor. The main analysis is based on the 3D depth
information because of its advantages in handling illumination changes and
protecting identity. If the monitored person is out of the range of the 3D
camera, RGB video is employed to continue the activity monitoring. Furthermore,
we design a hierarchical classification scheme to robustly recognize the 5
activities. Experimental results on our database, collected under conditions
with normal lighting, without lighting, and out of depth range, demonstrate the
effectiveness of the proposed method. Keywords: Privacy Preserving; Fall Detection; Video Monitoring; Elderly; Activities of
Daily Living | |||
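The hierarchical classification scheme is not specified in the abstract; as a loose illustration only, a two-stage rule over depth-derived cues (all feature names and thresholds below are invented placeholders, not the authors' classifier) could be structured like this:

```python
# Stage 1 separates upright from low postures; stage 2 distinguishes, within the
# low postures, how the height was lost and where the body comes to rest.
def classify_activity(head_height_m, drop_speed_m_s, on_floor_frames):
    if head_height_m > 1.2:
        return "standing"
    if drop_speed_m_s > 1.0 and on_floor_frames > 15:
        return "fall from standing"
    if drop_speed_m_s > 0.5 and on_floor_frames > 15:
        return "fall from chair"
    if head_height_m > 0.4:
        return "sit on chair"
    return "sit on floor"
```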
| The Proof of Concept of a Shadow Robotic System for Independent Living at Home | | BIBAK | Full-Text | 634-641 | |
| Lucia Pigini; David Facal; Alvaro Garcia; Michael Burmester; Renzo Andrich | |||
| In the framework of the EU-funded SRS (Multi-Role Shadow Robotic System for
Independent Living) project, an innovative semi-autonomous service robot is
under development with the aim of supporting frail elderly people in their
homes. This paper reports on the user validation of the SRS concept involving
63 potential users of the system from Italy, Germany and Spain: in particular,
frail elderly people, their relatives and 24-hour telecare professionals.
Results confirmed that monitoring and managing emergency situations, as well as
helping with reaching, fetching and carrying objects that are too heavy or
positioned in unreachable places, are the tasks for which a robot is best
accepted to address users' needs. To support the scenario executions and
operation modes, the interaction concept should provide three different
interaction devices and modalities for each user group. Keywords: Service robots; tele-operation; elderly people; remote operator; user
requirements; user centered design | |||
| Task Complexity and User Model Attributes | | BIBA | Full-Text | 642-649 | |
| Thomas Grill; Sebastian Osswald; Manfred Tscheligi | |||
| Modeling users in order to design appropriate interfaces and interactions, or to simulate a specific user behavior, is an ambitious task. When using user model attributes to design an interface and its interactions, we focus on tasks at different levels of complexity. In our work we address the appropriateness of physical, cognitive, behavioral, and psychological attributes and their relevance for designing and describing tasks at such levels of complexity. We conducted a study that uses tasks of varying complexity, which we relate to attributes according to the categorization previously described. A driving simulator, together with a prototype of in-car controls that allows primitive as well as complex tasks to be performed during a driving scenario, provided the study context and the user interface for the participants, who took part in three different scenarios and performed selected tasks identified for the automotive area. Additional workload tasks were used to induce stress and to investigate the effect of cognitive, behavioral, and psychological attributes. First results show that the physical parameters mainly address primitive tasks. For cognitive, behavioral and psychological parameters, tasks need to be addressed at a more complex level, which was supported by the results of the study. In conclusion, relating primitive tasks to cognitive, behavioral, and psychological attributes is not viable. | |||
| AALuis, a User Interface Layer That Brings Device Independence to Users of AAL Systems | | BIBAK | Full-Text | 650-657 | |
| Christopher Mayer; Martin Morandell; Matthias Gira; Kai Hackbarth; Martin Petzold; et al | |||
| Many ICT services from which older people could benefit lack accessibility,
adaptability and usability of the user interface with respect to the special
needs arising in this target group. AALuis intends to develop an open User
Interface Layer that facilitates dynamically adapted, personalized interaction
between an elderly user and any kind of service, with different types of input
and output devices and modalities. To achieve this, the AALuis User Interface
Layer keeps track of changes in a variety of information models in order to
adapt the transformation process from abstract task descriptions to a user
interface and to steer the user interaction in a suitable manner. One of the
main goals of AALuis is to create and exploit synergies by developing an
architecture that allows easy integration into different established AAL
middleware platforms. AALuis aims to contribute significantly to freedom of
choice for end-users of services and user interfaces. Keywords: AAL; Middleware; User Interaction; User Interfaces | |||
| Comparison between Single-touch and Multi-touch Interaction for Older People | | BIBAK | Full-Text | 658-665 | |
| Guillaume Lepicard; Nadine Vigouroux | |||
| This paper describes a study exploring multi-touch interaction for older
adults. The aim of this experiment was to check the relevance of this
interaction, versus single-touch interaction, for performing object
manipulation tasks: move, rotate and zoom. For each task, the user had to
manipulate a rectangle and superimpose it on a picture frame. Our study shows
that adults, and principally older adults, had more difficulty performing these
tasks with multi-touch interaction than with single-touch interaction. Keywords: interaction; multi-touch; older people; usability | |||
| Online Social Networks and Older People | | BIBAK | Full-Text | 666-672 | |
| Guillermo Prieto; Denise Leahy | |||
| The number of older people is growing significantly and accounts for an
ever-increasing percentage of the global population [1]. Online social networks
are continuously gaining more relevance and presence in everyday life for
communication, work and social interaction. Despite those trends, there is
little knowledge of how older people use online social networks, the
benefits they derive from them, or the possible negative impacts [2], [3]. This paper
examines how older people use online social networks and the factors which
influence this use. Keywords: online social networks; older people; design; accessibility; digital divide;
adoption | |||
| "Break the Bricks" Serious Game for Stroke Patients | | BIBAK | Full-Text | 673-680 | |
| Tamás Dömok; Veronika Szucs; Erika László; Cecília Sík Lányi | |||
| This study introduces a serious game, "Break the Bricks", which is one of
the games planned within the "StrokeBack" project. The aim of this game is to
support the rehabilitation of stroke patients who have upper limb impairments
and damaged psychomotor abilities. In this paper we present the design process
and the development of the game. We also present the background of serious
games and the planned test methods for "Break the Bricks", and we outline
future plans and further work on this game. Keywords: serious game; rehabilitation; stroke patients; locomotor disorder | |||
| Development of a Broadcast Sound Receiver for Elderly Persons | | BIBAK | Full-Text | 681-688 | |
| Tomoyasu Komori; Atsushi Imai; Nobumasa Seiyama; Reiko Takou; Tohru Takagi; et al | |||
| With the aim of making speech easier to listen to on a TV receiver, a novel
method for background-sound suppression was proposed, and the results of
evaluation tests using broadcast-program sound showed that a prototype device
was able to adjust background sound to a level suitable for elderly people. Our
proposed method suppresses the magnitude of sound components with low
correlation between the 2-channel stereo signals and performs gain control only
on the speechless intervals. Preparatory evaluation tests confirm that it is
possible to suitably reduce program background volume with the proposed method.
On the basis of this result, a device for suppressing background sound by
decoding the transport stream (TS) of a broadcast program was prototyped. The
results of evaluation tests using this device demonstrate that the magnitude of
background sound can be adjusted to a level suitable for elderly people. Keywords: elderly people; phoneme recognition; loudness; stereo correlation;
subjective evaluation; background-sound suppression | |||
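A simplified sketch of the correlation-based suppression idea described above (frame length, threshold and attenuation are illustrative values; the actual device also restricts gain control to speechless intervals, which is omitted here):

```python
import numpy as np

# Estimate the inter-channel correlation per frame and attenuate frames with low
# correlation, which are assumed to be background rather than centre-panned speech.
def suppress_background(left, right, frame=1024, threshold=0.6, attenuation=0.5):
    out_l = left.astype(float)
    out_r = right.astype(float)
    for start in range(0, len(left) - frame, frame):
        l = out_l[start:start + frame]
        r = out_r[start:start + frame]
        denom = np.sqrt(np.sum(l * l) * np.sum(r * r)) + 1e-9
        corr = np.sum(l * r) / denom
        if corr < threshold:                      # low correlation -> treat as background
            out_l[start:start + frame] *= attenuation
            out_r[start:start + frame] *= attenuation
    return out_l, out_r
```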
| Complexity versus Page Hierarchy of a GUI for Elderly Homecare Applications | | BIBAK | Full-Text | 689-696 | |
| Mustafa Torun; Tim van Kasteren; Ozlem Durmaz Incel; Cem Ersoy | |||
| Using computerized devices comes quite naturally to many users thanks to the
various graphical user interfaces. However, the acceptability of graphical user
interfaces by the elderly, a rapidly growing group of computer users, is a
challenging issue due to the different levels of impairment they experience. In the
literature, providing simplicity is the main focus of the studies that try to
address this challenge. In this paper, we study the acceptance of graphical
user interfaces for elderly people with different impairments in the context of
in-home healthcare systems. We focus on the relation between two main design
parameters of a graphical user interface: page complexity, which is the number
of interface elements on each page and the page hierarchy, which is the number
of the pages to be traced in order to complete a task. For this purpose, we
designed two versions of an interface: one version has a high page complexity
and the other version is designed to have a high page hierarchy. We asked 18
experiment-subjects, aged between 65 and 95, to complete three tasks, using
both versions. Experiment results are evaluated using both objective and
subjective metrics. Results show that the flat version is more
acceptable to the elderly. Keywords: Graphical user interface; elderly; acceptance; complexity; hierarchy | |||
| Benefits and Hurdles for Older Adults in Intergenerational Online Interactions | | BIBAK | Full-Text | 697-704 | |
| Verena Fuchsberger; Wolfgang Sellner; Christiane Moser; Manfred Tscheligi | |||
| In order to foster the relationship between geographically distant
grandparents and grandchildren, a prototype of an online platform is developed
in an Ambient Assisted Living project. After identifying relevant attributes in
the requirements analysis together with older adults and experts for children,
we conducted two rounds of user studies in a laboratory setting with older
adults. In the studies we were not only interested in the usability of the
platform and the older participants' computer skills, but especially in the
experiences the older users have when interacting with and via the platform. As
expected, we found a relation between self-rated computer skills and the
usability problems. However, the skills were not decisive for experiencing the
interaction regarding curiosity, engagement, social connectedness and social
presence. Finally, implications for the design of socially connecting online
platforms are presented. Keywords: Older adults; User-Centered Design; Usability; User Experience | |||
| kommTUi: Designing Communication for Elderly | | BIBAK | Full-Text | 705-708 | |
| Wolfgang Spreicer; Lisa Ehrenstrasser; Hilda TellioÄYlu | |||
| Getting older should not mean being excluded from digital worlds. Elderly
people should be able to use current technology to communicate with their
friends and family members without toil; on the contrary, with joy and ease. We
know this is not yet the case. With our research project kommTUi we are doing
our part to get closer to this goal. In this paper we present our achievements
so far. One of the outcomes is our approach to better designing usable and
user-sensitive interaction for the elderly. We further show how four design
workshops, carried out over two years, and the tangible user interfaces we have
developed so far can generate and support playful environments with elderly
people. We finish the paper with a presentation of the final model of the new
devices we are currently developing in the project. Keywords: User centered design; technology for elderly; participatory design
workshops; tangible user interface; interaction design | |||
| Reducing the Entry Threshold of AAL Systems: Preliminary Results from Casa Vecchia | | BIBAK | Full-Text | 709-715 | |
| Gerhard Leitner; Anton Josef Fercher; Alexander Felfernig; Martin Hitz | |||
| Ambient assisted living holds promising solutions to tackle the problems of
an ageing society by providing various smart home, computing and internet
technologies that support the independent living of elderly people. However,
the acceptance of these technologies by the elderly constitutes a crucial
precondition for the success of AAL. The paper presents early results from the
project Casa Vecchia, which explores the feasibility of AAL within a
longitudinal field study with 20 participating households. Barriers observed in
the project that hinder the acceptance of the applied technologies are
discussed, as well as possible solutions to reduce the entry threshold to
assistive technology. Keywords: Ambient Assisted Living; Technology Acceptance; Ethnographic Fieldstudy | |||
| A Multimodal Approach to Accessible Web Content on Smartphones | | BIBA | Full-Text | 1-8 | |
| Lars Emil Knudsen; Harald Holone | |||
| Mainstream smartphones can now be used to implement efficient speech-based and multimodal interfaces. The current status and continued development of mobile technologies open up possibilities of interface design for smartphones that were unattainable only a few years ago. Better and more intuitive multimodal interfaces for smartphones can provide access to information and services on the Internet through mobile devices, thus enabling users with different abilities to access this information at any place and at any time. In this paper we present our current work in the area of multimodal interfaces on smartphones. We have implemented a multimodal framework and have used it as the foundation for the development of a prototype that has been used in a user test. There are two main contributions: 1) how we have implemented W3C's multimodal interaction framework on smartphones running the Android OS, and 2) the results from user tests and interviews with blind and visually impaired users. | |||
| Mobile Vision as Assistive Technology for the Blind: An Experimental Study | | BIBA | Full-Text | 9-16 | |
| Roberto Manduchi | |||
| Mobile computer vision is often advocated as a promising technology to support blind people in their daily activities. However, there is as yet very little experience with mobile vision systems operated by blind users. This contribution provides an experimental analysis of a sign-based wayfinding system that uses a camera cell phone to detect specific color markers. The results of our experiments may be used to inform the design of technology that facilitates environment exploration without sight. | |||
| Camera-Based Signage Detection and Recognition for Blind Persons | | BIBAK | Full-Text | 17-24 | |
| Shuihua Wang; Yingli Tian | |||
| Signage plays an important role in wayfinding and navigation to assist
blind people accessing unfamiliar environments. In this paper, we present a
novel camera-based approach to automatically detect and recognize restroom
signage in surrounding environments. Our method first extracts the attended
areas which may contain signage, based on shape detection. Then, the
Scale-Invariant Feature Transform (SIFT) is applied to extract local features
in the detected attended areas. Finally, signage is detected and recognized as
the regions whose SIFT matching scores are larger than a threshold. The
proposed method can handle the detection of multiple signs. Experimental
results on our collected restroom signage dataset demonstrate the effectiveness
and efficiency of our proposed method. Keywords: Blind people; Navigation and wayfinding; Signage detection and recognition | |||
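As an illustration of the final matching step only (region extraction by shape detection is omitted, and the ratio test and match-count threshold are assumptions rather than the authors' scoring), an OpenCV sketch could be:

```python
import cv2

# A candidate region is accepted as signage when enough good SIFT matches
# against a reference template are found.
def is_signage(region_gray, template_gray, min_good_matches=12):
    sift = cv2.SIFT_create()
    _, des_r = sift.detectAndCompute(region_gray, None)
    _, des_t = sift.detectAndCompute(template_gray, None)
    if des_r is None or des_t is None:
        return False
    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des_t, des_r, k=2)
    # Lowe's ratio test keeps only distinctive matches.
    good = [pair[0] for pair in matches
            if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance]
    return len(good) >= min_good_matches
```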
| The Crosswatch Traffic Intersection Analyzer: A Roadmap for the Future | | BIBAK | Full-Text | 25-28 | |
| James M. Coughlan; Huiying Shen | |||
| The "Crosswatch" project is a smartphone-based system developed by the
authors for providing guidance to blind and visually impaired pedestrians at
traffic intersections. Building on past work on Crosswatch functionality to
help the user achieve proper alignment with the crosswalk and read the status
of Walk lights to know when it is time to cross, we outline the direction
Crosswatch should take to help realize its potential for becoming a practical
system: namely, augmenting computer vision with other information sources,
including geographic information systems (GIS) and sensor data, to provide a
much larger range of information about traffic intersections to the pedestrian. Keywords: visual impairment; blindness; assistive technology; traffic intersection;
pedestrian safety | |||
| GPS and Inertial Measurement Unit (IMU) as a Navigation System for the Visually Impaired | | BIBAK | Full-Text | 29-32 | |
| Jesus Zegarra; René Farcy | |||
| Current GPS (SiRF 3) devices do not give the correct heading when their
speed is less than 10 km/h. The heading is also less reliable when the GPS is
used in big cities, where the receiver is surrounded by buildings. Another
important problem is that a change of orientation by a visually impaired user
takes a long time to be detected by the GPS, because the receiver must reach a
certain speed before it obtains the new heading; this can take from 2 to 15
seconds depending on the GPS signal conditions. To avoid these problems, we
propose coupling a GPS receiver to an IMU (inertial measurement unit). The IMU
has a 3-axis compass, a one-axis gyroscope and a 3-axis accelerometer. With
this system, we can update the heading information every second. The user
interface is developed on a smartphone, which gives the heading and the
distance to the destination. In this paper, we also describe the advantages of
using the heading and distance to the final destination, updated every second,
to navigate in cities. Keywords: GPS; IMU; visually impaired; Smart Phone | |||
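The fusion method is not detailed in the abstract; a minimal complementary-filter sketch, assuming a 1 Hz update and illustrative weights, shows how gyroscope, compass and (when valid) GPS headings might be blended:

```python
# Propagate the previous estimate with the gyroscope, correct it with the compass,
# and blend in the GPS heading only when it is valid (e.g. speed above the
# receiver's limit). Angle wrap-around handling is omitted for brevity.
def fuse_heading(gyro_rate_dps, compass_deg, gps_deg, gps_valid, prev_deg,
                 dt=1.0, alpha=0.98, beta=0.9):
    predicted = (prev_deg + gyro_rate_dps * dt) % 360.0
    heading = (alpha * predicted + (1.0 - alpha) * compass_deg) % 360.0
    if gps_valid:
        heading = (beta * heading + (1.0 - beta) * gps_deg) % 360.0
    return heading
```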
| Visual Nouns for Indoor/Outdoor Navigation | | BIBA | Full-Text | 33-40 | |
| Edgardo Molina; Zhigang Zhu; Yingli Tian | |||
| We propose a local orientation and navigation framework based on visual features that provide location recognition, context augmentation, and viewer localization information to a human user. Mosaics are used to map local areas to ease user navigation through streets and hallways, by providing a wider field of view (FOV) and the inclusion of more decisive features. Within the mosaics, we extract "visual noun" features. We consider 3 types of visual noun features: signage, visual-text, and visual-icons that we propose as a low-cost method for augmenting environments. | |||
| Towards a Real-Time System for Finding and Reading Signs for Visually Impaired Users | | BIBAK | Full-Text | 41-47 | |
| Huiying Shen; James M. Coughlan | |||
| Printed text is a ubiquitous form of information that is inaccessible to
many blind and visually impaired people unless it is represented in a
non-visual form such as Braille. OCR (optical character recognition) systems
have been used by blind and visually impaired persons for some time to read
documents such as books and bills; recently this technology has been packaged
in a portable device, such as the smartphone-based kReader Mobile (from K-NFB
Reading Technology, Inc.), which allows the user to photograph a document such
as a restaurant menu and hear the text read aloud. However, while this kind of
OCR system is useful for reading documents at close range (which may still
require the user to take a few photographs, waiting a few seconds each time to
hear the results, to take one that is correctly centered), it is not intended
for signs. (Indeed, the KNFB manual, see knfbreader.com/upgrades_mobile.php ,
lists "posted signs such as signs on transit vehicles and signs in shop
windows" in the "What the Reader Cannot Do" subsection.) Signs provide valuable
location-specific information that is useful for wayfinding, but are usually
viewed from a distance and are difficult or impossible to find without adequate
vision and rapid feedback.
We describe a prototype smartphone system that finds printed text in cluttered scenes, segments out the text from video images acquired by the smartphone for processing by OCR, and reads aloud the text read by OCR using TTS (text-to-speech). Our system detects and reads aloud text from video images, and thereby provides real-time feedback (in contrast with systems such as the kReader Mobile) that helps the user find text with minimal prior knowledge about its location. We have designed a novel audio-tactile user interface that helps the user hold the smartphone level and assists him/her with locating any text of interest and approaching it, if necessary, for a clearer image. Preliminary experiments with two blind users demonstrate the feasibility of the approach, which represents the first real-time sign reading system we are aware of that has been expressly designed for blind and visually impaired users. Keywords: visual impairment; blindness; assistive technology; OCR; smartphone;
informational signs | |||
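The detect-then-read loop described in the abstract above can be mimicked, in a much simplified desktop form, with off-the-shelf OCR and text-to-speech libraries. The sketch below uses pytesseract and pyttsx3 as stand-ins; it is not the authors' real-time smartphone system and omits text localization and the audio-tactile interface.

```python
# Simplified detect-OCR-speak loop (desktop sketch, not the smartphone system).
import cv2
import pytesseract
import pyttsx3

def read_signs_from_camera(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    tts = pyttsx3.init()
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        text = pytesseract.image_to_string(gray).strip()
        if text:
            tts.say(text)        # speak any text found in the current frame
            tts.runAndWait()
    cap.release()
```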
| User Requirements for Camera-Based Mobile Applications on Touch Screen Devices for Blind People | | BIBAK | Full-Text | 48-51 | |
| Yoonjung Choi; Ki-Hyung Hong | |||
| This paper presents user requirements for camera-based mobile applications
on touch screen devices for blind people. We conducted usability testing of
a color reading application on Android OS. In the testing, participants were
asked to evaluate three different types of interfaces of the application in
order to identify user requirements and preferences. The results of the
usability testing showed that (1) users preferred a shallow menu hierarchy,
(2) users needed both manual and automatic camera shooting modes, although
they preferred manual to automatic mode, (3) the initial audio help was more
useful for users than in-time help, (4) users wanted the OS-supported screen
reader function to be turned off during color reading, and (5) users required
tactile feedback to identify the touch screen boundary. Keywords: Accessibility; User Requirements; Camera-Based Mobile Applications; Visual
Impairment | |||
| A Route Planner Interpretation Service for Hard of Hearing People | | BIBAK | Full-Text | 52-58 | |
| Mehrez Boulares; Mohamed Jemni | |||
| The advancement of technology over the past fifteen years has opened many
new doors to make our daily life easier. Nowadays, smartphones provide many
services such as ubiquitous access to social networks, video communication
over 3G networks and GPS (global positioning system). For instance, using GPS
technology and Google Maps services, a user can plan a route for travelling by
foot, car, bike or public transport. Google Maps is based on KML, which
contains textual information describing street and place names, and this is
not accessible to persons with special needs such as hard of hearing people,
who have very specific needs related to learning and understanding any written
language. Consequently, this service is not accessible to them. In this paper
we propose a new approach that makes KML information accessible on Android
mobile devices. We rely on cloud computing and virtual agent technology,
subtitled with SignWriting, to automatically interpret the textual information
on the map according to the user's current position. Keywords: Android; SignWriting; Cloud Computing; Virtual Agent; Google map; GPS | |||
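As a rough illustration of the kind of textual KML content such an approach has to make accessible, the snippet below pulls place names out of a KML document with Python's standard library; the KML namespace is the standard OGC one, and the downstream SignWriting/avatar rendering is only indicated by a comment. This is not the authors' pipeline.

```python
# Extract the textual place names from a KML route (illustrative sketch).
import xml.etree.ElementTree as ET

KML_NS = {"kml": "http://www.opengis.net/kml/2.2"}

def placemark_names(kml_text):
    root = ET.fromstring(kml_text)
    # Each <Placemark><name> holds a street or place name to be interpreted.
    return [el.text for el in root.findall(".//kml:Placemark/kml:name", KML_NS)
            if el.text]

# names = placemark_names(open("route.kml").read())
# Each name would then be handed to the signing avatar / SignWriting layer.
```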
| Translating Floor Plans into Directions | | BIBAK | Full-Text | 59-66 | |
| Martin Spindler; Michael Weber; Denise Prescher; Mei Miao; Gerhard Weber; et al | |||
| Project Mobility supports blind and low-vision people in exploring and
wayfinding indoors. Facility operators are enabled to annotate floor plans to
provide accessible content. An accessible smartphone app has been developed for
presenting spatial information and directions on the go, taking the user's
position into account. This paper describes some of the main goals and features
of the system and the results of the first user tests, which we conducted at a large airport. Keywords: accessibility; blind; low-vision; indoor; localization; directions; floor
plans; annotation; mobile; Mobility | |||
| Harnessing Wireless Technologies for Campus Navigation by Blind Students and Visitors | | BIBAK | Full-Text | 67-74 | |
| Tracey J. Mehigan; Ian Pitt | |||
| Navigating around a university campus can be difficult for visitors and
incoming students/staff, and is a particular challenge for vision-impaired
students and staff. University College Cork (UCC), like most other universities
and similar institutions worldwide, relies mainly on sign-posts and maps
(available from the college website) to direct students and visitors around
campus. However, these are not appropriate for vision-impaired users. UCC's
Disability Support Service provides mobility training to enable blind and
vision-impaired students and staff to safely and independently navigate around
the campus. This training is time-consuming for all parties and is costly to
provide. It is also route-specific: for example, if a blind student who has
already received mobility training is required to attend lectures in a building
they have not previously visited, they may require further training on the new
route. It is not feasible to provide this kind of training for
blind/visually-impaired visitors. A potential solution to these problems is to
provide navigation data using wireless and mobile technology. Ideally this
should be done using technologies that are (or will shortly be) widely
supported on smart-phones, thus ensuring that the system is accessible to
one-time visitors as well as regular users.
A study was conducted in order to identify user-requirements. It was concluded that there is no off-the-shelf system that fully meets UCC's requirements. Most of the candidates fall short either in terms of the accuracy or reliability of the localization information provided, ability to operate both indoors and outdoors, or in the nature of the feedback provided. In the light of these findings, a prototype system has been developed for use on the UCC campus. This paper describes the development of the system and ongoing user-testing to assess the viability of the interface for use by vision-impaired people. Keywords: Accessibility; Navigation; HCI & Non-Classical Interfaces; Design for
All; User Centered Design and User Involvement | |||
| Eyesight Sharing in Blind Grocery Shopping: Remote P2P Caregiving through Cloud Computing | | BIBA | Full-Text | 75-82 | |
| Vladimir Kulyukin; Tanwir Zaman; Abhishek Andhavarapu; Aliasgar Kutiyanawala | |||
| Product recognition continues to be a major access barrier for visually impaired (VI) and blind individuals in modern supermarkets. R&D approaches to this problem in the assistive technology (AT) literature vary from automated vision-based solutions to crowdsourcing applications where VI clients send image identification requests to web services. The former struggle with run-time failures and scalability while the latter must cope with concerns about trust, privacy, and quality of service. In this paper, we investigate a mobile cloud computing framework for remote caregiving that may help VI and blind clients with product recognition in supermarkets. This framework emphasizes remote teleassistance and assumes that clients work with dedicated caregivers (helpers). Clients tap on their smartphones' touchscreens to send images of products they examine to the cloud, where the SURF algorithm matches incoming images against its image database. Images along with the names of the top 5 matches are sent to remote sighted helpers via push notification services. A helper confirms the product's name, if it is in the top 5 matches, or speaks or types the product's name, if it is not. Basic quality of service is ensured through human eyesight sharing even when image matching does not work well. We implemented this framework in a module called EyeShare on two Android 2.3.3/2.3.6 smartphones. EyeShare was tested in three experiments with one blindfolded subject: one lab study and two experiments in Fresh Market, a supermarket in Logan, Utah. The results of our experiments show that the proposed framework may be used as a product identification solution in supermarkets. | |||
| Assessment Test Framework for Collecting and Evaluating Fall-Related Data Using Mobile Devices | | BIBAK | Full-Text | 83-90 | |
| Stefan Almer; Josef Kolbitsch; Johannes Oberzaucher; Martin Ebner | |||
| With an increasing population of older people, the number of falls and
fall-related injuries is on the rise. This will cause changes for future health
care systems, and fall prevention and fall detection will pose a major
challenge. Taking the multimodal character of fall-related parameters into
account, the development of adequate strategies for fall prevention and
detection is very complex. Therefore, it is necessary to collect and analyze
fall-related data.
This paper describes the development of a test framework to perform a variety of assessment tests to collect fall-related data. The aim of the framework is to easily set up assessment tests and analyze the data regarding fall-related behaviors. It offers an open interface to support a variety of devices. The framework consists of a Web service, a relational database and a Web-based backend. In order to test the framework, a mobile device client recording accelerometer and gyroscope sensor data is implemented on the iOS platform. The evaluation, which includes three mobility assessment tests, demonstrates the sensor accuracy for movement analysis for further feature extraction. Keywords: fall detection; fall prevention; mobile devices; restful Web service | |||
| NAVCOM -- WLAN Communication between Public Transport Vehicles and Smart Phones to Support Visually Impaired and Blind People | | BIBAK | Full-Text | 91-98 | |
| Werner Bischof; Elmar Krajnc; Markus Dornhofer; Michael Ulm | |||
| Visually impaired and blind people want to move and travel on their own, but
they depend on public transport systems, which is sometimes challenging.
Typical problems are finding the right vehicle, signalling their wish to enter
or leave the vehicle, and getting information about upcoming stations. To solve
these problems, very specialized equipment has been developed. In this paper we
present a solution based on standard WLAN components and a standard smartphone
that might solve these problems and thereby raise the quality of life of people
with special needs. Keywords: Mobility; accessibility; vehicle communication; WLAN; smart phone; IBIS;
public transport | |||
| Mobile-Type Remote Captioning System for Deaf or Hard-of-Hearing People and the Experience of Remote Supports after the Great East Japan Earthquake | | BIBAK | Full-Text | 99-104 | |
| Shigeki Miyoshi; Sumihiro Kawano; Mayumi Shirasawa; Kyoko Isoda; Michiko Hasuike; et al | |||
| The Mobile-type Remote Captioning System that we propose realizes advanced
support for sightseeing tours with a speaking guide and for practical field
trips outside of class. Our system utilizes the mobile phone network provided
by Japanese mobile phone carriers, specifically their monthly flat-rate voice
call and data transfer services. By using these services, deaf or
hard-of-hearing students can use real-time captioning while walking. On March
11, 2011, the Great East Japan Earthquake shook Japan. After the quake, there
was a great lack of volunteer students (captionists) inside the affected areas.
Universities outside the affected areas provided remote support to cover the
volunteer work. The system reported in this paper was used to realize such
remote support. Keywords: Remote Caption; Real-time; Deaf or Hard-of-Hearing; the Great East Japan
Earthquake | |||
| Handheld "App" Offering Visual Support to Students with Autism Spectrum Disorders (ASDs) | | BIBAK | Full-Text | 105-112 | |
| Bogdan Zamfir; Robert Tedesco; Brian Reichow | |||
| iPrompts® is a software application for handheld devices that provides
visual support to individuals with Autism Spectrum Disorders (ASDs). Caregivers
use the application to create and present visual schedules, visual countdown
timers, and visual choices, to help individuals with ASDs stay organized,
understand upcoming events, and identify preferences. The developer of the
application, HandHold Adaptive, LLC, initially introduced iPrompts on the
iPhone and iPod Touch in May of 2009. The research team from the Center of
Excellence on Autism Spectrum Disorders at Southern Connecticut State
University conducted a study of iPrompts in 2010, investigating its use by
educators working with students with ASDs. Among other findings, educators
indicated a desire to present visual supports on a larger, "tablet"-sized
display screen, leading the developer to produce an iPad-specific product,
iPrompts® XL. Described in this paper are the research effort of iPrompts
and subsequent development effort of iPrompts XL. Keywords: autism spectrum disorder; ASD; iPad; iPhone; smartphone; tablet; handheld
device; application; app; iPrompts | |||
| Cloud-Based Assistive Speech-Transcription Services | | BIBAK | Full-Text | 113-116 | |
| Zdenek Bumbalek; Jan Zelenka; Lukas Kencl | |||
| Real-time speech transcription is a service of potentially tremendous
positive impact on quality of life of the hearing-impaired. Recent advances in
technologies of mobile networks, cloud services, speech transcription and
mobile clients allowed us to build eScribe, a ubiquitously available,
cloud-based, speech-transcription service. We present the deployed system,
evaluate the applicability of automated speech recognition using real
measurements and outline a vision of the future enhanced platform,
crowdsourcing human transcribers in social networks. Keywords: Hearing-Impaired; Cloud Computing; Voice Recognition | |||
| Developing a Voice User Interface with Improved Usability for People with Dysarthria | | BIBAK | Full-Text | 117-124 | |
| Yumi Hwang; Daejin Shin; Chang-Yeal Yang; Seung-Yeun Lee; Jin Kim; Byunggoo Kong; et al | |||
| This paper describes the development of a voice user interface (VUI) for
Korean users with dysarthria. The development process, from target application
decisions to prototype system evaluation, focuses on improving the usability of
the interface by reflecting user needs. The first step of development is to
decide on the target VUI application and its functions. Twenty-five dysarthric
participants (5 middle school students and 20 adults) are asked to list the devices they want
to use with a VUI interface and what purposes they would use VUI devices for.
From this user study, SMS sending, web searching and voice dialing on mobile
phones and tablet PCs are decided as the target application and its functions.
The second step is to design the system of the target application in order to
improve usability. 120 people with dysarthria are asked to state the main
problems of currently available VUI devices, and it is found that speech
recognition failure (88%) is the main problem. This result indicates high
speech recognition rate will improve usability. Therefore, to improve the
recognition rate, an isolated word recognition based system with a customizable
command list and a built-in word prediction function is designed for the target
VUI devices. The final step is to develop and evaluate a prototype system. In
this study, a prototype is developed for Apple iOS and Android platform
devices, and then the system design is modified based on the evaluation results
of 5 dysarthric evaluators. Keywords: VUI; Dysarthria; Usability | |||
| Wearable Range-Vibrotactile Field: Design and Evaluation | | BIBA | Full-Text | 125-132 | |
| Frank G. Palmer; Zhigang Zhu; Tony Ro | |||
| Touch is one of the most natural methods of navigation available to the blind. In this paper, we propose a method to enhance a person's use of touch by placing range sensors coupled with vibrators throughout their body. This would allow them to feel objects and obstacles in close proximity without having to physically touch them. In order to make effective use of this vibrotactile approach, it is necessary to discern the perceptual abilities of a person wearing small vibrators on different parts of their body. To do this, we designed a shirt with small vibrators placed on the wrists, elbows, and shoulders, and ran an efficient staircase PEST algorithm to determine their sensitivities on those parts of the body. | |||
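To give a feel for the adaptive threshold procedure mentioned in the abstract above, here is a much simplified one-up/one-down staircase with step halving. The real PEST rules are more elaborate, and the callback `present_stimulus` is a hypothetical stand-in for presenting a vibration and recording the subject's response.

```python
# Simplified adaptive staircase for a vibration-detection threshold.
# (A stand-in for PEST; real PEST step-size rules are more elaborate.)
def run_staircase(present_stimulus, start=1.0, step=0.2, reversals_needed=6):
    intensity, direction, reversals, history = start, -1, 0, []
    while reversals < reversals_needed:
        detected = present_stimulus(intensity)   # True if the subject felt it
        new_direction = -1 if detected else +1   # go down after hits, up after misses
        if new_direction != direction:
            reversals += 1
            history.append(intensity)
            step /= 2.0                          # refine near the threshold
        direction = new_direction
        intensity = max(0.0, intensity + direction * step)
    return sum(history) / len(history)           # threshold estimate
```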
| System Supporting Speech Perception in Special Educational Needs Schoolchildren | | BIBAK | Full-Text | 133-136 | |
| Adam Kupryjanow; Piotr Suchomski; Piotr Odya; Andrzej Czyzewski | |||
| A system supporting speech perception during classes is presented in this
paper. The system combines a portable device, which enables real-time speech
stretching, with a workstation designed to perform hearing tests. The system
was designed to help children suffering from Central Auditory Processing
Disorders. Keywords: special education needs; Central Auditory Processing Disorders; time-scale
modification | |||
| Designing a Mobile Application to Record ABA Data | | BIBAK | Full-Text | 137-144 | |
| Silvia Artoni; Maria Claudia Buzzi; Marina Buzzi; Claudia Fenili; Barbara Leporini; et al | |||
| Applied Behavior Analysis (ABA) is a scientific method for modelling human
behavior, successfully applied in the context of educating autistic subjects.
ABA's scientific approach relies on recording measurable data derived from the
execution of structured programs. In this paper we describe an application
designed to support the work of ABA tutors with autistic subjects.
Specifically, we describe an Android application for gathering data from ABA
sessions with a patient and sharing information among his/her ABA team. Tablets
allow mobility and ease of interaction, enabling efficient data collection and
processing, and automating tasks previously carried out by recording notes on
paper. However, reduced screen size poses challenges for user interface design. Keywords: Autism; ABA; mobile; android; data recording | |||
| Creating Personas with Disabilities | | BIBAK | Full-Text | 145-152 | |
| Trenton Schulz; Kristin Skeide Fuglerud | |||
| Personas can help raise awareness among stakeholders about users' needs.
While personas are made-up people, they are based on facts gathered from user
research. Personas can also be used to raise awareness of universal design and
accessibility needs of people with disabilities. We review the current state of
the art of personas and some research and industry projects that use
them. We outline techniques that can be used to create personas with
disabilities. This includes advice on how to get more information about
assistive technology and how to better include people with disabilities in the
persona creation process. We also describe our use of personas with
disabilities in several projects and discuss how it has helped to find
accessibility issues. Keywords: personas; accessibility; universal design; inclusive design | |||
| Eye Controlled Human Computer Interaction for Severely Motor Disabled Children | | BIBAK | Full-Text | 153-156 | |
| Mojca Debeljak; Julija Ocepek; Anton Zupan | |||
| This paper presents two case studies of two children with severe motor
disabilities. After years of no effective feedback from them, an
interdisciplinary approach was explored with the use of an eye controlled
computer. A multidisciplinary team in a clinical environment included a
specialist in physical and rehabilitation medicine, an occupational therapist,
a speech therapist and an engineer. Several applications were tested to
establish feedback from the users, using the only movement they were capable
of: eye movement. Results have shown significant improvement in interaction and
communication for both users. Some differences were present, possibly due to
the age difference. Preparation of content for augmentative and alternative
communication is in progress for both users. We realized that awareness of
existing advanced assistive technology (AT), from parents and caregivers to all
AT professionals working in a clinical environment, is crucial for a more
independent and higher-quality life. Keywords: assistive technology; augmentative and alternative communication; human
computer interaction; eye control; case study | |||
| Gravity Controls for Windows | | BIBAK | Full-Text | 157-163 | |
| Peter Heumader; Klaus Miesenberger; Gerhard Nussbaum | |||
| This paper presents the concept and a prototype of "Gravity Controls".
"Gravity Controls" makes standard Graphical User Interface (GUI) controls
"magnetic" in order to overcome the impact of motor problems (e.g. tremor),
which lead to unsmooth movements and unstable positioning of the cursor for
clicking or mouse-over events. "Gravity Controls" complements and enhances
standard or Assistive Technology (AT) based interaction with the GUI by
supporting the process of reaching a control and better keeping the position
for interaction. Keywords: assistive technology; software for people with special needs | |||
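A minimal way to picture the "magnetic" behaviour described above is to pull the cursor toward the centre of the nearest control once it enters an attraction radius. The radius and weighting below are assumptions chosen for illustration, not the authors' algorithm.

```python
# Illustrative "gravity" adjustment of a cursor position (not the authors'
# algorithm): pull the pointer toward the nearest control centre when close.
from math import hypot

def apply_gravity(cursor, controls, radius=40.0, strength=0.5):
    """cursor: (x, y); controls: list of (cx, cy) control centres, in pixels."""
    x, y = cursor
    nearest = min(controls, key=lambda c: hypot(c[0] - x, c[1] - y), default=None)
    if nearest is None:
        return cursor
    dist = hypot(nearest[0] - x, nearest[1] - y)
    if dist == 0 or dist > radius:
        return cursor
    # The pull grows as the cursor gets closer to the control centre.
    pull = strength * (1.0 - dist / radius)
    return (x + (nearest[0] - x) * pull, y + (nearest[1] - y) * pull)
```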
| Addressing Accessibility Challenges of People with Motor Disabilities by Means of AsTeRICS: A Step by Step Definition of Technical Requirements | | BIBAK | Full-Text | 164-171 | |
| Alvaro García-Soler; Unai Diaz-Orueta; Roland Ossmann; Gerhard Nussbaum; et al | |||
| The need for Assistive Technologies in Europe is leading to the development
of projects whose aim is to research and develop technical solutions for people
with long-term motor disabilities. The Assistive Technology Rapid Integration
& Construction Set (AsTeRICS) project, funded by the 7th Framework Programme
of the EU (Grant Agreement 247730), aims to develop an integrated system
supporting multiple devices to help people with upper limb impairments. To this
end, AsTeRICS is following User Centred Design methods to gather the user
requirements and develop solutions in an iterative way. This paper reports on
requirements prioritization procedures. These procedures are described in order
to illustrate how user requirements are transformed into technical requirements
for the system development. Keywords: User Centred Design; motor disabilities; assessment; requirements;
non-classical interfaces | |||
| Indoor and Outdoor Mobility for an Intelligent Autonomous Wheelchair | | BIBA | Full-Text | 172-179 | |
| C. T. Lin; Craig Euler; Po-Jen Wang; Ara Mekhtarian | |||
| A smart wheelchair was developed to provide users with increased independence and flexibility in their lives. The wheelchair can be operated in a fully autonomous mode or a hybrid brain-controlled mode while the continuously running autonomous mode may override the user-generated motion command to avoid potential dangers. The wheelchair's indoor mobility has been demonstrated by operating it in a dynamically occupied hallway, where the smart wheelchair intelligently interacted with pedestrians. An extended operation of the wheelchair for outdoor environments was also explored. Terrain recognition based on visual image processes and multi-layer neural learning network was demonstrated. A mounted Laser Range Finder (LRF) was used to determine terrain drop-offs and steps and to detect stationary and moving obstacles for autonomous path planning. Real-time imaging of the outdoor scenes using the oscillating LRF was attempted; however, the overhead in generating a three-dimensional point cloud exceeded the onboard computer capability. | |||
| Comparing the Accuracy of a P300 Speller for People with Major Physical Disability | | BIBAK | Full-Text | 180-183 | |
| Alexander Lechner; Rupert Ortner; Fabio Aloise; Robert Prückl; Francesca Schettini; et al | |||
| A Brain-Computer Interface (BCI) can provide an additional option for a
person to express himself/herself if he/she suffers a disorder like amyotrophic
lateral sclerosis (ALS), brainstem stroke, brain or spinal cord injury or other
diseases affecting the motor pathway. For a P300 based BCI a matrix of randomly
flashing characters is presented to the participant. To spell a character the
person has to attend to it and to count how many times the character flashes.
The aim of this study was to compare performance achieved by subjects suffering
major motor impairments with that of healthy subjects. The overall accuracy of
the persons with motor impairments reached 70.1% in comparison to 91% obtained
for the group of healthy subjects. When looking at single subjects, one
interesting example shows that under certain circumstances, when the patient
finds it difficult to concentrate on one character for a long period of time,
reducing the number of flashes can increase the accuracy. Furthermore, the
influence of several tuning parameters is discussed, as it shows that for some
participants adaptations are required to achieve useful spelling results.
Finally, exclusion criteria for people who are not able to use the device are
defined. Keywords: brain-computer interface (BCI); Stroke; Amyotrophic Lateral Sclerosis; P300;
Locked-In Syndrome; Visual Evoked Potentials; Spinal Cord Injury | |||
| Application of Robot Suit HAL to Gait Rehabilitation of Stroke Patients: A Case Study | | BIBAK | Full-Text | 184-187 | |
| Kanako Yamawaki; Ryohei Ariyasu; Shigeki Kubota; Hiroaki Kawamoto; Yoshio Nakata; et al | |||
| We have developed the Robot Suit HAL (Hybrid Assistive Limb) to actively
support and enhance human motor functions. The HAL provides physical support
according to the wearer's motion intention. In this paper, we present a case
study of the application of the HAL to gait rehabilitation of a stroke patient.
We applied the HAL to a male patient who suffered a stroke due to cerebral
infarction three years previously. The patient was given walking training with
the HAL twice a week for eight weeks. We evaluated his walking speed (10 m
walking test) and balance ability (using a functional balance scale) before and
after the 8-week rehabilitation with the HAL. The results show an improvement
in the gait and balance ability of a patient with chronic paralysis after gait
training with the HAL, which is a voluntarily controlled rehabilitation device. Keywords: Robot Suit; HAL; Rehabilitation; Locomotor training; Hemiplegia | |||
| Sign 2.0: ICT for Sign Language Users: Information Sharing, Interoperability, User-Centered Design and Collaboration | | BIBAK | Full-Text | 188-191 | |
| Liesbeth Pyfers | |||
| Deaf people have always been early adopters of everything ICT has to offer.
Many barriers remain, however, that make it difficult for Deaf sign language
users to use their preferred, and for some their only accessible, language when
and where they want. In this session, some of the current R&D efforts for sign
language users are presented, with the objective of promoting information
sharing and collaboration, so that recent threats can be dealt with
productively and converted into opportunities. Keywords: Deaf; sign language; accessibility | |||
| Toward Developing a Very Big Sign Language Parallel Corpus | | BIBAK | Full-Text | 192-199 | |
| Achraf Othman; Zouhour Tmar; Mohamed Jemni | |||
| The community of researchers in the field of sign language faces a serious
problem: the absence of a large parallel corpus for sign languages. The
ASLG-PC12 project, conducted in our laboratory, proposes a rule-based approach
for building a big parallel corpus of English written texts and American Sign
Language gloss. In this paper, we present a new algorithm to transform English
sentences into ASL gloss. The project was started at the beginning of 2011 and
today offers a corpus containing more than one hundred million pairs of English
and ASL gloss sentences. It is freely available online in order to support the
development and design of new algorithms and theories for sign language
processing, for instance statistical machine translation and related fields. We
present, in particular, the tasks for generating ASL sentences from the Project
Gutenberg corpus, which contains only English written texts. Keywords: American Sign Language; Parallel Corpora; Sign Language | |||
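For readers unfamiliar with gloss notation, the toy rules below (drop articles and copulas, uppercase the remaining words) only hint at what a rule-based English-to-gloss transformation can look like; the actual ASLG-PC12 rule set is far richer and is not reproduced here.

```python
# Toy English-to-ASL-gloss transformation (vastly simplified compared with
# the rule set described in the paper; for illustration only).
import re

DROPPED = {"a", "an", "the", "is", "are", "am", "was", "were", "be", "to"}

def toy_gloss(sentence):
    words = re.findall(r"[A-Za-z']+", sentence.lower())
    return " ".join(w.upper() for w in words if w not in DROPPED)

# toy_gloss("The boy is going to the store")  ->  "BOY GOING STORE"
```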
| Czech Sign Language -- Czech Dictionary and Thesaurus On-Line | | BIBA | Full-Text | 200-204 | |
| Jan Fikejs; Tomáš Sklenák | |||
| The paper deals with a monolingual (explanatory) and bilingual dictionary of the spoken and sign language, which in each of the languages provides grammatical, stylistic and semantic characteristics, contextual quotes, information about hyponyms and hypernyms, transcription, and audio/video recording (front-facing and sideview captures). The dictionary also serves as a basic didactic aid for teaching deaf and hearing users and their specialized work with academic texts at Masaryk University (MU) and for this reason it also, besides the basic vocabulary, includes specialized terminology of study programs provided at MU in the Czech sign language. Another aim of this dictionary is to build a centralized on-line dictionary of newly created terminology and the existing vocabulary. | |||
| The Dicta-Sign Wiki: Enabling Web Communication for the Deaf | | BIBAK | Full-Text | 205-212 | |
| Eleni Efthimiou; Stavroula-Evita Fotinea; Thomas Hanke; John Glauert; Richard Bowden; et al | |||
| The paper provides a report on the user-centred showcase prototypes of the
DICTA-SIGN project (http://www.dictasign.eu/), an FP7-ICT project which ended
in January 2012. DICTA-SIGN researched ways to enable communication between
Deaf individuals through the development of human-computer interfaces (HCI) for
Deaf users, by means of Sign Language. Emphasis is placed on the Sign-Wiki
prototype that demonstrates the potential of sign languages to participate in
contemporary Web 2.0 applications where user contributions are editable by an
entire community and sign language users can benefit from collaborative editing
facilities. Keywords: Sign language technologies; sign-Wiki; multilingual sign language resources;
Deaf communication; Deaf user-centred HCI | |||
| Sign Language Multimedia Based Interaction for Aurally Handicapped People | | BIBAK | Full-Text | 213-220 | |
| Matjaz Debevc; Ines Kozuh; Primoz Kosec; Milan Rotovnik; Andreas Holzinger | |||
| People with hearing disabilities still do not have satisfactory access to
Internet services. Since sign language is the mother tongue of deaf people, and
80% of this social group cannot successfully understand the written content,
different ways of using sign language to deliver information via the Internet
should be considered. In this paper, we provide a technical overview of
solutions to this problem that we have designed and tested in recent years,
along with the evaluation results and users' experience reports. The solutions
discussed prioritize sign language on the Internet for the deaf and hard of
hearing using a multimodal approach to delivering information, including video,
audio and captions. Keywords: deaf; hard of hearing; sign language; VELAP; SLI module; sign language
glossary | |||
| Meeting Support System for the Person with Hearing Impairment Using Tablet Devices and Speech Recognition | | BIBAK | Full-Text | 221-224 | |
| Makoto Kobayashi; Hiroki Minagawa; Tomoyuki Nishioka; Shigeki Miyoshi | |||
| In this paper, we propose a support system for a hearing impaired person who
attends a small meeting in which the other members are hearing people. In such
a case, it is difficult for him/her to follow the discussion. To solve this
problem, the system is designed to show what the members are saying in real
time. The system consists of tablet devices and a PC acting as a server. The PC
runs speech recognition software and distributes the recognized results to the
tablets. The main feature of this system is a method for correcting the initial
speech recognition results, which cannot be expected to be perfect. Corrections
are handwritten on the tablet devices by the meeting members themselves, not by
support staff. Every meeting member can correct any recognized result at any
time. In this way, the system has the potential to be a low-cost hearing aid
because it does not require extra support staff. Keywords: hearing impaired person; meeting support system; tablet device; speech
recognition; low cost hearing aids | |||
| Dubbing of Videos for Deaf People -- A Sign Language Approach | | BIBAK | Full-Text | 225-228 | |
| Franz Niederl; Petra Bußwald; Georg Tschare; Jürgen Hackl; Josef Philipp | |||
| Deaf people have their own language and use sign language to communicate.
Movies are dubbed into many different languages so that almost everyone is able
to understand them, but sign language is always missing. This project makes a
first step towards closing the gap by developing a "how to produce sign
language based synchronization" guide for movies and a video player which plays
and shows two different movies at once. Methodical steps include modelling of
the sign language movie, conversion between spoken language, noise, music and
sign language, development of a video player, a system architecture for the
distribution of the sign language movie, and qualitative and quantitative
examination of the approaches with an expert group. Keywords: deaf people; sign language dubbing; accessibility; assistive technology | |||
| Towards a 3D Signing Avatar from SignWriting Notation | | BIBAK | Full-Text | 229-236 | |
| Yosra Bouzid; Maher Jbali; Oussama El Ghoul; Mohamed Jemni | |||
| Many transcription systems, like SignWriting, have been suggested in the
last decades to describe sign language in a written form. However, these
systems have some limitations: they are not easily understood and adopted by
members of the deaf community, who usually use video and avatar-based systems
to access information. In this context, we present in this paper a new tool for
automatically generating 3D animation sequences from SW notation. The SW
notation is provided as input in an XML-based format called SWML (SignWriting
Markup Language). This tool aims to improve the reading and writing
capabilities of a deaf person who has no special training in reading and
writing SL. Keywords: Deaf; Sign Language; SignWriting; SWML; avatar; 3D signing animation | |||
| Sign Language Computer-Aided Education: Exploiting GSL Resources and Technologies for Web Deaf Communication | | BIBAK | Full-Text | 237-244 | |
| Stavroula-Evita Fotinea; Eleni Efthimiou; Athanasia-Lida Dimou | |||
| The paper discusses the potential of exploiting monolingual or multilingual
sign language (SL) resources in combination with recently developed Web
technologies in order to answer the need for creating SL educational content.
The reported use case comprises tools and methodologies for creating
educational content for the teaching of Greek Sign Language (GSL) by exploiting
resources originally created and annotated to support sign recognition and sign
synthesis technologies in the framework of the FP7 DICTA-SIGN project, along
with a Wiki-like environment that makes possible the creation, modification and
presentation of SL content. Keywords: Sign language resources; sign language technologies; processing of sign
language data; sign language educational content; Deaf communication; HCI | |||
| SignMedia | | BIBAK | Full-Text | 245-252 | |
| Luzia Gansinger | |||
| An increasing number of deaf graduates and professionals enter media related
careers. In the media industry it is a common practice to communicate in
written English. Since English discourse can prove a barrier to sign language
users, the interactive learning resource SignMedia teaches written English
through national sign languages. Learners immerse in a virtual media
environment where they perform tasks taken from various stages of the
production process of a TV series to reinforce their English skills at
intermediate level. By offering an accessible English for Specific Purposes
(ESP) course for the media industry, the SignMedia learning tool supports
career progression of deaf media professionals. Keywords: sign language; e-learning; accessibility; ICT; multimedia; EFL/ESL; ESP | |||
| SignAssess -- Online Sign Language Training Assignments via the Browser, Desktop and Mobile | | BIBAK | Full-Text | 253-260 | |
| Christopher John | |||
| SignAssess is a web-based e-learning resource for online sign language
training assignments, simultaneously accessible to desktop and mobile
applications. SignAssess was developed to meet the sign language training
industry's need for a standards-based e-learning online video assignment
solution that is compatible with Course Management Systems and does not rely on
local user media recording or storage resources, instead providing
browser-based media recording and remote storage of content streamed to users
on demand. Keywords: E-learning; Sign language; Course Management System; Sign language
interpreting | |||
| Towards General Cross-Platform CCF Based Multi-modal Language Support | | BIBAK | Full-Text | 261-268 | |
| Mats Lundälv; Sandra Derbring | |||
| The AEGIS project aims to contribute a framework for, and building blocks
for, an infrastructure for "open accessibility everywhere". One of many
objectives has been to research, prototype and test freely available software
services for inclusive graphical symbol support as part of mainstream ICT
environments. Based on the Concept Coding Framework (CCF) technology, a
"CCF-SymbolServer" has been developed. It can be installed locally on any of
the major desktop platforms (GNU/Linux, MacOS X and Windows) to provide its
multilingual and multi-modal representation services, or online to support many
kinds of web services and networked mobile systems. The three current AEGIS
applications will be presented: 1) CCF-SymbolWriter, an extension for symbol
support in LibreOffice/OpenOffice Writer, 2) the new CCF supported version of
Special Access to Windows (SAW6), 3) CCF-SymbolDroid, an AAC app for Android
mobile devices. User evaluations and future perspectives will be discussed. Keywords: AAC; AT; accessibility; graphical symbols; literacy; cognitive impairment;
open-source | |||
| Developing an Augmentative Mobile Communication System | | BIBAK | Full-Text | 269-274 | |
| Juan Bautista Montalvá Colomer; María Fernanda Cabrera-Umpiérrez; et al | |||
| The widespread use of smartphones and the inclusion of new technologies such
as Near Field Communication (NFC) in mobile devices offer a chance to turn
classic Augmentative and Alternative Communication (AAC) boards into hi-tech
AAC systems at lower cost. This paper presents the development of an
augmentative communication system based on Android mobile devices with NFC
technology, named BOARD (Book Of Activities Regardless of Disabilities), which
enables direct communication not only through voice synthesis but also through
SMS, and which expands the functionality of AAC systems by allowing control of
the smartphone and home appliances, all in a simple way, just by bringing the
phone next to a pictogram. Keywords: AAC; mobile phone; NFC; assistive technology; smart home; cerebral palsy;
ALS | |||
| The Korean Web-Based AAC Board Making System | | BIBAK | Full-Text | 275-278 | |
| Saerom Choi; Heeyeon Lee; Ki-Hyung Hong | |||
| The purpose of this study was to develop a Korean web-based customized AAC
board making system that is easily accessible and compatible across devices in
Korean cultural/linguistic contexts. Potential users of this system are
individuals with communication disorders and their parents/teachers.
Board-making users can create customized symbol boards using either built-in or
customized symbols. AAC users can access their own AAC page, generated by the
personalized AAC page application, on any device as long as they have access to
a web browser. We expect this system to enable Korean AAC users to generate
customized AAC boards on the web and to use these boards in meaningful
environments to meet their unique communication needs. Keywords: Accessibility; Augmentative and Alternative Communication; Board Making;
Web-Based System | |||
| SymbolChat: Picture-Based Communication Platform for Users with Intellectual Disabilities | | BIBAK | Full-Text | 279-286 | |
| Tuuli Keskinen; Tomi Heimonen; Markku Turunen; Juha-Pekka Rajaniemi; Sami Kauppinen | |||
| We introduce a multimodal picture-based communication platform for users
with intellectual disabilities, and results from our user evaluation carried
out with the target user group representatives and their assistants. Our
current prototype is based on touchscreen input and symbol and text-to-speech
output, but also supports mouse and keyboard interaction. The prototype was
evaluated in a field study with the help of nine users with varying degrees of
intellectual and motor disabilities. Based on our findings, the picture-based
approach and our application, SymbolChat, show great potential in providing a
tool for users with intellectual disabilities to communicate with other people
over the Internet, even without prior knowledge of symbols. The findings
highlighted a number of potential improvements to the system, including
providing even more input methods for users with physical disabilities, and
functionality to support the development of younger users, who are still
learning vocabulary and developing their abilities. Keywords: symbol communication; instant messaging; accessibility | |||
| Developing AAC Message Generating Training System Based on Core Vocabulary Approach | | BIBAK | Full-Text | 287-294 | |
| Ming-Chung Chen; Cheng-Chien Chen; Chien-Chuan Ko; Hwa-Pey Wang; Shao-Wun Chen | |||
| An alphabet-based message generating method is one of the essential
features of an augmentative and alternative communication (AAC) system,
although it is not the most efficient method. For Mandarin Chinese AAC
users, Chinese text entry is very important if they are expected to say what
they want to say. However, a user is required to assemble specific keys to
generate a Chinese character, and users need to learn a specific text entry
method before they can generate any Chinese character. This study aims to
develop a web-based training system, comprising the core Chinese characters, to
assist individuals with disabilities in learning to generate Mandarin Chinese
messages more efficiently. We conducted a usability evaluation to examine the
system. In addition, children with learning disabilities and mental retardation
were recruited to test the training system. Keywords: augmentative and alternative communication; Mandarin Chinese; core
vocabulary | |||
| New Features in the VoxAid Communication Aid for Speech Impaired People | | BIBA | Full-Text | 295-302 | |
| Bálint Tóth; Péter Nagy; Géza Németh | |||
| For speech impaired persons, even daily communication may cause problems: in many common situations where speech would be necessary, they are not able to manage. An application that uses Text-To-Speech (TTS) conversion is usable not only in the daily routine, but also as a therapeutic application in the treatment of speech impaired persons. The VoxAid framework from BME-TMIT provides solutions for these scenarios. This paper introduces the latest improvements of the VoxAid framework, including user tests and evaluation. | |||
| AAC Vocabulary Standardisation and Harmonisation | | BIBAK | Full-Text | 303-310 | |
| Mats Lundälv; Sandra Derbring | |||
| The Concept Coding Framework (CCF) effort, started in the European WWAAC
project and now continued in the European AEGIS project, as well as the current
vocabulary efforts within BCI (Blissymbolics Communication International),
highlight that issues of AAC vocabulary content, management and
interoperability are central. This paper outlines some stages of this work so
far, including the important role of the Authorised Blissymbol Vocabulary
(BCI-AV) and its relation to resources like the Princeton WordNet lexical
database and the ARASAAC symbol library. The work initiated to link
Blissymbolics and other AAC symbol vocabularies, as well as the CCF concept
ontologies, to the ISO Concept Database (ISO/CDB) and the work of ISO Technical
Committee 37 (ISO TC 37), will be discussed. In this context the long-term
ambition to establish an ISO standardised Unicode font for Blissymbolics will
also be brought to the fore. We'll stress the importance of clarified and, when
possible, harmonised licensing conditions. Keywords: AAC; graphic symbols; vocabulary; multilingual; multi-modal; speech
impairment; language impairment; cognitive impairment; language learning | |||
| Speaking and Understanding Morse Language, Speech Technology and Autism | | BIBA | Full-Text | 311-314 | |
| András Arató; Norbert Markus; Zoltan Juhasz | |||
| The language nature of Morse code is discussed, showing similarities to and differences from spoken language. The radio amateur club working at the Laboratory of Speech Technology for Rehabilitation was used to educate a stuttering autistic boy and to investigate his behavior. The perception accuracy of Morse codes (phonemes) was measured while changing the speed of the phonemes. A hypothesis is described that the language elements have to be fixed at different speeds for quick recognition. Experiments with a non-speaking autistic girl using a tablet PC are also described. | |||
| Reverse-Engineering Scanning Keyboards | | BIBAK | Full-Text | 315-322 | |
| Foad Hamidi; Melanie Baljko | |||
| Scanning or soft keyboards are alternatives to physical computer keyboards
that allow users with motor disabilities to compose text and control the
computer using a small number of input actions. In this paper, we present the
reverse Huffman algorithm (RHA), a novel Information Theoretic method that
extracts a representative latent probability distribution from a given scanning
keyboard design. By calculating the Jensen-Shannon Divergence (JSD) between the
extracted probability distribution and the probability distribution that
represents the body of text that will be composed by the scanning keyboard, the
efficiency of the design can be predicted and designs can be compared with each
other. Thus, using RHA provides a novel a priori context-aware method for
reverse-engineering scanning keyboards. Keywords: Scanning Keyboards; Information Theory; Huffman Algorithm | |||
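The JSD comparison at the heart of the method can be written down compactly. The sketch below computes the Jensen-Shannon divergence between the distribution extracted from a keyboard layout and a reference text distribution; this is generic information-theory code, not the authors' RHA implementation, and the reverse Huffman extraction step itself is not shown.

```python
# Jensen-Shannon divergence between two symbol distributions (generic sketch).
from math import log2

def kl(p, q):
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def jensen_shannon(p, q):
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# A lower JSD between the layout's latent distribution and the target text's
# letter distribution would predict a more efficient scanning keyboard design.
```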
| A Communication System on Smart Phones and Tablets for Non-verbal Children with Autism | | BIBA | Full-Text | 323-330 | |
| Harini Sampath; Bipin Indurkhya; Jayanthi Sivaswamy | |||
| We designed, developed and evaluated an Augmentative and Alternative Communication (AAC) system, AutVisComm, for children with autism that can run on smart phones and tablets. An iterative design and development process was followed, where the prototypes were developed in close collaboration with the user group, and the usability testing was gradually expanded to larger groups. In the last evaluation stage described here, twenty-four children with autism used AutVisComm to learn to request the desired object. We measured their learning rates and correlated them with their behavior traits (as observed by their teachers) like joint attention, symbolic processing and imitation. We found that their ability for symbolic processing did not correlate with the learning rate, but their ability for joint attention did. This suggests that this system (and this class of AACs) helps to compensate for a lack of symbolic processing, but not for a lack of joint-attention mechanism. | |||
| Assessment of Biosignals for Managing a Virtual Keyboard | | BIBAK | Full-Text | 331-337 | |
| Manuel Merino; Isabel Gómez; Alberto J. Molina; Kevin Guzman | |||
| In this paper we propose an assessment of biosignals for handling an
application based on a virtual keyboard and automatic scanning. The aim of this
work is to measure the effect of using such an application, through different
interfaces based on electromyography and electrooculography, on cardiac and
electrodermal activity. Five people without disabilities were tested. Each
subject wrote the same text twice, using an electromyography interface in the
first test and an electrooculography interface in the second. Each test was
divided into four parts: instruction, initial relaxation, writing and final
relaxation. The results show important differences in the electrocardiogram and
electrodermal activity among the parts of the tests. Keywords: affective interfaces; biosignals; control system; disability | |||
| Applying the Principles of Experience-Dependent Neural Plasticity: Building up Language Abilities with ELA®-Computerized Language Modules | | BIBAK | Full-Text | 338-345 | |
| Jacqueline Stark; Christiane Pons; Ronald Bruckner; Beate Fessl; Rebecca Janker; Verena Leitner; Karin Mittermann; Michaela Rausch | |||
| In this paper, a computerized language therapy program that aims at
supplying the required dose of practice for persons with aphasia (PWAs) is
presented, namely the ELA®-Language Modules. The rationale and underlying
principles for each linguistic level and the linguistic structure of the
language tasks for the word, sentence and text levels and for dialogues are
explained, as is how the components of the ELA®-Language Modules adhere to the
principles of experience-dependent neural plasticity. First pilot applications
of the ELA®-Language Modules with PWAs are discussed in terms of the principles
of experience-dependent neural plasticity and usability. Keywords: Aphasia; computerized language training; neural plasticity | |||
| Assistive Technology: Writing Tool to Support Students with Learning Disabilities | | BIBAK | Full-Text | 346-352 | |
| Onintra Poobrasert; Alongkorn Wongteeratana | |||
| Previous studies show that assistive technology has a significant impact on
helping students with disabilities achieve their academic goals. Assistive
technology comprises hardware, devices and software that help students with
disabilities by giving them the same access to tasks that would otherwise be
challenging. Selecting an appropriate AT tool for a student requires that
parents, educators and other professionals take a comprehensive view, carefully
analyzing the interaction between the student, the technology, the tasks to be
performed, and the settings where it will be used. Therefore, this study was
conducted in order to confirm the effective use of assistive technology such as
Thai Word Search. The results reflected an improvement in student achievement
and suggest that assistive technology made a greater contribution to the
students in this study when used to support writing. Keywords: Assistive technology; Dysgraphia; Learning disabilities; Single subject
design; Writing tool | |||
| Communication Access for a Student with Multiple Disabilities: An Interdisciplinary Collaborative Approach | | BIBAK | Full-Text | 353-360 | |
| Frances Layman; Cathryn Crowle; John Ravenscroft | |||
| This case study highlights the challenges and outcomes of implementing
assistive technology for a 17 year old school student with a profound hearing
loss, and significant physical disabilities. It demonstrates the importance of
a collaborative team approach and the benefits for the student of using
assistive technology with regard to the development of self-determination and
social relationships. This article is of benefit for inter-professional teams
working in special education, particularly with students with multiple
disabilities. Keywords: Accessibility; Assistive Technology; Augmented and Alternative Communication
(AAC); Design for All; and User Involvement | |||
| Multimedia Advocacy | | BIBAK | Full-Text | 361-368 | |
| Gosia Kwiatkowska; Thomas Tröbinger; Karl Bäck; Peter Williams | |||
| The paper describes the early stages of one strand of an international
project entitled Web 2.0 for People with Intellectual Disabilities (W2ID). The
project team reports on a project pilot that involves five countries, 400
learners with Intellectual Disabilities (ID) (13 to adulthood), their teachers
and supporters, developing rich media web content using Multimedia Self
Advocacy Approach and the specially designed 'Klik in' platform.
The 'Klik in' Web2.0 platform was designed to enable people with ID to express their views and preferences using pictures, videos, sounds and text and to share these with their peers and supporters. Easy-to-use learning materials and a standardised pedagogic approach were also developed to assist learners and supporters throughout the project. The project is being monitored and evaluated using mostly quantitative instruments, although some qualitative data is also being collected and will inform final findings. The early results indicate that learners with ID are motivated to work with rich media content and the web 2.0 'Klik in' platform and are able to express their right to self advocacy. Keywords: self advocacy; multimedia; digital inclusion; intellectual disability; web
2.0; klik in; website; internet; rich media content | |||
| How Long Is a Short Sentence? -- A Linguistic Approach to Definition and Validation of Rules for Easy-to-Read Material | | BIBA | Full-Text | 369-376 | |
| Annika Nietzio; Birgit Scheer; Christian Bühler | |||
| This paper presents a new approach to empirical validation and verification
of guidelines for easy-to-read material. The goal of our approach is twofold.
On the one hand, the linguistic analysis investigates whether the well-known rules
are really applied consistently throughout the published easy-to-read material.
The findings from this study can help define new rules and refine existing
rules.
On the other hand, we show how the software developed for the linguistic analysis can also be used as a tool to support authors in the production of easy-to-read material. The tool applies the rules to a new text and highlights any passages that do not meet them, so that the author can go back and improve the text. | |||
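The kind of authoring support described above can be pictured with a tiny rule checker that flags over-long sentences; the 15-word limit below is an arbitrary placeholder, not a rule taken from the paper, and the real tool covers many more easy-to-read rules.

```python
# Tiny easy-to-read checker: flag sentences longer than a word limit.
# (The 15-word limit is an arbitrary placeholder, not a validated rule.)
import re

def long_sentences(text, max_words=15):
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [s for s in sentences if len(s.split()) > max_words]

# for s in long_sentences(open("draft.txt").read()):
#     print("Consider shortening:", s)
```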
| CAPKOM -- Innovative Graphical User Interface Supporting People with Cognitive Disabilities | | BIBAK | Full-Text | 377-384 | |
| Andrea Petz; Nicoleta Radu; Markus Lassnig | |||
| Most research activities on web accessibility focus on people with physical
or sensory disabilities, while potential users with cognitive disabilities
still lack adequate solutions to overcome barriers resulting from their
disability. The innovative graphical user interface to be developed within the
project CAPKOM intends to change this. In a novel approach, this user interface
shall be instantly adaptable to the very different demands of people with
cognitive disabilities. Iterative user tests will feed results into practical
software development, first exemplified by a community art portal for people
with cognitive disability. Keywords: Accessibility; Webdesign; Graphical User Interface; Cognitive Disability | |||
| A Real-Time Sound Recognition System in an Assisted Environment | | BIBA | Full-Text | 385-391 | |
| Héctor Lozano; Inmaculada Hernáez; Javier Camarena; Ibai Díez; Eva Navas | |||
| This article focuses on the development of a real-time detection and classification system for environmental sounds in a typical home for persons with disabilities. Based on the extraction of acoustic features (Mel Frequency Cepstral Coefficients, Zero Crossing Rate, Roll-Off Point and Spectral Centroid) and using a probabilistic classifier (Gaussian Mixture Model), preliminary results show an accuracy rate greater than 93% in the detection task and 98% in the classification task. | |||
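The feature-plus-GMM approach summarised above maps naturally onto standard audio libraries. The sketch below trains one Gaussian mixture per sound class on MFCC frames and picks the class with the highest likelihood; it is generic code using librosa and scikit-learn, not the authors' system, which also uses zero crossing rate, roll-off point and spectral centroid features.

```python
# MFCC + GMM environmental-sound classifier (generic sketch).
import librosa
import numpy as np
from sklearn.mixture import GaussianMixture

def mfcc_frames(path, n_mfcc=13):
    y, sr = librosa.load(path, sr=None)
    return librosa.feature.mfcc(y=y, sr=sr, n_mfcc=n_mfcc).T  # (frames, coeffs)

def train_models(files_by_class, n_components=8):
    models = {}
    for label, paths in files_by_class.items():
        frames = np.vstack([mfcc_frames(p) for p in paths])
        models[label] = GaussianMixture(n_components=n_components).fit(frames)
    return models

def classify(path, models):
    frames = mfcc_frames(path)
    # Pick the class whose mixture gives the highest average log-likelihood.
    return max(models, key=lambda label: models[label].score(frames))
```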
| Gestures Used by Intelligent Wheelchair Users | | BIBAK | Full-Text | 392-398 | |
| Dimitra Anastasiou; Christoph Stahl | |||
| This paper is concerned with the modality of gestures in communication
between an intelligent wheelchair and a human user. Gestures can enable and
facilitate human-robot interaction (HRI) and go beyond familiar pointing
gestures by also considering context-related, subtle, implicit gestural and vocal
instructions that can enable a service. Some findings of a user study related
to gestures are presented in this paper; the study took place at the Bremen
Ambient Assisted Living Lab, a 60 m² apartment suitable for the elderly and
people with physical or cognitive impairments. Keywords: assisted living; gestures; intelligent wheelchair; smart home | |||
| Augmented Reality Based Environment Design Support System for Home Renovation | | BIBAK | Full-Text | 399-406 | |
| Yoshiyuki Takahashi; Hiroko Mizumura | |||
| Home renovation is often performed to improve the living environment for
elderly persons. In Japan, part of the renovation cost is covered by long-term
care insurance, yet dozens of problems related to renovation construction are
reported. These problems are caused by a lack of communication and of
construction knowledge. We have developed an Augmented Reality environment
design support system for home modification, designed especially for persons
who need long-term care. A preliminary experiment has been carried out and
confirmed the functionality of the system. Keywords: Home renovation; Augmented reality; Image processing | |||
| Fall Detection on Embedded Platform Using Kinect and Wireless Accelerometer | | BIBA | Full-Text | 407-414 | |
| Michal Kepski; Bogdan Kwolek | |||
| In this paper we demonstrate how to accomplish reliable fall detection on a low-cost embedded platform. The detection is achieved by a fuzzy inference system using Kinect and a wearable motion-sensing device that consists of an accelerometer and a gyroscope. The foreground objects are detected using depth images obtained by Kinect, which is able to extract such images even in a room that is dark to the human eye. The system has been implemented on the PandaBoard ES and runs in real time. It permits unobtrusive fall detection as well as preserving the privacy of the user. The experimental results indicate high effectiveness of fall detection. | |||
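A much-simplified picture of the sensor fusion described above: the paper combines Kinect depth data and a wearable accelerometer/gyroscope through a fuzzy inference system, whereas the sketch below substitutes two crisp, hand-picked rules (an impact peak in acceleration magnitude plus a low body-centroid height from the depth image). All thresholds and the data-access conventions are assumptions, not the authors' implementation.

```python
# Simplified stand-in for the fuzzy fall-detection logic: a fall is flagged
# when a strong acceleration peak coincides with a low body height estimated
# from the Kinect depth image. Threshold values are guesses.
import numpy as np

IMPACT_G = 2.5        # assumed impact threshold, in g
LYING_HEIGHT_M = 0.4  # assumed max height of the body centroid after a fall

def acceleration_magnitude(ax, ay, az):
    return np.sqrt(ax**2 + ay**2 + az**2)

def detect_fall(accel_window, centroid_height_m):
    """accel_window: iterable of (ax, ay, az) samples in g;
    centroid_height_m: person-centroid height above the floor from depth data."""
    peak = max(acceleration_magnitude(ax, ay, az) for ax, ay, az in accel_window)
    return peak > IMPACT_G and centroid_height_m < LYING_HEIGHT_M

# Hypothetical usage with a short window of accelerometer samples.
window = [(0.1, 0.9, 0.2), (1.8, 2.1, 0.7), (0.2, 1.0, 0.1)]
print(detect_fall(window, centroid_height_m=0.3))   # True -> raise an alarm
```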
| Controlled Natural Language Sentence Building as a Model for Designing User Interfaces for Rule Editing in Assisted Living Systems -- A User Study | | BIBAK | Full-Text | 415-418 | |
| Henrike Gappa; Gaby Nordbrock; Yehya Mohamad; Jaroslav Pullmann; Carlos A. Velasco | |||
| As part of the web-based services developed within the WebDA project, the
Action Planner was implemented to allow caregivers of people with dementia to
support them in accomplishing activities of daily living and, among other
things, to counteract restlessness. In order to define rules that include a
description of situations indicating, e.g., restlessness, as well as an action
that should be undertaken in such situations, a user interface was designed
enabling caregivers to express these rules in a controlled natural language
setting. Here, rule expressions were offered as preformulated natural sentences
that could be manipulated by changing (pre)selected notions such as "daily" in
pop-up menus embedded in the sentences. A user study was conducted with 24 test
participants (12 < 65 years; 12 > 65 years), showing that this approach
is perceived as intuitive and highly usable also by test participants
over 65 years of age. Keywords: User interface design; elderly; ambient assisted living; monitoring systems;
natural language usage | |||
| MonAMI Platform in Elderly Household Environment | | BIBAK | Full-Text | 419-422 | |
| Dušan Šimšík; Alena Galajdová; Daniel Siman; Juraj Bujnák; Marianna Andrášová; Marek Novák | |||
| This paper describes how the ambient technology platform MonAMI and related
ICT services were adapted for Slovak society. MonAMI is a European project
focusing on ambient assisted living based on software, human-machine interfaces
and hardware. Its main aim was to increase autonomy, enhance ICT services for
monitoring purposes for carers, and support the safety of vulnerable people
living alone. A broader description of the architecture, devices, and the
process of installation and implementation follows. Keywords: ambient technology; interface; open architecture | |||
| Modeling Text Input for Single-Switch Scanning | | BIBAK | Full-Text | 423-430 | |
| I. Scott MacKenzie | |||
| A method and algorithm for modeling single-switch scanning for text input is
presented. The algorithm uses the layout of a scanning keyboard and a corpus in
the form of a word-frequency list to generate codes representing the scan steps
for entering words. Scan steps per character (SPC) is computed as a weighted
average over the entire corpus. SPC is an absolute measure, thus facilitating
comparisons of keyboards. It is revealed that SPC is sensitive to the corpus if
a keyboard includes word prediction. A recommendation for other research using
SPC is to disclose both the algorithm and the corpus. Keywords: Single-switch scanning; text input; models of interaction; scan steps per
character | |||
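To make the weighted-average idea concrete, the sketch below computes scan steps per character for a simplified linear (one-dimensional) scanning keyboard without word prediction: the cost of a word is summed over its characters and weighted by word frequency. The layout, the toy corpus and the assumption that scanning restarts from the first key after every selection are illustrative simplifications, not the paper's exact algorithm or keyboard designs.

```python
# Scan steps per character (SPC) as a frequency-weighted average over a
# word-frequency corpus, for a simplified linear scanning keyboard.
# Layout, corpus and scan model are illustrative assumptions.
LAYOUT = "etaoinshrdlcumwfgypbvkjxqz "   # keys in scan order

def scan_steps(word, layout=LAYOUT):
    # Assume the scan cursor restarts at the first key after each selection,
    # and selecting the k-th key (0-based) costs k+1 scan steps.
    return sum(layout.index(ch) + 1 for ch in word)

def spc(word_freq_list, layout=LAYOUT):
    total_steps = sum(freq * scan_steps(word, layout) for word, freq in word_freq_list)
    total_chars = sum(freq * len(word) for word, freq in word_freq_list)
    return total_steps / total_chars

# Hypothetical miniature corpus: (word, frequency) pairs.
corpus = [("the", 500), ("hello", 20), ("scan", 5)]
print(f"SPC = {spc(corpus):.2f}")
```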
| DualScribe: A Keyboard Replacement for Those with Friedreich's Ataxia and Related Diseases | | BIBAK | Full-Text | 431-438 | |
| Torsten Felzer; I. Scott MacKenzie; Stephan Rinderknecht | |||
| An alternative text composition method is introduced, comprising a small
special-purpose keyboard as an input device and software to make text entry
fast and easy. The work was inspired by an FA (Friedreich's Ataxia) patient who
asked us to develop a viable computer interaction solution -- taking into
account the specific symptoms induced by his disease. The outcome makes text
entry easier than with the standard keyboard without being slower. It is likely
that the system has general use for anyone with a similar condition, and also
for able-bodied users looking for a small-size keyboard. We present a usability
study with four participants showing the method's effectiveness. Keywords: Human-computer interaction; special-purpose keyboard; word prediction;
ambiguous keyboards; neuromuscular diseases; Friedreich's Ataxia | |||
| Easier Mobile Phone Input Using the JusFone Keyboard | | BIBAK | Full-Text | 439-446 | |
| Oystein Dale; Trenton Schulz | |||
| We present an alternative mobile phone keyboard for inputting text, the JusFone
Keyboard. This keyboard allows people to enter characters by resting their
finger on the desired key and rocking it to select a specific character. We ran
user tests of the keyboard with 12 seniors comparing it against a touchscreen
keyboard, a phone with large buttons, and an on-screen PC keyboard. The users
found several things to like about the JusFone Keyboard, including comfort and
size of keys and having direct access to characters. Users also had several
suggestions about how to make the keyboard better such as making the text on
the keys bigger and adjusting the spacing between keys. We also conducted a
diary study of a user with reduced hand function who used the JusFone keyboard
on his PC. The results indicate that the keyboard may be of assistance to
persons with reduced hand function. Keywords: User testing; mobile phones; keyboards; input methods | |||
| Automatic Assessment of Dysarthric Speech Intelligibility Based on Selected Phonetic Quality Features | | BIBAK | Full-Text | 447-450 | |
| Myung Jong Kim; Hoirin Kim | |||
| This paper addresses the problem of assessing the speech intelligibility of
patients with dysarthria, which is a motor speech disorder. Dysarthric speech
produces spectral distortion caused by poor articulation. To characterize the
distorted spectral information, several features related to phonetic quality
are extracted. Then, we find the best feature set which not only produces a
small prediction error but also keeps the mutual dependency among features low. Finally, the
selected features are linearly combined using a multiple regression model.
Evaluation of the proposed method on a database of 94 patients with dysarthria
proves the effectiveness in predicting subjectively rated scores. Keywords: Dysarthria; phonetic quality; speech intelligibility assessment | |||
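The selection-and-regression step described above can be pictured with the short sketch below: candidate features are greedily added as long as they improve cross-validated prediction error while remaining weakly correlated with features already chosen, and the final set is combined with ordinary least-squares multiple regression. The synthetic data, the correlation cut-off and the greedy strategy are invented for illustration and are not the paper's features or database.

```python
# Illustrative greedy feature selection (low prediction error, low mutual
# correlation) followed by multiple linear regression. Data and thresholds
# are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(94, 6))          # 94 "patients", 6 candidate features
y = X[:, 0] * 0.8 + X[:, 2] * 0.5 + rng.normal(scale=0.3, size=94)

MAX_CORR = 0.6                         # assumed mutual-dependency limit
selected = []
for j in range(X.shape[1]):
    if any(abs(np.corrcoef(X[:, j], X[:, k])[0, 1]) > MAX_CORR for k in selected):
        continue                       # too dependent on an already chosen feature
    trial = selected + [j]
    err = -cross_val_score(LinearRegression(), X[:, trial], y,
                           scoring="neg_mean_squared_error", cv=5).mean()
    base = (np.inf if not selected else
            -cross_val_score(LinearRegression(), X[:, selected], y,
                             scoring="neg_mean_squared_error", cv=5).mean())
    if err < base:
        selected.append(j)

model = LinearRegression().fit(X[:, selected], y)
print("selected features:", selected, "R^2 =", model.score(X[:, selected], y))
```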
| Adaptation of AAC to the Context Communication: A Real Improvement for the User Illustration through the VITIPI Word Completion | | BIBAK | Full-Text | 451-458 | |
| Philippe Boissière; Nadine Vigouroux; Mustapha Mojahid; Frédéric Vella | |||
| This paper describes the performance of the VITIPI word completion system
through a text input simulation. The aim of this simulation is to estimate the
impact of the linguistic knowledge base size through two metrics: the
Key-Stroke Ratio (KSR) and the Keystrokes Per Character (KSPC). Our study shows
that the performance of a word completion system depends on the percentage of
words not available in the lexicon and on the size of the lexicon. Keywords: AAC; word completion system; KSR and KSPC metrics | |||
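A toy version of such a text-input simulation is sketched below: each word is typed letter by letter until it appears in the top-N completions drawn from a lexicon, at which point one extra keystroke selects it, and KSPC plus a simple saving ratio then follow from the keystroke and character counts. The lexicon, the value of N, the selection cost and the metric definitions used here are assumptions, not VITIPI's actual completion algorithm or the paper's exact formulas.

```python
# Toy word-completion simulation producing KSPC and a keystroke saving ratio.
# Lexicon, top-N size and selection cost are illustrative assumptions.
LEXICON = ["bonjour", "bonbon", "bonsoir", "merci", "maison"]
TOP_N = 3

def keystrokes_for_word(word, lexicon=LEXICON, top_n=TOP_N):
    for typed in range(1, len(word) + 1):
        prefix = word[:typed]
        candidates = [w for w in lexicon if w.startswith(prefix)][:top_n]
        if word in candidates:
            return typed + 1          # typed letters + one selection keystroke
    return len(word)                   # word not in lexicon: fully typed

def metrics(text_words):
    keystrokes = sum(keystrokes_for_word(w) for w in text_words)
    characters = sum(len(w) for w in text_words)
    kspc = keystrokes / characters
    saving = (characters - keystrokes) / characters   # relative keystroke saving
    return kspc, saving

kspc, saving = metrics(["bonjour", "merci", "inconnu"])
print(f"KSPC = {kspc:.2f}, keystroke saving = {saving:.2%}")
```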
| Tackling the Acceptability of Freely Optimized Keyboard Layout | | BIBAK | Full-Text | 459-466 | |
| Bruno Merlin; Mathieu Raynal; Heleno Fülber | |||
| Reorganization of a keyboard layout based on linguistic characteristics
would be an efficient way to improve text input speed. However, a new character
layout imposes a learning period that often discourages users. The Quasi-QWERTY
Keyboard aimed at easing the acceptance of a new layout by limiting the changes,
but this strategy compromises long-term performance. Instead, we propose a
solution based on the multilayer interface paradigm. The Multilayer Keyboard
enables users to progressively converge towards a freely optimized layout,
transforming the learning period into a transition period during which the
user's performance never regresses and progressively improves. Keywords: Input text; multilayer interface; layout; soft keyboard | |||
| Measuring Performance of a Predictive Keyboard Operated by Humming | | BIBAK | Full-Text | 467-474 | |
| Ondrej Polácek; Adam J. Sporka; Zdenek Míkovec | |||
| A number of text entry methods use predictive completion based on a
letter-level n-gram model. In this paper, we investigate the optimal length
of n-grams stored in such a model for a predictive keyboard operated by humming.
In order to find this length, we analyze six different corpora from which a
model is built, counting the number of primitive operations needed to enter a
text. Based on these operations, we provide a formula for estimating the words
per minute (WPM) rate. The model and the analysis results are verified in an
experiment with three experienced users of the keyboard. Keywords: Text Entry Methods; N-Gram Model; Measuring Performance; Non-Verbal Vocal
Interaction | |||
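The flavour of this kind of analysis can be sketched as follows: a letter-level n-gram model is built from a corpus, the cost of each character is taken to be its rank in the model's prediction list, and a words-per-minute figure is estimated from the resulting operations per character under an assumed time per operation and the usual five-characters-per-word convention. The corpus, the per-operation time and the operation model are assumptions; the paper's operation model for humming input is richer than a simple rank.

```python
# Sketch: letter-level n-gram prediction, counting "next candidate" operations
# per character, plus a crude WPM estimate. Corpus, per-operation time and the
# 5-characters-per-word convention are assumptions.
from collections import Counter, defaultdict

def build_model(text, n=3):
    model = defaultdict(Counter)
    padded = " " * (n - 1) + text
    for i in range(len(text)):
        context = padded[i:i + n - 1]
        model[context][padded[i + n - 1]] += 1
    return model

def operations(text, model, n=3):
    """A character ranked r (0-based) in the prediction list is assumed
    to cost r+1 primitive operations."""
    ops, padded = 0, " " * (n - 1) + text
    for i in range(len(text)):
        context, target = padded[i:i + n - 1], padded[i + n - 1]
        ranking = [c for c, _ in model[context].most_common()]
        ops += (ranking.index(target) + 1) if target in ranking else len(ranking) + 1
    return ops

corpus = "the quick brown fox jumps over the lazy dog " * 50
model = build_model(corpus)
test = "the lazy fox"
ops_per_char = operations(test, model) / len(test)
SECONDS_PER_OP = 0.6                                # assumed time per humming command
wpm = 60.0 / (ops_per_char * SECONDS_PER_OP * 5)    # 5 characters per word convention
print(f"{ops_per_char:.2f} ops/char, ~{wpm:.1f} WPM")
```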
| Dysarthric Speech Recognition Error Correction Using Weighted Finite State Transducers Based on Context-Dependent Pronunciation Variation | | BIBAK | Full-Text | 475-482 | |
| Woo Kyeong Seong; Ji Hun Park; Hong Kook Kim | |||
| In this paper, we propose a dysarthric speech recognition error correction
method based on weighted finite state transducers (WFSTs). First, the proposed
method constructs a context-dependent (CD) confusion matrix by aligning a
recognized word sequence with the corresponding reference sequence at a phoneme
level. However, because the dysarthric speech database is too small to
reflect all combinations of context-dependent phonemes, the CD confusion matrix
can be underestimated. To mitigate this underestimation problem, the CD
confusion matrix is interpolated with a context-independent (CI) confusion
matrix. Finally, WFSTs based on the interpolated CD confusion matrix are built
and integrated with dictionary and language model transducers in order to
correct speech recognition errors. The effectiveness of the proposed method is
demonstrated by performing speech recognition using the proposed error
correction method incorporated with the CD confusion matrix. It is shown from
the speech recognition experiment that the average word error rate (WER) of a
speech recognition system employing the proposed error correction method with
the CD confusion matrix is relatively reduced by 13.68% and 5.93%, compared to
those of the baseline speech recognition system and the error correction method
with the CI confusion matrix, respectively. Keywords: context-dependent pronunciation variation modeling; dysarthric speech
recognition; weighted finite state transducers; error correction | |||
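The interpolation step can be written down compactly: the context-dependent confusion probability is smoothed with its context-independent counterpart before any transducer weights are derived. The sketch below shows only that interpolation on toy counts; the interpolation weight and the counts are placeholders, and no actual WFST construction (e.g. with OpenFst) is attempted.

```python
# Interpolating a context-dependent (CD) phoneme confusion matrix with a
# context-independent (CI) one to mitigate data sparsity. Counts and the
# interpolation weight LAMBDA are illustrative placeholders.
LAMBDA = 0.7   # assumed weight on the context-dependent estimate

# cd_counts[(left_ctx, ref_phone, right_ctx)][recognized_phone] = count
cd_counts = {("s", "ih", "t"): {"ih": 3, "eh": 1}}
# ci_counts[ref_phone][recognized_phone] = count
ci_counts = {"ih": {"ih": 80, "eh": 15, "ax": 5}}

def normalize(counter):
    total = sum(counter.values())
    return {k: v / total for k, v in counter.items()} if total else {}

def interpolated_confusion(context, ref, hyp):
    p_cd = normalize(cd_counts.get(context, {})).get(hyp, 0.0)
    p_ci = normalize(ci_counts.get(ref, {})).get(hyp, 0.0)
    return LAMBDA * p_cd + (1.0 - LAMBDA) * p_ci

# P(recognized="eh" | reference="ih" in the context s_ih_t)
print(interpolated_confusion(("s", "ih", "t"), "ih", "eh"))
```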
| Text Entry Competency for Students with Learning Disabilities in Grade 5 to 6 | | BIBAK | Full-Text | 483-489 | |
| Ting-Fang Wu; Ming-Chung Chen | |||
| This study intended to understand the computer text entry skills of
students with learning disabilities in grades 5 to 6. Thirty-five students with
learning disabilities, who received special education services in resource
rooms at school, and 35 non-disabled students participated in our study. The
"Mandarin Chinese Character Entry Training system (MCChEn system)" was used to
measure the students' text entry skills. SPSS 19.0 was used to compare the difference in
text entry skills between children with and without learning disabilities. In
addition, the correlations between the abilities of recognition in Chinese
characters, and text entry skills were also explored. The results indicated
that children with learning disabilities performed significantly more poorly than
children without disabilities in recognizing Chinese characters orally and in
computer text entry skills. Chinese character recognition is an important
factor affecting Chinese character entry skills in children with learning
disabilities. The tool we utilized, the "Mandarin Chinese Character Entry
Training system (MCChEn system)", is able to discriminate computer text entry
skills between children with and without learning disabilities. The results of
this study can provide educators with important information about the text entry skills
of children with learning disabilities, in order to develop further training
programs. Keywords: text entry competency; learning disabilities | |||
| Vision SenS | | BIBAK | Full-Text | 490-496 | |
| Berenice Machuca Bautista; José Alfredo Padilla Medina; Francisco Javier Sánchez Marín | |||
| The electronic vision prototype developed in this paper intends to show
that it is possible to build an inexpensive and functional device which partly
compensates for the sense of sight in visually impaired individuals through
sensory substitution, replacing some functions of the sense of sight with
functions of the sense of touch. With the proposed prototype, blind users
receive electrical signals at their fingertips, generated from images of
objects captured with a camera and processed on a laptop to extract
visual information. Keywords: Blind people; touch image; perceive objects; camera for touch; visionsens;
sense of sight | |||
| Computer-Aided Design of Tactile Models | | BIBAK | Full-Text | 497-504 | |
| Andreas Reichinger; Moritz Neumüller; Florian Rist; Stefan Maierhofer; et al | |||
| Computer-aided tools offer great potential for the design and production of
tactile models. While many publications focus on the design of essentially
two-dimensional media like raised line drawings or the reproduction of
three-dimensional objects, we intend to broaden this view by introducing a
taxonomy that classifies the full range of conversion possibilities based on
dimensionality. We present an overview of current methods, discuss specific
advantages and difficulties, identify suitable programs and algorithms and
discuss personal experiences from case studies performed in cooperation with
two museums. Keywords: accessibility; design for all; blind people; visually impaired people;
tactile graphics; tactile models; CAD; CAM; 3D scanning | |||
| Three-Dimensional Model Fabricated by Layered Manufacturing for Visually Handicapped Persons to Trace Heart Shape | | BIBAK | Full-Text | 505-508 | |
| Kenji Yamazawa; Yoshinori Teshima; Yasunari Watanabe; Yuji Ikegami; Mamoru Fujiyoshi; et al | |||
| In this study, we fabricated three-dimensional models of the human heart by
stereolithography and powder-layered manufacturing; using these models,
visually handicapped persons could trace the shape of a heart by touching.
Further, we assessed the level of understanding of the visually handicapped
persons about the external structure of the heart and the position of blood
vessels. Experimental results suggest that the heart shape models developed in
this study by layered manufacturing were useful for teaching anatomy to
visually handicapped persons. Keywords: three-dimensional model; layered manufacturing; visually handicapped
persons; heart shape | |||
| Viable Haptic UML for Blind People | | BIBAK | Full-Text | 509-516 | |
| Claudia Loitsch; Gerhard Weber | |||
| We investigate tactile representations and haptic interaction that may
enable blind people to utilize UML diagrams by using an industry standard
editor. In this paper we present a new approach to present tactile UML diagrams
by preserving spatial information on a touch-sensitive tactile display.
Furthermore we present the results of a fundamental evaluation showing that
blind people retain orientation during exploration of tactile diagrams and
which problems are associated with the usage of ideographs. We compared our newly
developed representation with the common way blind people utilize sequence
diagrams: non-visually, through verbalization. We indicate problems for both
representations. Keywords: accessibility; blind; tactile graphics; UML diagrams; tactile graphics
display; Braille display; screen reader | |||
| Non-visual Presentation of Graphs Using the Novint Falcon | | BIBAK | Full-Text | 517-520 | |
| Reham Alabbadi; Peter Blanchfield; Maria Petridou | |||
| Several technological advances have contributed to providing non-visual
access to information by individuals who have sight impairments. Screen readers
and Braille displays, however, are not the means of choice for conveying
pictorial data such as graphs, maps, and charts. This paper thus proposes the
"Falcon Graph" interface which has been developed to enable visually impaired
individuals to access computer-based visualisation techniques: mainly pie
charts, bar charts, and line graphs. In addition to its interaction with
Microsoft Excel, the interface uses the Novint Falcon as the main force
feedback device to navigate the haptic virtual environment. Initial findings
gathered from testing the interface are also presented. Keywords: Haptic Technology; Force Feedback; Accessible Graphs; Novint Falcon; Visual
Impairment | |||
| Towards a Geographic Information System Facilitating Navigation of Visually Impaired Users | | BIBAK | Full-Text | 521-528 | |
| Slim Kammoun; Marc J. -M. Macé; Bernard Oriola; Christophe Jouffrais | |||
| In this paper, we propose some adaptations to Geographical Information System
(GIS) components used in GPS-based navigation systems. In our design process, we
adopted a user-centered design approach in collaboration with final users and
Orientation and Mobility (O&M) instructors. A database scheme is presented
to integrate the principal classes proposed by users and O&M instructors.
In addition, some analytical tools are also implemented and integrated in the
GIS. This adapted GIS can improve the guidance process of existing and future
Electronic Orientation Aids (EOAs). A first implementation of an adapted
guidance process allowing a better
representation of the surroundings is provided as an illustration of this
adapted GIS. This work is part of the NAVIG system (Navigation Assisted by
Artificial VIsion and GNSS), an assistive device, whose aim is to improve the
Quality of Life of Visually Impaired (VI) persons via increased orientation and
mobility capabilities. Keywords: Geographical Information System; Electronic Orientation Aids; Participatory
design; Assistive technology | |||
| Combination of Map-Supported Particle Filters with Activity Recognition for Blind Navigation | | BIBAK | Full-Text | 529-535 | |
| Bernhard Schmitz; Attila Györkös; Thomas Ertl | |||
| By combining activity recognition with a map-supported particle filter, we
were able to significantly improve the positioning of our navigation system
for blind people. The activity recognition
recognizes walking forward or backward, or ascending or descending stairs. This
knowledge is combined with knowledge from the maps, i.e. the location of
stairs. Different implementations of the particle filter were evaluated
regarding their ability to compensate for sensor drift. Keywords: Pedestrian Navigation; Indoor Navigation; Activity Recognition; Particle
Filter | |||
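One way to picture the combination is sketched below: the recognized activity (level walking vs. stairs) is used both in the motion update and to reweight particles against the map, so that particles that land on (or off) mapped staircase cells inconsistently with the detected activity are suppressed. The grid map, noise levels and weighting scheme are invented for illustration and are not the authors' implementation.

```python
# Toy 2D map-supported particle filter where an activity label
# ("walk" / "stairs") down-weights particles inconsistent with the map.
# Map, noise levels and weights are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
# 0 = free floor, 1 = staircase cell (hypothetical 10 x 10 grid, 1 m cells)
grid = np.zeros((10, 10), dtype=int)
grid[4:6, 7] = 1

N = 500
particles = rng.uniform(0, 10, size=(N, 2))      # (x, y) positions in metres
weights = np.full(N, 1.0 / N)

def step(particles, weights, heading, distance, activity):
    # Motion update with assumed Gaussian noise.
    noise = rng.normal(scale=0.15, size=particles.shape)
    particles = particles + distance * np.array([np.cos(heading), np.sin(heading)]) + noise
    particles = np.clip(particles, 0, 9.99)
    # Measurement update: compare the activity label with the map cell type.
    on_stairs = grid[particles[:, 1].astype(int), particles[:, 0].astype(int)] == 1
    likelihood = np.where(on_stairs == (activity == "stairs"), 1.0, 0.05)
    weights = weights * likelihood
    weights /= weights.sum()
    # Resampling (simplified multinomial resampling).
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)

particles, weights = step(particles, weights, heading=0.0, distance=1.0, activity="walk")
print("position estimate:", particles.mean(axis=0))
```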
| AccessibleMap | | BIBAK | Full-Text | 536-543 | |
| Höckner Klaus; Daniele Marano; Julia Neuschmid; Manfred Schrenk; Wolfgang Wasserburger | |||
| Today, cities can be discovered easily with the help of web-based maps. Such
maps help users discover streets, squares and districts by supporting
orientation, mobility and a feeling of safety. Nevertheless, online maps still
belong to those elements of the web that are hardly, or not at all, accessible
to partially sighted people. The main objective of the AccessibleMap project is
therefore to develop methods for designing web-based city maps in a way that
they can be better used by people with limited sight or blindness in several
areas of daily life. Keywords: Accessible Maps; Semantic description of maps; Web Map Services; Styled
Layer Description | |||
| Design and User Satisfaction of Interactive Maps for Visually Impaired People | | BIBAK | Full-Text | 544-551 | |
| Anke Brock; Philippe Truillet; Bernard Oriola; Delphine Picard; Christophe Jouffrais | |||
| Multimodal interactive maps are a solution for presenting spatial
information to visually impaired people. In this paper, we present an
interactive multimodal map prototype that is based on a tactile paper map, a
multi-touch screen and audio output. We first describe the different steps for
designing an interactive map: drawing and printing the tactile paper map,
choice of multi-touch technology, interaction technologies and the software
architecture. Then we describe the method used to assess user satisfaction. We
provide data showing that an interactive map -- although based on a unique,
elementary, double tap interaction -- has been met with a high level of user
satisfaction. Interestingly, satisfaction is independent of a user's age,
previous visual experience or Braille experience. This prototype will be used
as a platform to design advanced interactions for spatial learning. Keywords: blind; visual impairment; accessibility; interactive map; tactile map;
multi-touch; satisfaction; SUS; usability | |||
| A Mobile Application Concept to Encourage Independent Mobility for Blind and Visually Impaired Students | | BIBAK | Full-Text | 552-559 | |
| Jukka Liimatainen; Markku Häkkinen; Tuula Nousiainen; Marja Kankaanranta; et al | |||
| This paper presents a user-centric development process for a mobile
application for blind and visually impaired students. The development process
brings together assistive technology experts, teachers and students from a
school for the visually impaired to participate in the design of the mobile
application. The data for the analysis is gathered from interviews and
workshops with the target group. The main goal of the project is to examine how
a mobile application can be used to encourage and motivate visually impaired
students to move independently indoors and outdoors. The application allows the
students to interact with their environment through use of sensor technology
now standard on most smart and feature phones. We present a user-centric
application development process, report on findings from the initial user
trials, and propose a framework for future phases of the project. Keywords: Mobile phone application; interactive technologies; blind; visually
impaired; accessibility | |||
| Do-It-Yourself Object Identification Using Augmented Reality for Visually Impaired People | | BIBAK | Full-Text | 560-565 | |
| Atheer S. Al-Khalifa; Hend S. Al-Khalifa | |||
| In this paper, we present a Do-It-Yourself (DIY) application for helping
Visually Impaired People (VIP) identify objects in their day-to-day interaction
with the environment. The application uses the Layar™ Augmented Reality
(AR) API to build a working prototype for identifying grocery items. The
initial results of using the application show positive acceptance from the VIP
community. Keywords: Augmented Reality; Visually Impaired; Object Identification; Layar™ ;
Assistive Technology | |||
| An Assistive Vision System for the Blind That Helps Find Lost Things | | BIBAK | Full-Text | 566-572 | |
| Boris Schauerte; Manel Martinez; Angela Constantinescu; Rainer Stiefelhagen | |||
| We present a computer vision system that helps blind people find lost
objects. To this end, we combine color- and SIFT-based object detection with
sonification to guide the hand of the user towards potential target object
locations. This way, we are able to guide the user's attention and effectively
reduce the space in the environment that needs to be explored. We verified the
suitability of the proposed system in a user study. Keywords: Lost & Found; Computer Vision; Sonification; Object Detection &
Recognition; Visually Impaired; Blind | |||
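A bare-bones sketch of the detection half of such a system: SIFT keypoints from a stored reference photo of the target object are matched against the current camera frame with a ratio test, and the matched region's horizontal offset is mapped to a stereo-pan value that a sonification layer could play. OpenCV with SIFT support is assumed; the file names, thresholds and pan mapping are placeholders, and the color-based detector and the actual audio guidance of the paper are omitted.

```python
# Sketch: SIFT-based localisation of a known object in a camera frame and
# mapping its horizontal position to a stereo pan value for sonification.
# File names, thresholds and the pan mapping are illustrative assumptions.
import cv2
import numpy as np

reference = cv2.imread("target_object.jpg", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.jpg", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp_ref, des_ref = sift.detectAndCompute(reference, None)
kp_frm, des_frm = sift.detectAndCompute(frame, None)

matcher = cv2.BFMatcher(cv2.NORM_L2)
matches = matcher.knnMatch(des_ref, des_frm, k=2)
good = []
for pair in matches:
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:  # Lowe ratio test
        good.append(pair[0])

if len(good) >= 10:
    pts = np.float32([kp_frm[m.trainIdx].pt for m in good])
    cx, cy = pts.mean(axis=0)                     # rough object centre in the frame
    pan = 2.0 * cx / frame.shape[1] - 1.0         # -1 = far left, +1 = far right
    print(f"object at ({cx:.0f}, {cy:.0f}); play guidance tone with pan {pan:+.2f}")
else:
    print("object not found in this frame")
```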
| Designing a Virtual Environment to Evaluate Multimodal Sensors for Assisting the Visually Impaired | | BIBA | Full-Text | 573-580 | |
| Wai L. Khoo; Eric L. Seidel; Zhigang Zhu | |||
| We describe how to design a virtual environment using Microsoft Robotics Developer Studio in order to evaluate multimodal sensors for assisting visually impaired people in daily tasks such as navigation and orientation. The work focuses on the design of the interfaces of sensors and stimulators in the virtual environment for future subject experimentation. We discuss what types of sensors we have simulated and define some non-classical interfaces to interact with the environment and get feedback from it. We also present preliminary feasibility results from experiments with volunteer test subjects, concluding with a discussion of potential future directions. | |||
| A Segmentation-Based Stereovision Approach for Assisting Visually Impaired People | | BIBA | Full-Text | 581-587 | |
| Hao Tang; Zhigang Zhu | |||
| An accurate 3D map, automatically generated in real-time from a camera-based stereovision system, is able to assist blind or visually impaired people to obtain correct perception and recognition of the surrounding objects and environment so that they can move safely. In this paper, a segmentation-based stereovision approach is proposed to rapidly obtain accurate 3D estimations of man-made scenes, both indoor and outdoor, with largely textureless areas and sharp depth changes. The new approach takes advantage of the fact that many man-made objects in an urban environment consist of planar surfaces. The final outcome of the system is not just an array of individual 3D points. Instead, the 3D model is built in a geometric representation of plane parameters, with geometric relations among different planar surfaces. Based on this 3D model, algorithms can be developed for traversable path planning, obstacle detection and object recognition for assisting the blind in urban navigation. | |||
| KinDectect: Kinect Detecting Objects | | BIBA | Full-Text | 588-595 | |
| Atif Khan; Febin Moideen; Juan Lopez; Wai L. Khoo; Zhigang Zhu | |||
| Detecting humans and objects in images has been a very challenging problem due to variation in illumination, pose, clothing, background and other complexities. Depth information is an important cue when humans recognize objects and other humans. In this work we utilize the depth information that a Kinect sensor -- Xtion Pro Live provides to detect humans and obstacles in real time for a blind or visually impaired user. The system runs in two modes. For the first mode, we focus on how to track and/or detect multiple humans and moving objects and transduce the information to the user. For the second mode, we present a novel approach on how to avoid obstacles for safe navigation for a blind or visually-impaired user in an indoor environment. In addition, we present a user study with some blind-folded users to measure the efficiency and robustness of our algorithms and approaches. | |||
| A System Helping the Blind to Get Merchandise Information | | BIBA | Full-Text | 596-598 | |
| Nobuhito Tanaka; Yasunori Doi; Tetsuya Matsumoto; Yoshinori Takeuchi; Hiroaki Kudo; et al | |||
| We propose a system helping the blind to get the best-before/use-by date of perishable foods. The system consists of a computer, a wireless camera and an earphone. It processes images captured by the user and extracts character regions in the image using a Support Vector Machine (SVM). The regions are then processed by Optical Character Recognition (OCR), and the system outputs the best-before/use-by date as synthesized speech. | |||
| Accessibility for the Blind on an Open-Source Mobile Platform | | BIBAK | Full-Text | 599-606 | |
| Norbert Markus; Szabolcs Malik; Zoltan Juhasz; András Arató | |||
| As Android handsets keep flooding the shops in a wide range of prices and
capabilities, many in the blind community turn their attention to this emerging
alternative, especially because of the plethora of cheaper models on offer.
Earlier, accessibility experts only recommended Android phones sporting an
inbuilt QWERTY keyboard, as the touch-screen support had then been in an
embryonic state. Since late 2011, with Android 4.x (ICS), this has changed.
However, most handsets on the market today -- especially the cheaper ones --
ship with a pre-ICS Android version. This means that their visually impaired
users won't be able to enjoy the latest accessibility innovations. Porting
MObile SlateTalker to Android has been aimed at filling this accessibility gap
with a low-cost solution that addresses the special needs of our target audience:
the elderly, persons with minimal tech skills and active Braille users. Keywords: (e)Accessibility; Blind People; Assistive Technology; Braille; Usability and
Ergonomics; (e)Aging and Gerontechnology; Mobility; Android | |||
| Accessibility of Android-Based Mobile Devices: A Prototype to Investigate Interaction with Blind Users | | BIBAK | Full-Text | 607-614 | |
| Sarah Chiti; Barbara Leporini | |||
| The study presented in this paper is part of mobile accessibility research
with particular reference to the interaction with touch-screen based
smartphones. Its aim was to gather information, tips and indications on
interaction with a touch-screen by blind users. To this end we designed and
developed a prototype for an Android-based platform. Four blind users (two
inexperienced and two with experience of smartphones) were involved from the
early phase of prototype design. The involvement of inexperienced users played
a key role in understanding expectations of smart phones especially concerning
touch-screen interaction. Skilled users provided useful suggestions on crucial
aspects such as gestures and button position. Although the prototype developed
is limited to only a few features for the Android operating system, the results
obtained from blind user interaction can be generalized and applied to any
mobile device based on a touch-screen. Thus, the results of this work could be
useful to developers of mobile operating systems and applications based on a
touch-screen, in addition to those working on designing and developing
assistive technologies. Keywords: mobile accessibility; touch-screen; blind users | |||
| TypeInBraille: Quick Eyes-Free Typing on Smartphones | | BIBA | Full-Text | 615-622 | |
| Sergio Mascetti; Cristian Bernareggi; Matteo Belotti | |||
| In recent years, smartphones (e.g., Apple iPhone) are getting more and more widespread among visually impaired people. Indeed, thanks to natively available screen readers (e.g., VoiceOver) visually impaired persons can access most of the smartphone functionalities and applications. Nonetheless, there are still some operations that require long time or high mental workload to be completed by a visually impaired person. In particular, typing on the on-screen QWERTY keyboard turns out to be challenging in many typical contexts of use of mobile devices (e.g., while moving on a tramcar). In this paper we present the results of an experimental evaluation conducted with visually impaired people to compare the native iPhone on-screen QWERTY keyboard with TypeInBraille, a recently proposed typing technique based on Braille. The experimental evaluation, conducted in different contexts of use, highlights that TypeInBraille significantly improves typing efficiency and accuracy. | |||
| Real-Time Display Recognition System for Visually Impaired | | BIBAK | Full-Text | 623-629 | |
| Irati Rasines; Pedro Iriondo; Ibai Díez | |||
| Currently, electronic devices that incorporate displays to present
information to the user are ubiquitous, and visually impaired people may have
problems using these devices. This article focuses on developing a real-time
display detection and digital character recognition application using
techniques based on the connected-components approach. The display zone
detection accuracy rate is about 85% and the recognition rate is greater than 88%.
The system was implemented on both a desktop and a cell phone. Keywords: Display; OCR; Connected Components; Android; Real-time | |||
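The connected-components idea can be sketched roughly as below: threshold the image, extract components with OpenCV, keep the character-sized blobs, and hand the cropped display region to an off-the-shelf OCR engine. The size filters and the use of pytesseract are assumptions standing in for the authors' own detector and recognizer.

```python
# Sketch: connected-component filtering of a thresholded image to isolate
# character-sized blobs on a digital display, then OCR on the cropped region.
# Size limits and the use of pytesseract are illustrative assumptions.
import cv2
import pytesseract

img = cv2.imread("device_photo.jpg", cv2.IMREAD_GRAYSCALE)
_, binary = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

n, labels, stats, _ = cv2.connectedComponentsWithStats(binary, connectivity=8)
boxes = []
for i in range(1, n):                                  # label 0 is the background
    x, y, w, h, area = stats[i]
    if 8 <= h <= 120 and 4 <= w <= 100 and area > 30:  # assumed character size range
        boxes.append((x, y, w, h))

if boxes:
    # Bounding box of all character-like components = candidate display zone.
    x0 = min(x for x, _, _, _ in boxes)
    y0 = min(y for _, y, _, _ in boxes)
    x1 = max(x + w for x, _, w, _ in boxes)
    y1 = max(y + h for _, y, _, h in boxes)
    display = img[y0:y1, x0:x1]
    print(pytesseract.image_to_string(display, config="--psm 7"))  # single text line
```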
| A Non-visual Interface for Tasks Requiring Rapid Recognition and Response | | BIBAK | Full-Text | 630-635 | |
| Kazunori Minatani; Tetsuya Watanabe | |||
| To implement a user interface for blind people, auditory and tactile outputs
have mainly been used. However, an auditory interface is ineffective for tasks
that require the rapid recognition that vision enables. Thus, this paper
presents a method to achieve rapid recognition with a non-visual user
interface. This user interface is implemented in a prototype system that makes
an RC helicopter fully controllable by blind people, using a braille display as
a tactile output device. This paper also explains the system integration
software, named brl-drone, and the hardware components of that system,
including the AR.Drone, a remote-controlled helicopter, as well as an auxiliary
magnetic sensor and a game controller used to solve the problems that arise
when a braille display is used as a tactile indicating device. Keywords: Blind People; Non-visual Interface; RC Helicopter; Braille Display; AR.
Drone | |||
| Reaching to Sound Accuracy in the Peri-personal Space of Blind and Sighted Humans | | BIBAK | Full-Text | 636-643 | |
| Marc J. -M. Macé; Florian Dramas; Christophe Jouffrais | |||
| With the aim of designing an assistive device for the Blind, we compared the
ability of blind and sighted subjects to accurately locate several types of
sounds generated in the peri-personal space. Despite a putative lack of
calibration of their auditory system with vision, blind subjects performed with
a similar accuracy as sighted subjects. The average error was sufficiently low
(10° in azimuth and 10 cm in distance) to orient a user towards a specific
goal or to guide a hand grasping movement to a nearby object. Repeated white
noise bursts of short duration induced better performance than continuous
sounds of similar total duration. These types of sound could be advantageously
used in an assistive device. They would provide indications about direction to
follow or position of surrounding objects, with limited masking of
environmental sounds, which are of primary importance for the Blind. Keywords: Sound localization; Blindness; assistive device; augmented reality | |||
| Hapto-acoustic Scene Representation | | BIBAK | Full-Text | 644-650 | |
| Sebastian Ritterbusch; Angela Constantinescu; Volker Koch | |||
| The use of the Phantom Omni force feedback device combined with sonification
is evaluated in applications for visually impaired people such as medical
engineering, numerical simulation, and architectural planning. Keywords: haptic; acoustic; force feedback; sonification; visually impaired | |||
| Efficient Access to PC Applications by Using a Braille Display with Active Tactile Control (ATC) | | BIBAK | Full-Text | 651-658 | |
| Siegfried Kipke | |||
| Braille displays provide tactile access to information shown on a screen.
The invention of Active Tactile Control (ATC) allows the tactile reading
position on a Braille display to be detected in real time. Based on ATC, new
computer interactions have been implemented. Braille frames allow the
simultaneous display of various independent sources of information on a Braille
display and are used to improve access to complex applications. A task overview
for handling multiple tasks, with direct access to the activated task triggered
by the reading position, has been implemented. A tactile notification of a
spelling mistake, triggered by the tactile reading position, provides blind
users with assistance when editing text. A new rule set for blind users' PC
interaction based on Active Tactile Control needs to be defined. Keywords: tactile reading position; Braille; computer access; Braille frames;
assistance; computer interaction; e-learning; blind PC users | |||
| Applications of Optically Actuated Haptic Elements | | BIBAK | Full-Text | 659-663 | |
| Branislav Mamojka; Peter Teplický | |||
| Commercially available large-area dynamic tactile displays providing blind
people with access to high-resolution graphics and Braille are missing; this
gap is not filled by currently available displays in the form of a Braille
line. The objective of the project NOMS (Nano-Optical Mechanical Systems) is to
solve this problem by using optically activated haptic actuators. These require
no hard-to-assemble moving mechanical parts and have the potential for finer
resolution. Recently developed carbon-nanotube-enriched photoactive polymers
provided the starting technology for this purpose. The development of materials
of this kind and their integration into tactile displays will be presented. Keywords: Braille; display; tactile; haptic; photo actuation | |||
| Trackable Interactive Multimodal Manipulatives: Towards a Tangible User Environment for the Blind | | BIBAK | Full-Text | 664-671 | |
| Muhanad S. Manshad; Enrico Pontelli; Shakir J. Manshad | |||
| This paper presents the development of Trackable Interactive Multi-modal
Manipulatives (TIMM). This system provides a multimodal tangible user
environment (TUE), enabling people with visual impairments to create, modify
and naturally interact with graphical representations on a multitouch surface.
The system supports a novel notion of active position, proximity, stacking, and
orientation tracking of manipulatives. The platform has been developed and it
is undergoing formal evaluation. Keywords: Haptic Feedback; Graphing; Accessibility; Blind and Visually Impaired;
Multitouch; Multimodal; TUI; Diagrams; Tangible User Environment (TUE); NASA
TLX; Subjective Workload; Manipulatives; Fiducials; Markers | |||
| Introduction of New Body-Braille Devices and Applications | | BIBAK | Full-Text | 672-675 | |
| Satoshi Ohtsuka; Nobuyuki Sasaki; Sadao Hasegawa; Tetsumi Harakawa | |||
| In this paper, two new Body-Braille devices are described. After the
Body-Braille system and its current development status are explained, first, a
new device for Braille-based real-time communication over the Internet (via
Skype) is introduced and, second, a new device for autonomous learning that
adopts wireless communication is explained. The former has already been
developed and is in the field-test stage; the latter is currently under
development. Keywords: Body-Braille; vibration; Helen Keller phone; autonomous learning; visually
impaired; deaf-blind | |||