| Gait Analysis Management and Diagnosis in a Prototype Virtual Reality Environment | | BIBA | Full-Text | 3-11 | |
| Salsabeel F. M. Alfalah; David K. Harrison; Vassilis Charissis | |||
| Current medical data derived from gait analysis and diagnosis of various musculoskeletal pathologies comprise a plethora of text-based and imaging data. The large volume and complexity of these data present a number of issues during the collection, storage, searching and visualisation processes for gait analysis management and diagnosis. In light of the above, it is evident that a simplified, holistic and user-friendly system is required in order to improve the acquisition and comparison of medical data in a timely manner. Further consultation with health professionals suggested that the proposed prototype should entail an automated system that can extract, save and visualise the data from different sources, in order to enhance medical data visualisation, increase efficiency and thus improve quality of service and management. This work presents the development stages of a new prototype system for managing medical data for gait analysis, which additionally offers simulation capacity in a Virtual Reality environment in order to assist medical practitioners towards a faster and better-informed evaluation of each condition. In particular, this paper investigates various methods of displaying medical data in a single application, with a view to managing and sharing multimedia data and to employing VR to enhance user interaction with medical data. Findings of a promising preliminary evaluation through user trials are also presented. In conclusion, the paper presents future plans to incorporate a bespoke 3D human-computer interface with a view to providing health professionals with customisable information and enhancing the interface functionalities. Finally, as the system is web-based, there is scope for expansion of the application to other areas of medical assessment involving complicated datasets. | |||
| Theory-Guided Virtual Reality Psychotherapies: Going beyond CBT-Based Approaches | | BIBAK | Full-Text | 12-21 | |
| Sheryl Brahnam | |||
| Most VR applications in mental health care have focused on cognitive
behavioral therapy. This paper is a call to expand research into other
theory-guided psychotherapy practices. Evidence is presented that supports the
so-called dodo bird effect, which contends that all bona fide psychotherapies are
equally effective. Two avenues for expanding research are suggested that focus
on VR strengths: creating VR playspaces (virtual environments where therapist
and client can engage playfully) and VR drama therapy.
Keywords: virtual reality; drama therapy; creative expression therapy; playspace; psychotherapy; dodo bird effect | |||
| Development of the Home Arm Movement Stroke Training Environment for Rehabilitation (HAMSTER) and Evaluation by Clinicians | | BIBA | Full-Text | 22-31 | |
| Elizabeth B. Brokaw; Bambi R. Brewer | |||
| Stroke commonly results in severe impairment of upper extremity function, which limits independence in activities of daily living. Continued and frequent use of the affected limb can result in increased function. However, long-term access to therapy is frequently limited, and home exercise compliance is low. This paper presents the design and clinician evaluation of a Kinect-based home therapy system called the Home Arm Movement Stroke Training Environment (HAMSTER). The development, which focused on reducing commonly observed impairments after stroke, is discussed. Additionally, the system was evaluated by twelve clinicians (occupational and physical therapists) with an average of 18 years of clinical experience with individuals with chronic stroke. The clinicians were asked about commonly prescribed home exercises and for feedback about the HAMSTER system. Although only two of the clinicians had used the Kinect previously, the clinicians reported good usability and general satisfaction with the system. All of the clinicians felt that HAMSTER would be beneficial for individuals with chronic stroke. | |||
| A Low Cost Virtual Reality System for Rehabilitation of Upper Limb | | BIBAK | Full-Text | 32-39 | |
| Pawel Budziszewski | |||
| The paper describes ongoing research aimed at creating a low-cost virtual-reality-based system for physical rehabilitation of the upper limb. The system is designed to assist in rehabilitation involving various kinds of limb movement, including precise hand movements and movement of the whole extremity. It can be used at the patient's home as a telerehabilitation device. It was decided to equip the system with motion tracking (Razer Hydra) and two alternative display devices: a head-mounted display (Sony HMZ-T1) and an LCD display with stereovision glasses (nVidia 3DVision). Custom software was developed to create the virtual reality environment and perform rehabilitation exercises. Three sample rehabilitation games were created to assess the rehabilitation system. In the preliminary research, the usability of the system was assessed by one patient. He was able to use the system for rehabilitation exercises; however, some problems with the usability of the Sony HMZ-T1 were identified. During the next stages of the research, an extended assessment of the system's usability and an assessment of its efficiency are planned.
Keywords: physical rehabilitation; virtual reality; serious games; home based rehabilitation; HMD; Razer Hydra; Sony HMZ-T1 | |||
| Super Pop VR™: An Adaptable Virtual Reality Game for Upper-Body Rehabilitation | | BIBAK | Full-Text | 40-49 | |
| Sergio García-Vergara; Yu-Ping Chen; Ayanna M. Howard | |||
| Therapists and researchers have studied the importance of virtual reality
(VR) environments in physical therapy interventions for people with different
conditions such as stroke, Parkinson's disease, and cerebral palsy. Most of
these VR systems do not integrate clinical assessment of outcome measures as an
automated objective of the system. Moreover, these systems do not allow
real-time adjustment of the system characteristics that is necessary to
individualize the intervention. We discuss a new VR game designed to improve
upper-arm motor function through repetitive arm exercises. An automated method
is used to extract outcome measures of upper extremity movements using the
Fugl-Meyer assessment methodology. The accuracy of the system was validated
based on trials with eighteen adult subjects. With an average assessment error of less than 5%, the developed system shows promise as a tool for therapists to use in individualizing the intervention for individuals with upper-body motor impairments.
Keywords: Cerebral Palsy; virtual reality gaming environment; Fugl-Meyer assessment; physical therapy and rehabilitation | |||
| Asynchronous Telemedicine Diagnosis of Musculoskeletal Injuries through a Prototype Interface in Virtual Reality Environment | | BIBAK | Full-Text | 50-59 | |
| Soheeb Khan; Vassilis Charissis; David Harrison; Sophia Sakellariou; Warren Chan | |||
| Telehealth provides a much-needed option for the remote diagnosis and monitoring of various pathologies and patients. Remote provision of health care can offer twofold support for the medical system and for patients: primarily, it could serve isolated locations, and secondly, it could allow a large number of outpatient cases to be monitored directly in their homes instead of on hospital premises. However, in specific cases, direct communication and visual data acquisition can be a major obstacle. To this end, we have developed a prototype system that could enable medical practitioners to perform real-time diagnosis through 3D captured visual and motion data. These data are recreated in a Virtual Reality environment within the hospital facilities, offering a unique system for remote diagnosis. This paper presents the design considerations and development process of the system and discusses the preliminary results from the system evaluation. The paper concludes with a tentative plan of future work, which aims to provide medical practitioners and patients with a complete interface that can acquire gait data and thus analyse a large variety of musculoskeletal pathologies.
Keywords: Virtual Reality; HCI; 3D Visualization; Asynchronous Diagnosis; Telemedicine; Motion Capture | |||
| Developing a Theory-Informed Interactive Animation to Increase Physical Activity among Young People with Asthma | | BIBAK | Full-Text | 60-65 | |
| Jennifer Murray; Brian Williams; Gaylor Hoskins; John McGhee; Dylan Gauld; Gordon Brown | |||
| This paper describes the development of a theory-informed interactive animation which aims to increase levels of physical activity in young people with asthma. The project adopts a multi-disciplinary theoretical perspective, applying knowledge from applied health research, human-centred design and psychology in order to best approach and develop a meaningful and effective health intervention.
Keywords: Asthma; interactive animation; multidisciplinary; theory-informed | |||
| The Design Considerations of a Virtual Reality Application for Heart Anatomy and Pathology Education | | BIBAK | Full-Text | 66-73 | |
| Victor Nyamse; Vassilis Charissis; J. David Moore; Caroline Parker; Soheeb Khan; Warren Chan | |||
| Anatomy and pathology of the human body are complex subjects that cannot easily be elucidated to medical students through traditional description and illustration methods. The proposed interactive system aims to present clear information on demand. To further enhance three-dimensional understanding of the anatomical information, a virtual reality environment was developed to accommodate different 3D models of the human body. In this case we opted for the heart model, as it presents a unique section of the body that can produce motion and sound. The produced model was further simplified for use by patients who wish to better understand the generic anatomy and typical pathologies of the heart. Additionally, the paper presents the results of the system evaluation performed by ten users. The derived results, although promising, highlighted some benefits and drawbacks of the proposed system that we aim to improve in the near future. Finally, the paper concludes with a plan of future work, which will entail further interactivity through audio incorporation and gesture recognition.
Keywords: Virtual Reality; HCI; 3D Visualization; Heart Disease; Anatomy; Pathology | |||
| Human-Computer Confluence for Rehabilitation Purposes after Stroke | | BIBAK | Full-Text | 74-82 | |
| Rupert Ortner; David Ram; Alexander Kollreider; Harald Pitsch; Joanna Wojtowicz; Günter Edlinger | |||
| In this publication, we present a Motor Imagery (MI) based Brain-Computer
Interface (BCI) for neurologic rehabilitation. The BCI is able to control two
different feedback devices. The first one is a rehabilitation robot, moving the
fingers of the affected hand according to the detected MI. The second one
presents feedback via virtual reality (VR) to the subject, visualizing two hands that the user sees from a first-person perspective, which open and close according to the detected MI. Four healthy users participated in tests with the rehabilitation robot, and eleven post-stroke patients and eleven healthy users participated in tests with the VR system. We present all subjects' control accuracy, including a comparison between healthy users and people who have suffered a stroke. Five of the stroke patients also agreed to participate in further sessions, in which we explored possible improvements in accuracy due to training effects.
Keywords: Medical and healthcare; Applications: Rehabilitation | |||
| Projected AR-Based Interactive CPR Simulator | | BIBAK | Full-Text | 83-89 | |
| Nohyoung Park; Yeram Kwon; Sungwon Lee; Woontack Woo; Jihoon Jeong | |||
| In this paper, we propose a new approach of a cardiopulmonary resuscitation
(CPR) simulation system that exploits both AR-based visualization and embedded
hardware sensing techniques. The proposed system provides real-time interactive
visual feedback to the CPR trainee with the projected AR indicator plane that
visualizes results of an interlocking signal of the trainee's actions using
embedded sensors. This system also provides proper guidelines about the CPR
trainee's posture by detecting a user's articular pose from a RGB-D camera in
real-time. Our implementation shows that the system provides interactive feedback, enabling a more accurate and effective training experience for the trainee, and is more cost-effective than traditional CPR training systems.
Keywords: CPR simulation; AR-based CPR simulator; augmented reality training system; augmented reality simulation; projected augmented reality | |||
| Affecting Our Perception of Satiety by Changing the Size of Virtual Dishes Displayed with a Tabletop Display | | BIBAK | Full-Text | 90-99 | |
| Sho Sakurai; Takuji Narumi; Yuki Ban; Tomohiro Tanikawa; Michitaka Hirose | |||
| In this paper, we propose a tabletop system for affecting our perception of satiety and controlling energy intake by controlling the size of an image projected around the food. We hypothesized that the ambiguity of the perception of satiety can be exploited to control food intake. Given that estimating portion size is often a relative judgment, apparent food volume is assessed according to the size of neighboring objects such as cutlery. In particular, the effect of dish size on food intake has been debated. Based on this knowledge, we constructed a tabletop system which projects virtual dishes around the food placed on it, in order to change the assessed apparent food volume interactively. Our results suggest that the size of the virtual dish changes the perception of satiety and the amount of food consumed.
Keywords: Augmented Satiety; Human Food Interaction; Cross-modal Interaction; Augmented reality; Food Consumption | |||
| An Experience on Natural Sciences Augmented Reality Contents for Preschoolers | | BIBAK | Full-Text | 103-112 | |
| Antonia Cascales; Isabel Laguna; David Pérez-López; Pascual Perona; Manuel Contero | |||
| Early education is a key element for the future success of students in the
education system. This work analyzes the feasibility of using augmented reality
contents with preschool students (four and five years old) as a tool for
improving their learning process. A quasi-experimental, nonequivalent-groups posttest-only design was used. A didactic unit was developed around the topic "animals" by the participating teachers. The control group followed all the didactic activities defined in the developed didactic materials, while the experimental group was additionally provided with some augmented reality contents. Results show improved learning outcomes in the experimental group with respect to the control group.
Keywords: augmented reality; preschool; teaching/learning process | |||
| Teaching 3D Arts Using Game Engines for Engineering and Architecture | | BIBAK | Full-Text | 113-121 | |
| Jaume Duran; Sergi Villagrasa | |||
| The main objective of this paper is to evaluate the application of 3D virtual worlds for teaching different subjects, mainly oriented towards architectural visualization and creating 3D models for multimedia. The use of 3D technologies, multi-user virtual environments and avatars constitutes a new methodology that gives students a much richer, and therefore more motivating, experience, a deeper understanding of the coursework, and a more collaborative grasp of the projects. In this paper we address the concepts of e-learning and blended learning with technologies related to interactive 3D spaces such as OpenSim, Activeworlds, Secondlife, Unity and others. The students' participation in these virtual 3D environments helps them understand the concept of an architectural project and 3D creation, improves collaboration between students and teacher, and dramatically increases both their understanding of the project and their involvement in design development. The paper describes the method of teaching 3D arts using game engines such as Unity.
Keywords: Virtual reality; Game engines; Visual learning | |||
| The Characterisation of a Virtual Reality System to Improve the Quality and to Reduce the Gap between Information Technology and Medical Education | | BIBAK | Full-Text | 122-131 | |
| Jannat Falah; David K. Harrison; Vassilis Charissis; Bruce M. Wood | |||
| Contemporary medical training is hindered by an excessive amount of
information provided to students through mainly traditional teaching methods
yet the younger generations are accustomed to digital data and information
on-demand. As such they have developed a fully customised manner of learning,
which in turn requires a new, innovative and equally customised teaching
method. This inherited customisation and accelerated manner of learning stems
from contemporary lifestyle trends. As such, a reduced learning curve requires
innovative and efficient teaching methods, which comply with existing
curriculums, yet facilitate the contemporary learning mantra. In particular, medical education requires a plethora of information related to the understanding of spatial relations and the three-dimensionality of the human body. Previous studies successfully employed Virtual Reality (VR) and high-fidelity patient simulation in order to improve and enhance medical education and clinical training. The benefits of this technological adoption in the teaching field include safer experimentation environments and reduced time and cost. Furthermore, Virtual Reality facilities and systems can be extensively customised at relatively low cost and be re-used for various applications. The purpose of this paper is to identify the differences between current education methods and the proposed technology. This research will examine current teaching trends and attempt to provide recommendations based on a University of Jordan case study. Overall, the paper describes the design process of the survey questionnaire that was used for this evaluation and provides valuable insights to both academics and practitioners regarding the potential benefits and drawbacks of adopting such a system.
Keywords: Applications: Education; Virtual Reality; system characterisation; medical education; Middle East | |||
| A Mobile Personal Learning Environment Approach | | BIBAK | Full-Text | 132-141 | |
| Francisco José García-Peñalvo; Miguel Ángel Conde; Alberto Del Pozo | |||
| Learning and teaching processes are not restricted to an institution or a period of time. A person can learn from experience, from interaction with peers, because he or she has a personal interest in something, and so on. Many such learning activities are today mediated by Information and Communication Technologies, which allow users to decide which tools and contexts to use to learn. However, for these learning activities to be taken into account, they must be visible to the institutions. In this paper a service-based framework to facilitate this is presented. It focuses especially on the communication between mobile devices used as learning tools and traditional institutional learning platforms. The framework is implemented as an Android solution and tested by students. From these tests, it can be seen that a mobile Personal Learning Environment is feasible and that its use motivates students' participation in learning activities.
Keywords: Mobile Learning; Mobile Devices; Personal Learning Environments; Android; Web Services; Interoperability | |||
| Perceived Presence's Role on Learning Outcomes in a Mixed Reality Classroom of Simulated Students | | BIBAK | Full-Text | 142-151 | |
| Aleshia T. Hayes; Stacey E. Hardin; Charles E. Hughes | |||
| This research is part of an ongoing effort to evaluate the efficacy and user experience of TLE TeachLivE™, a 3D mixed reality classroom with simulated
students used to facilitate virtual rehearsal of pedagogical skills by
teachers. This research investigated a potential relationship between efficacy,
in terms of knowledge acquisition and transfer, and user experience in regard
to presence, suspension of disbelief, and immersion. The initial case studies
examining user experience of presence, suspension of disbelief, and immersion
were used to develop a presence questionnaire revised from the work of Witmer
and Singer (1998) to address the TLE TeachLivE™ mixed reality
environment. The findings suggest that targeted practice, authentic scenarios,
and suspension of disbelief in virtual learning environments may impact learning.
Keywords: Mixed Reality Classroom; Simulation; Presence; Suspension of Disbelief; Immersion; Engagement; Knowledge Acquisition; Virtual Learning | |||
| The Building as the Interface: Architectural Design for Education in Virtual Worlds | | BIBAK | Full-Text | 152-161 | |
| Luis Antonio Hernández Ibáñez; Viviana Barneche Naya | |||
| This paper focuses on architectural spatial design for virtual three-dimensional learning environments through the lens of a case study. The work
describes the design methodology of a flexible and interactive set of virtual
constructions where the architecture itself acts as a dynamic interface whose
spaces adapt to the activities that avatars carry out in their interior,
sometimes interacting with them. This approach considers the multiple
innovative parameters that have to be taken into account in the process of cyberarchitectural design.
Keywords: Metaverses; Virtual Worlds; Cyberarchitecture; V-Learning | |||
| Mixed Reality Space Travel for Physics Learning | | BIBAK | Full-Text | 162-169 | |
| Darin E. Hughes; Shabnam Sabbagh; Robb Lindgren; J. Michael Moshell; Charles E. Hughes | |||
| In this paper we describe research being conducted on a mixed reality
simulation called MEteor that is designed for informal physics learning in
science centers. MEteor is a 30 x 10 foot floor area where participants use
their bodies to interact with projected astronomical imagery. Participants walk
and run across the floor to simulate how objects move in space, and to enact
basic physics principles. Key to the success of this learning environment is an
interface scheme that supports the central metaphor of "child as asteroid."
Using video data collected in our studies we examine the extent to which
feedback mechanisms and interface conventions strengthened the metaphorical
connection, and we describe ways the interaction design can be improved for future iterations.
Keywords: STEM; mixed reality; whole-body learning; informal education; physics simulation | |||
| Picking Up STEAM: Educational Implications for Teaching with an Augmented Reality Guitar Learning System | | BIBAK | Full-Text | 170-178 | |
| Joseph R. Keebler; Travis J. Wiltshire; Dustin C. Smith; Stephen M. Fiore | |||
| Incorporation of the arts into the current model of science, technology, engineering, and mathematics (STEM), yielding STEAM, may have a profound impact on the future
of education. In light of this, we examined a novel technology at the
intersection of these disciplines. Specifically, an experiment was conducted
using augmented reality to learn a musical instrument, namely the guitar. The
Fretlight® guitar system uses LED lights embedded in the fretboard to give
direct information to the guitarist as to where to place their fingers. This
was compared to a standard scale diagram. Results indicate that the
Fretlight® system led to initial significant gains in performance over a
control condition using diagrams, but these effects disappeared over the course
of 30 trials. Potential benefits of the augmented reality technology are
discussed, and future work is outlined to better understand how embodied
cognition and augmented reality can increase learning outcomes for playing musical instruments.
Keywords: STEAM; augmented reality; embodied learning; music education; Fretlight® guitar | |||
| Virtual Reality Data Visualization for Team-Based STEAM Education: Tools, Methods, and Lessons Learned | | BIBAK | Full-Text | 179-187 | |
| Daniel F. Keefe; David H. Laidlaw | |||
| We present a discussion of tools, methods, and lessons learned from nearly
ten years of work using virtual reality data visualization as a driving problem
area for collaborative practice-based STEAM education. This work has spanned
multiple universities and design colleges. It has resulted in courses taught to
both students majoring in computer science and students majoring in art or
design. Within the classroom, an important aspect of our approach is including
art and design students directly in real scientific research, often extending beyond the computer science aspects of data visualization to also include the research of collaborators in biology, medicine, and engineering who provide
cutting-edge data visualization challenges. The interdisciplinary team-based
education efforts have also extended beyond the classroom as art and design
students have participated in our labs as research assistants and made major
contributions to published scientific research. In some cases, these experiences have shaped students' career paths.
Keywords: STEAM; art; science; computer science; education; virtual reality; visualization | |||
| Architectural Geo-E-Learning | | BIBAK | Full-Text | 188-197 | |
| Ernest Redondo; Albert Sánchez Riera; David Fonseca; Alberto Peredo | |||
| This work addresses the implementation of a mobile Augmented Reality (AR) browser in educational environments. We seek to analyze new, non-traditional educational tools and methodologies to improve students' academic performance, commitment and motivation. The basis of our claim lies in the skills improvement that students can achieve thanks to their innate affinity for the digital media features of new smartphones. We worked with the Layar platform for mobile devices to create virtual information channels through a database associated with 3D virtual models and other types of media content. The teaching experience was carried out with Master of Architecture students and developed in two subjects focused on the use of ICT and Urban Design. We call it Geo-e-learning because of the use of new e-learning strategies and methodologies that incorporate geolocation, allowing students' own proposals to be received, shared and evaluated on site.
Keywords: Augmented reality; E-Learning; Geo-Elearning; Urban Planning; Educational research | |||
| Mixed Reality Environment for Mission Critical Systems Servicing and Repair | | BIBAK | Full-Text | 201-210 | |
| Andrea F. Abate; Fabio Narducci; Stefano Ricciardi | |||
| Mixed Reality (MR) technologies may play an important role in assisting
on-site operators during maintenance and repair activities. Nevertheless,
industrial equipment augmentation requires a high level of precision when
co-registering virtual objects to the corresponding real counterparts. In this
paper we describe a comprehensive proposal for a mixed reality environment aimed at improving the effectiveness of servicing and repair procedures in mission-critical systems, while reducing the time required for the
intervention. The tracking of the user's point of view exploits a multi-marker
based solution for robust and precise augmentation of the operating field. The
architecture also features a diminishing visualization strategy allowing the
user to see only the fraction of real equipment that is relevant for the
maintenance task. Color-based finger tracking provides powerful interaction capabilities by means of a non-instrumented interface exploiting colored fingertip caps. An evaluation study of the proposed MR environment, performed by technicians with no previous experience of MR systems, highlights the potential of the approach.
Keywords: Mixed reality; diminished reality; finger based interaction; AR based maintenance | |||
| Establishing Workload Manipulations Utilizing a Simulated Environment | | BIBAK | Full-Text | 211-220 | |
| Julian Abich IV; Lauren Reinerman-Jones; Grant Taylor | |||
| Research seeking to improve the measurement of workload requires the use of
established task load manipulations to impose varying levels of demand on human
operators. The present study sought to establish task load manipulations for
research utilizing realistically complex task environments that elicit distinct
levels of workload (i.e. low, medium, and high). A repeated measures design was
used to test the effects of various demand manipulations on performance and
subjective workload ratings using the NASA-Task Load Index (TLX) and
Instantaneous Self-Assessment technique (ISA). This experiment successfully
identified task demand manipulations that can be used to investigate operator
workload within realistically complex environments. Results revealed that the
event rate manipulations had the most consistent impact on performance and
subjective workload ratings in both tasks, with each eliciting distinct levels of workload.
Keywords: Workload; simulated environments; complex systems; signal detection; change blindness | |||
| Interactive Virtual Reality Shopping and the Impact in Luxury Brands | | BIBAK | Full-Text | 221-230 | |
| Samar Altarteer; Vassilis Charissis; David Harrison; Warren Chan | |||
| This paper investigates the impact of human-computer interaction in a virtual reality online shopping interface on the consumer experience. In particular, it measures the effectiveness of visualising a three-dimensional photorealistic item, real-time interactivity with the product, and a real-time, fully interactive product customisation service. The proposed VR system employs a sophisticated approach to interaction with the primary objective of simplifying and improving the user experience during online shopping. The proposed interface was evaluated through a preliminary questionnaire designed to simulate the typical decision-making process prior to a luxury purchase. The paper presents the outcomes of this usability trial with a group of ten luxury brand customers, discusses the challenges involved in the HCI design, and presents the visual components of the interface in addition to an analysis of the system evaluation. Based on the derived feedback, our future plan of work entails additional development of the interactive tools with a view to further enhancing system usability and user experience. Furthermore, we aim to introduce more object choices and customisation options covering a larger group of luxury brands.
Keywords: Virtual Reality; HCI; 3D Visualization; Luxury Marketing; Luxury Brands | |||
| Multiple Remotely Piloted Aircraft Control: Visualization and Control of Future Path | | BIBAK | Full-Text | 231-240 | |
| Gloria Calhoun; Heath Ruff; Chad Breeden; Joshua Hamell; Mark Draper; Christopher Miller | |||
| Advances in automation technology are leading to development of operational
concepts in which a single pilot is responsible for multiple remotely piloted
aircraft (RPAs). This requires design and evaluation of pilot-RPA interfaces
that support these new supervisory control requirements. This paper focuses on
a method by which an RPA's near-term future flight path can be visualized and
commanded using the stick and throttle. The design decisions driving its
symbology and implementation are described, as well as preliminary quantitative data and pilot feedback to date.
Keywords: remotely piloted aircraft; unmanned air systems; flight path; display symbology; RPA; UAS; flexible automation | |||
| The Virtual Dressing Room: A Perspective on Recent Developments | | BIBAK | Full-Text | 241-250 | |
| Michael B. Holte | |||
| This paper presents a review of recent developments and future perspectives,
addressing the problem of creating a virtual dressing room. First, we review
the current state-of-the-art of existing solutions and discuss their
applicability and limitations. We categorize the existing solutions into three
kinds: (1) virtual real-time 2D image/video techniques, where the consumer gets
to superimpose the clothes on their real-time video to visualize themselves
wearing the clothes. (2) 2D and 3D mannequins, where a web-application uses the
body measurements provided by the customer, to superimpose the standard sizes
to fit a customized 2D or 3D mannequin before buying. (3) 3D camera and laser
technologies which acquire 3D information of the customer, enabling estimation
of the body shape and measurements. Additionally, we conduct user studies to
investigate user behavior when buying clothes and user requirements for a
virtual dressing room. Keywords: Human-computer interaction; virtual reality; augmented reality; user
interface design; computer graphics; interaction design; review; survey;
clothing industry; 3D imaging; 3D scanning | |||
| Making Sense of Large Datasets in the Context of Complex Situation Understanding | | BIBA | Full-Text | 251-260 | |
| Marielle Mokhtari; Eric Boivin; Denis Laurendeau | |||
| This paper presents exploration prototype tools (combining visualization and human-computer interaction aspects) developed for immersive displays in the context of the Image project. Image supports collaboration of users (i.e. experts, specialists, decision-makers...) for a common understanding of complex situations by using a human-guided feedback loop involving cutting-edge techniques for knowledge representation, scenario scripting, simulation and exploration of large datasets. | |||
| Evaluating Distraction and Disengagement of Attention from the Road | | BIBAK | Full-Text | 261-270 | |
| Valentine Nwakacha; Andy Crabtree; Gary Burnett | |||
| Drivers use sat nav for navigation assistance but research links sat nav
with risk of distraction [10]. Visual and cognitive workload can be increased
as drivers divert their attention from the road [1, 8]. Mitigating such risks
is vital and head-up displays (HUDs) can be beneficial [9]. HUDs present images
on the windshield to reduce diversion of drivers' attention from the road. This
paper presents a driving simulator experiment which examined how 30
participants behaved with three navigation interfaces (a novel virtual car
HUD, an arrow HUD and sat nav) to outline potential benefits of the virtual car HUD over
the arrow HUD and sat nav. Distraction-related data (speed, headway, lane
position and peripheral detection) were gathered. The findings showed
participants achieved better navigation performance and peripheral detection
with the virtual car HUD. Subjective data showed participants rated the virtual
car HUD easiest to use, least distracting and most preferred interface. Keywords: Driver distraction; head-up display; user interface design | |||
| DCS 3D Operators in Industrial Environments: New HCI Paradigm for the Industry | | BIBAK | Full-Text | 271-280 | |
| Manuel Pérez Cota; Miguel Ramón González Castro | |||
| Distributed Control Systems (DCS) are electronic control devices used in the
continuous process industry, in which the operator is an essential part and
must take operational decisions whose errors can lead to dangerous situations
and/or heavy losses. This paper describes the design, implementation and
testing of a DCS operator console, which used 2.5D or 3D systems to facilitate
an intuitive understanding of the state of the industrial process. It also
explains how different input devices were used to facilitate navigation and
selection of components in the graphic display, and how different graphical
concepts (geometries, colors, animations) were integrated to make the
industrial process more understandable. Keywords: DCS; HCI; 3D; 2.5D; Java; OPC; Jinput; RMI | |||
| Natural Feature Tracking Augmented Reality for On-Site Assembly Assistance Systems | | BIBAK | Full-Text | 281-290 | |
| Rafael Radkowski; James Oliver | |||
| We introduce a natural feature tracking approach that facilitates the
tracking of rigid objects for an on-site assembly assistance system. The
tracking system must track multiple circuit boards without added fiducial
markers while they are manipulated by the user. We use a common SIFT feature
matching detector enhanced with a probability search. This search estimates how
likely a set of query descriptors belongs to a particular object. The method
was realized and tested. The results show that the probability search enhanced
the identification of different circuit boards. Keywords: Augmented Reality; Natural Feature Tracking; Assembly Assistance | |||
| Augmented Reality Interactive System to Support Space Planning Activities | | BIBAK | Full-Text | 291-300 | |
| Guido Maria Re; Giandomenico Caruso; Monica Bordegoni | |||
| Space Planning (SP) is a process that makes an environment more ergonomic,
functional and aesthetically pleasing. The introduction of Computer Aided
tools for this kind of practice has increased the quality of the final result,
thanks to versatile support for generating different options to consider
during evaluation. In particular, tools based on
Augmented Reality (AR) technologies allow evaluating several options directly
in a real room. In this paper, an AR system, developed with the aim of
supporting Space Planning activities, is proposed. The system has been
developed in order to overcome some problems related to the tracking in wide
environments and to be usable in different types of Space Planning
environments. The paper also presents a qualitative evaluation of the AR system
in three different scenarios. The positive results obtained through these
evaluation tests show the effectiveness and the suitability of the system in
different Space Planning contexts. Keywords: Augmented Reality; Space Planning design; HCI | |||
| Empirical Investigation of Transferring Cockpit Interactions from Virtual to Real-Life Environments | | BIBAK | Full-Text | 301-309 | |
| Diana Reich; Elisabeth Dittrich | |||
| Human-cockpit interaction is an innovative and promising field of automotive
research. Indeed, automakers need to ensure safety and user satisfaction for
their cockpit development concepts when driving and interacting occur
simultaneously. One suggested approach is to evaluate simple cockpit prototypes
within virtual test environments. Hybrid prototyping allows a more realistic
experience with the prototype in early stages of development. With our research
study we focused on important basic parameters within hybrid test environments
(e.g. shutter glasses and virtual projected car model) and evaluated their
potential influence. We found no indication that shutter glasses influence
user behaviour. Interestingly, we found significantly faster task completion
times within the virtual projected car model, which indicates that immersive
environments increase user performance. In summary, we recommend
hybrid prototyping within immersive test environments for evaluating
human-cockpit interactions. Keywords: human-cockpit interaction; virtual environment; hybrid prototyping | |||
| Mixed and Augmented Reality for Marine Corps Training | | BIBAK | Full-Text | 310-319 | |
| Richard Schaffer; Sean Cullen; Phe Meas; Kevin Dill | |||
| The United States Marine Corps faces numerous challenges in preparing
Marines for current operations; among them are the cost of specialized training
environments and the difficulty of realistically representing the deployed
environment. This paper reports on two Office of Naval Research efforts to
address these challenges. The first employs Mixed Reality, which combines
real-world and virtual elements to create a Hollywood-set-like representation
of an Afghan village where Marines can train prior to deployment. The second
explores the use of Augmented Reality to train USMC observers. Observers are
responsible for directing artillery and mortar fires and aircraft attacks in
the proximity of friendly forces. While the live environment has numerous
advantages, the costs of supporting troops, ammunition, and equipment are
considerable. Augmented Reality can replace live supporting forces, resulting
in lower cost use of training areas during down time and enabling almost any
area to become an augmented training area. Keywords: Mixed Reality; Augmented Reality | |||
| Proactive Supervisory Decision Support from Trend-Based Monitoring of Autonomous and Automated Systems: A Tale of Two Domains | | BIBAK | Full-Text | 320-329 | |
| Harvey S. Smallman; Maia B. Cook | |||
| The digital technology revolution continues to roil work domains. An influx
of automation and autonomous systems is transforming the role of humans from
operators into supervisors. For some domains, such as process control,
supervisory control is already the norm. For other domains, such as military
command and control, the transformation to autonomous supervision is just
beginning. In both domains, legacy operation-centric, real-time data displays
and tools provide inadequate task support, leading to unproductive user
work-arounds. They give rise to a reactive monitoring stance, and will not
scale to meet the new, different task needs. We review advanced display design
projects in each domain that, in contrast, provide proactive supervisory
decision support. We identified key perceptual and cognitive challenges in
supervision, and applied cognitive science concepts to the design of novel
trend-based interfaces. We drew lessons from process control to combat the
challenges likely to arise in military command and control. Keywords: Visualization; automation; supervisory control; decision support; proactive
monitoring; cognitive science; human factors; user-centered design | |||
| The ART of CSI: An Augmented Reality Tool (ART) to Annotate Crime Scenes in Forensic Investigation | | BIBAK | Full-Text | 330-339 | |
| Jan Willem Streefkerk; Mark Houben; Pjotr van Amerongen; Frank ter Haar; Judith Dijk | |||
| Forensic professionals have to collect evidence at crime scenes quickly and
without contamination. A handheld Augmented Reality (AR) annotation tool allows
these users to virtually tag evidence traces at crime scenes and to review,
share and export evidence lists. In a user walkthrough with this tool, eight
end-users annotated a virtual crime scene while thinking aloud. Qualitative
results show that annotation could improve orientation on the crime scene,
speed up the collection process and diminish administrative pressure. While the
current prototype suffered from technical limitations due to slow feature
tracking, AR annotation was found to be a promising, usable and valuable tool
in crime scene investigation. Keywords: Augmented Reality; Indoor Positioning; User Walkthrough; Forensics | |||
| The Virtual Reality Applied in Construction Machinery Industry | | BIBAK | Full-Text | 340-349 | |
| Yun-feng Wu; Ying Zhang; Jun-wu Shen; Tao Peng | |||
| Nowadays, competition in the construction machinery industry is increasingly
fierce, so achieving the fastest speed to market, best quality, lowest cost,
and best service is key for enterprises to win the market and users. Advanced
technology can help, and it is in this context that VR (Virtual Reality) has
been adopted by manufacturers as a powerful tool. Sany Heavy Industry is
China's largest and the world's sixth-largest construction machinery
manufacturer. As a full-scale enterprise, Sany has always emphasized
digitization and information technology in building a modern enterprise, and
VR has been integrated into the enterprise's workflow. In the R&D department,
VR is commonly used in digital prototyping, but this paper mainly introduces
Sany's own virtual reality roaming platform, VR Flier, developed for
significant applications in plant layout simulation and process planning. The
service and marketing departments have also benefited significantly from VR
and AR applications. Keywords: construction machinery industry; Virtual Reality application; digital
manufacturing; marketing and services | |||
| On the Use of Augmented Reality Technology for Creating Interactive Computer Games | | BIBAK | Full-Text | 353-362 | |
| Chin-Shyurng Fahn; Meng-Luen Wu; Wei-Tyng Liu | |||
| In this paper, we design interactive game systems that adopt augmented
reality (AR) technology. By virtue of a conventional webcam for capturing
source images, we develop real-time visual tracking techniques based on edge
detection and make 3D virtual objects display on our defined markers that are
within the source images in the field of view (FOV) of the webcam. Two kinds of
gaming interfaces are created as examples: one is an AR based Monopoly game,
and the other is an AR based fighting game. Five classic human-computer
interface design methods are considered in creating the above AR based game
systems. In the Monopoly game example, we demonstrate how a traditional
table game can be turned into an interactive computer game using the AR
technology. We also list the advantages of a marker based approach and state
why it is suitable for interactive computer games. Further, the existing
popular game consoles with different gaming interfaces are compared to the two
AR based game systems. The comparison results reveal that our proposed AR based
game systems are lower in cost and better in extensibility. Keywords: Augmented reality; human computer interface; AR based game system;
interactive computer game; marker recognition | |||
| A 3-D Serious Game to Simulate a Living of a Beehive | | BIBAK | Full-Text | 363-371 | |
| José Eduardo M. de Figueiredo; Vera Maria B. Werneck; Rosa Maria E. Moreira da Costa | |||
| Computational tools are increasingly supporting the learning process in
several areas. They open new opportunities for teachers to teach content and
interact with their students. This group of tools includes simulations based on
multi-agent systems. This work aims to present a simulation game to study the
population growth of a beehive. System variables can be changed in order to
analyze different results. Aspects such as duration and time of flowering can
be manipulated by the student. The multi-agent approach in Distributed
Artificial Intelligence has been chosen to automatically control the activities
of the application. Virtual Reality is used to illustrate the behavior of the
bees, which in general cannot be seen in the real world or through
mathematical simulation. Keywords: Simulation; Virtual Reality; Multi-Agents Systems; Serious Games | |||
| Presentation of Odor in Multi-Sensory Theater | | BIBAK | Full-Text | 372-379 | |
| Koichi Hirota; Yoko Ito; Tomohiro Amemiya; Yasushi Ikei | |||
| This paper discusses an approach to implement and evaluate odor display,
with the goal of using it in multi-sensory theaters. A display system that
mixes odors with an arbitrary ratio was developed, and a sensor system that is
capable of measuring the concentration in a relatively short time period using
a sample and hold function was devised. Experiments clarified the time delay
and attenuation of the concentration in the transmission of an odor from the
display to a user, and the feasibility of utilizing a quantitative mixture of
odors was confirmed. Keywords: Underlying & supporting technologies: Multimodal interfaces; Olfactory
Display; Multi-sensory Theater | |||
| Using Motion Sensing for Learning: A Serious Nutrition Game | | BIBAK | Full-Text | 380-389 | |
| Mina C. Johnson-Glenberg | |||
| A mixed reality game was created to teach middle and high school students
about nutrition and the USDA My Plate icon. This mixed reality game included
both digital components (projected graphics on the floor) and tangible,
physical components (motion tracking wands that were handheld). The game goal
was to feed the alien the healthiest food item from a pair of items. Students
learned about the amount of nutrients and optimizers in the digital food items
and practiced making rapid food decisions. In the final level of the game
players interacted with My Plate and each food item filled the appropriate
quadrant in real time. Nineteen 4th graders played through the game in one 1.5
hour session. Significant learning gains were seen on a pretest and posttest
that assessed nutrition knowledge, paired t(18) = 4.13, p < .001. We support
the call for more embodied games that challenge children to practice
making quick food choice decisions and we explore how motion capture games can
affect engagement, health behaviors, and knowledge outcomes. Keywords: Applications: Education; Mixed Reality; Nutrition and Exer-Games | |||
| AR'istophanes: Mixed Reality Live Stage Entertainment with Spectator Interaction | | BIBAK | Full-Text | 390-399 | |
| Thiemo Kastel; Marion Kesmaecker; Krzysztof Mikolajczyk; Bruno Filipe Duarte-Gonçalves | |||
| Mixed Reality and Augmented Reality for live stage productions have been
used ever more frequently by artists over the past few years. AR'istophanes is
an experimental stage production aimed at bringing the new technical
possibilities of Mixed and Augmented Reality to the stages of this world. This
document describes the first phase of pre-production from 2011 to 2012 and
demonstrates the possibilities of integrating motion capturing and 3D
animation. This also includes the use of Smartphone Apps and real-time
rendering. Audience interaction is a key focus in this production -- which
means technical approaches were demonstrated and opinions collected from
potential viewers. Keywords: Mixed Reality; Augmented Reality; Interaction; Theatre; Live Entertainment;
Optical See-Through Glasses | |||
| System Development of Immersive Technology Theatre in Museum | | BIBAK | Full-Text | 400-408 | |
| Yi-Chia Nina Lee; Li-Ting Shan; Chien-Hsu Chen | |||
| Varieties of museum theatres include historical characters, puppetry,
movement, music, etc. Visitors can experience the storyline of knowledge about
history, science and technology in the theatre, which creates an immersive
environment and engaging experiences, such as those in performance, games and
simulation. With this kind of experience, knowledge learning in museums becomes
more effective and interesting. In addition, it requires multiple disciplines
to accomplish the multi-sensory experience provided in the theatre environment.
This article focuses on the design process and the system development of the
Immersive Theatre using a systematic method. There are three phases in the
process: Design, Configuration Negotiation, and Implementation. Keywords: immersive technology; museum theatre; Configuration negotiation | |||
| An Immersive Environment for a Virtual Cultural Festival | | BIBA | Full-Text | 409-415 | |
| Liang Li; Woong Choi; Kozaburo Hachimura | |||
| This paper describes the development of a virtual reality (VR) system and the use of an immersive environment for a traditional Japanese virtual cultural festival. With the development of computer graphics (CG) and VR technologies, extensive research has been carried out on the digital archiving of cultural assets. The goals of our Virtual Yamahoko Parade project are to digitally record and preserve the Yamahoko Parade of the Gion Festival, an intangible traditional grand-scale cultural event, as well as to open the product to the public. Therefore, not only the quality of the VR contents but also the display and demonstration are important to reproduce the atmosphere of the festival. The proposed system combines vision, sound, immersive display, and real-time interaction, which enables users to feel as if they are actually participating in the parade. | |||
| Mission: LEAP | | BIBAK | Full-Text | 416-425 | |
| Christopher Stapleton; Atsusi Hirumi; Dana S. Mott | |||
| Mixed Reality (MR) melts more than boundaries between realities. MR also
melts boundaries between disciplines to stimulate innovation. In a project
originally sponsored by NASA, the authors of this paper discuss the case study
Mission:LEAP, a Mixed Reality Experiential Learning Landscape. In achieving the
core objective of building innovation competencies in youth, we had to expand
Space STEM education to include the Arts, Media, Design and Humanities to teach
innovation competencies. By play-testing a full-scale mock-up, the process also
revealed the value of MR in experiential learning landscapes and defined new
aspirations and requirements for innovative ways of how we interface with MR
environments in free-choice learning venues. Keywords: Innovation; Mixed Reality; Informal Education; Interplay; Phydgital
InterSpace | |||
| ChronoLeap: The Great World's Fair Adventure | | BIBAK | Full-Text | 426-435 | |
| Lori C. Walters; Darin E. Hughes; Manuel Gértrudix Barrio; Charles E. Hughes | |||
| ChronoLeap: The Great World's Fair Adventure utilizes the educational
potential of immersive 3D virtual venues for children and early adolescents
between 9 and 13. Virtual reality environments transport the mind beyond the 2D
bounds of text or photographs; they engage the imagination and can be a
powerful tool for conveying educational content [1]. ChronoLeap leverages these
innate qualities and weaves together the individual threads of single
disciplines into a multi-disciplinary tapestry of web-based exploration through
the 1964/65 New York World's Fair. Through their myriad pavilions and
exhibits, World's Fairs offer links to science, technology, engineering,
mathematics, art and humanities topics. ChronoLeap provides an immersive 3D
environment with highly accurate and detailed models, and merges it with games
and themes designed to provide users with an educational STEAM environment. The
project is a collaborative effort between the University of Central Florida,
Queens Museum of Art and New York Hall of Science. Keywords: STEAM; STEM; Immersive Education; virtual environments; virtual heritage;
interdisciplinary; 1964/65 New York World's Fair | |||
| The Electric Bow Interface | | BIBAK | Full-Text | 436-442 | |
| Masasuke Yasumoto; Takashi Ohta | |||
| This research aims to establish a cognitive space in which force feedback
and a sense of immersion and presence are implemented in an interactive space
that takes physical movements as input. The Electric Bow Interface system has
been developed based on Japanese archery, with consideration for its
application to games such as FPS (First Person Shooting); furthermore, content
has also been developed with the Electric Bow Interface system. This research
also attempts to realize realistic force feedback and an interface in which
physical movement can be reflected. Keywords: interface; game; interactive art | |||