| Guest editorial -- Collaborative virtual environments | | BIB | Full-Text | 1-3 | |
| E. Churchill; D. Snowdon | |||
| An active worlds interface to Basic Support for Cooperative Working (BSCW) to enhance chance encounters | | BIBAK | Full-Text | 4-14 | |
| A. Huxor | |||
| New ways of working, as exemplified by distance learning, telecommuting and
the virtual organisation, are growing in popularity. This paper concerns itself
with the role that 3D virtual environments can play in assisting such
collaborative working. Specifically, chance encounters have been shown to be
important in collaboration, that is, encounters that are not pre-arranged by
their participants. There are a number of tools to facilitate encounters online,
but these create new problems. It is argued that 3D shared spaces can assist in
the management of chance encounters, allowing them to create a situation
similar to that found in the traditional workplace, in which tasks and content
are situated in locales. If shared 3D spaces are to have utility for computing
in general, rather than specific applications, it is suggested that this may be
in such spatial management of encounters. An example, in which Active Worlds is
employed as an interface to Basic Support for Cooperative Working (BSCW)
content, is described. This content creates the motivation for users to be
within the space, and thus available for chance encounters with other users,
the nature and extent of these encounters being spatially coordinated. Keywords: CSCW; Shared spaces; Chance encounters; BSCW; Active Worlds | |||
| Spin: A 3D interface for cooperative work | | BIBAK | Full-Text | 15-25 | |
| C. Dumas; G. Saugis; S. Degrande; P. Plénacoste; C. Chaillou; M. Viaud | |||
| In this paper, we present a three-dimensional user interface for synchronous
co-operative work, Spin, which has been designed for multi-user synchronous
real-time applications to be used in, for example, meetings and learning
situations. Spin is based on a new metaphor of virtual workspace. We have
designed an interface, for an office environment, which recreates the
three-dimensional elements needed during a meeting and increases the user's
scope of interaction. In order to accomplish these objectives, animation and
three-dimensional interaction in real time are used to enhance the feeling of
collaboration within the three-dimensional workspace. Spin is designed to
maintain a maximum amount of information visible. The workspace is created
using artificial geometry -- as opposed to true three-dimensional geometry --
and spatial distortion, a technique that allows all documents and information
to be displayed simultaneously while centring the user's focus of attention.
Users interact with each other via their respective clones, which are
three-dimensional representations displayed in each user's interface, and are
animated with user action on shared documents. An appropriate object
manipulation system (direct manipulation, 3D devices and specific interaction
metaphors) is used to point out and manipulate 3D documents. Keywords: Synchronous CSCW; CVE; Avatar; Clone; Three-dimensional interface; 3D
interaction | |||
| Virtual environments for work, study and leisure | | BIBAK | Full-Text | 26-37 | |
| I. Tomek; R. Giles | |||
| Virtual environments have the potential of adding a new dimension to the
concept of a community. In this paper, we describe our work on a text-based
virtual environment. Although our focus is on work applications, the
environment is equally suited for educational and recreational uses. The paper
is presented in the context of the needs of a software development team but can
be applied to other work teams as well. Two essential characteristics of
successful work teams are good communication and efficient access to project
information. Maintaining both of these becomes more and more difficult as the
size of the team grows, and gets very difficult when the team is geographically
dispersed. We describe a project that could be used to test the hypothesis that
a collaborative environment using text-based virtual reality would alleviate
these problems by relieving physical separation through virtual proximity. In
the first phase of the project, we adapted and extended Jersey, a
Smalltalk-based MOO (Multi-user domain Object Oriented) with collaborative
virtual environment (CVE) features. On the basis of our experience, we then
started designing and implementing MUM, a Multi-Universe MOO. When completed,
the more extendable and customisable MUM will provide novel features and
encourage developer experimentation. We describe some of MUM's features and our
plans for it. Keywords: Collaboration over distance; Collaborative work; Corporate memory; MOO;
Virtual environments; Work teams | |||
| Human-systems interaction for virtual environment team training | | BIBAK | Full-Text | 38-48 | |
| L. McCarthy; R. Stiles; W. L. Johnson; J. Rickel | |||
| Virtual environments are increasingly used to support collaborative
activities and distance learning. However, few are designed to support
students, instructors and simulations in multi-participant team training. This
paper describes the Training Studio, a system for authoring, delivering and
evaluating multi-participant team training in an immersed virtual environment.
The Training Studio focuses on human-systems interaction, allowing multiple
students to learn and perform team tasks. The Training Studio supports
collaborative learning for either single or multi-participant activity. This is
accomplished through the use of agents which are assigned to students to act as
personal mentors or missing team members. Conducting team training within a
virtual environment introduces complexities and issues unique to team training
and multiple-participant virtual environments. This paper describes our
approach to virtual environment team training, discussing issues confronted and
resulting design considerations and implementations. Keywords: Virtual environments; Team training; Intelligent tutoring systems;
Instructional simulations; Autonomous agents; Human-systems interaction | |||
| Nonverbal communication interface for collaborative virtual environments | | BIBAK | Full-Text | 49-59 | |
| A. Guye-Vuillème; T. K. Capin; S. Pandzic; N. Magnenat Thalmann | |||
| Nonverbal communication is an important aspect of real-life face-to-face
interaction and one of the most efficient ways to convey emotions, therefore
users should be provided the means to replicate it in the virtual world.
Because articulated embodiments are well suited to provide body communication
in virtual environments, this paper first reviews some of the advantages and
disadvantages of complex embodiments. After a brief introduction to nonverbal
communication theories, we present our solution, taking into account the
practical limitations of input devices and social science aspects. We introduce
our sample of actions and implementation using our VLNET (Virtual Life Network)
networked virtual environment and discuss the results of an informal evaluation
experiment. Keywords: Nonverbal communication; Embodiments; Networked virtual environments; Social
interaction in virtual environments; Emotional feedback | |||
| Constructing social systems through computer-mediated communication | | BIBAK | Full-Text | 60-73 | |
| B. Becker; G. Mark | |||
| The question of whether computer-mediated communication can support the
formation of genuine social systems is addressed in this paper. Our hypothesis,
that technology creates new forms of social systems beyond real-life milieus,
includes the idea that the technology itself may influence how social binding
emerges within online environments. In real-life communities, a precondition
for social coherence is the existence of social conventions. By observing
interaction in virtual environments, we found the use of a range of social
conventions. These results were analysed to determine how the use and emergence
of conventions might be influenced by the technology. One factor contributing
to the coherence of online social systems, but not the only one, appears to be
the degree of social presence mediated by the technology. We suggest that
social systems can emerge by computer-mediated communication and are shaped by
the media of the specific environment. Keywords: Collaborative virtual environments; Social conventions; Virtual communities;
Social presence; Avatars | |||
| Empathy online | | BIBAK | Full-Text | 74-84 | |
| J. Preece | |||
| Members of online support communities help each other by empathising about
common problems and exchanging information about symptoms and treatments.
Results from two studies indicate that: empathy occurs in most online textual
communities; empathetic communication is influenced by the topic being
discussed; the presence of women tends to encourage empathy; and the presence
of moderators not only reduces hostility but also appears to encourage empathy.
The paper explores how empathy may be affected by pace of interaction, mode of
expression and the way people reveal themselves in synchronous and asynchronous
communities. As we advance towards technically better computer virtual
environments, it is timely to pay greater attention to social issues such as
empathetic communication. Keywords: Online community; Empathy; Bulletin board; Asynchronous; Synchronous;
Computer virtual environment | |||
| Natural language control of interactive 3D animation and computer games | | BIBAK | Full-Text | 85-102 | |
| M. Cavazza; I. Palmer | |||
| In this paper we describe a fully implemented system for speech and natural
language control of 3D animation and computer games. The experimental framework
has features modelled on the popular DOOM™ computer
game. It implements an integrated parser based on a linguistic formalism
tailored to the processing of the specific natural language instructions
required to control a player character. This parser outputs structured message
formats to the animation layer, which further interprets these messages to
generate behaviours for the scene objects. We have found that interactive
control significantly impacts on the behavioural interpretation of natural
language semantics. Besides bringing stringent requirements in terms of
response times for the natural language processing step, it determines the
level of autonomy that the animated character should possess, which in turn
influences the generation of behavioural scripts from natural language
instructions. Keywords: Virtual environments; Natural language; Animation; Games | |||
| Creating natural language interfaces to VR systems | | BIBAK | Full-Text | 103-113 | |
| S. S. Everett; K. Wauchope; M. A. Pérez Quiñones | |||
| Two research projects are described that explore the use of spoken natural
language interfaces to virtual reality (VR) systems. Both projects combine
off-the-shelf speech recognition and synthesis technology with in-house command
interpreters that interface to the VR applications. Details about the
interpreters and other technical aspects of the projects are provided, together
with a discussion of some of the design decisions involved in the creation of
speech interfaces. Questions and issues raised by the projects are presented as
inspiration for future work. These issues include: requirements for object and
information representation in VR models to support natural language interfaces;
use of the visual context to establish the interaction context; difficulties
with referencing events in the virtual world; and problems related to the
usability of speech and natural language interfaces in general. Keywords: Natural language processing; Natural language interfaces; Speech interfaces;
Speech interface design; Speech recognition; Virtual reality | |||
| Spoken language and multimodal applications for electronic realities | | BIBAK | Full-Text | 114-128 | |
| A. Cheyer; L. Julia | |||
| We use the term 'electronic reality' (ER) to encompass a broad class of
concepts that mix real-world metaphors and computer interfaces. In our
definition, 'electronic reality' includes notions such as virtual reality,
augmented reality, computer interactions with physical devices, interfaces that
enhance 2D media such as paper or maps, and social interfaces where computer
avatars engage humans in various forms of dialogue. One reason for bringing
real-world metaphors to computer interfaces is that people already know how to
navigate and interact with the world around them. Every day, people interact
with each other, with pets, and sometimes with physical objects by using a
combination of expressive modalities, such as spoken words, tone of voice,
pointing and gesturing, facial expressions, and body language. In contrast,
when people typically interact with computers or appliances, interactions are
unimodal, with a single method of communication such as the click of a mouse or
a set of keystrokes serving to express intent. In this article, we describe our
efforts to apply multimodal and spoken language interfaces to a number of ER
applications, with the goal of creating an even more 'realistic' or natural
experience for the end user. Keywords: Electronic realities; Multimodality; Natural interaction | |||
| ICOME: An Immersive Collaborative 3D Object Modelling Environment | | BIBAK | Full-Text | 129-138 | |
| C. Raymaekers; T. De Weyer; K. Coninx; F. Van Reeth; E. Flerackers | |||
| In most existing immersive virtual environments, 3D geometry is imported
from external packages. Within ICOME (an Immersive Collaborative 3D Object
Modelling Environment) we focus on the immersive construction of 3D geometrical
objects within the environment itself. Moreover, the framework allows multiple
people to simultaneously undertake 3D modelling tasks in a collaborative way.
This article describes the overall architecture, which conceptually follows a
client/server approach. The various types of clients, which are implemented,
are described in detail. Some illustrative 3D object modelling examples are
given. Extensions to the system with regard to 3D audio are also mentioned. Keywords: Immersive modelling; Collaborative work; Virtual reality; Networking; User
interfacing | |||
| Calibration of electromagnetic tracking devices | | BIBAK | Full-Text | 139-150 | |
| V. Kindratenko | |||
| Electromagnetic tracking devices are often used to track location and
orientation of a user in a virtual reality environment. Their precision,
however, is not always high enough because of the dependence of the system on
the local electromagnetic field which can be altered easily by many external
factors. The purpose of this article is to give an overview of the calibration
techniques used to improve the precision of the electromagnetic tracking
devices and to present a new method that compensates both the position and
orientation errors. It is shown numerically that significant improvements in
the precision of the detected position and orientation can be achieved with a
only a small number of calibration measurements. Unresolved problems and
research topics related to the proposed method are discussed. Keywords: Electromagnetic tracker; Tracker calibration; Polynomial fit; Virtual
reality | |||
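The "Polynomial fit" keyword above suggests the general shape of such a calibration: sample the tracker at known locations, then fit a smooth correction that maps reported positions back to true ones. The following is only an illustrative sketch of that idea (quadratic features, a synthetic distortion field, least squares); it is not the paper's actual method or error model:

```python
import numpy as np

def poly_features(p):
    """Quadratic polynomial features of a 3D point: 1, x, y, z, squares, cross terms."""
    x, y, z = p
    return np.array([1.0, x, y, z, x*x, y*y, z*z, x*y, x*z, y*z])

def fit_correction(measured, true):
    """Least-squares fit of one coefficient vector per output coordinate."""
    A = np.array([poly_features(p) for p in measured])   # shape (n, 10)
    coeffs, *_ = np.linalg.lstsq(A, true, rcond=None)    # shape (10, 3)
    return coeffs

def correct(coeffs, p):
    """Apply the fitted correction to a reported position."""
    return poly_features(p) @ coeffs

# Synthetic calibration data: a smooth (hypothetical) field distortion
# shifts each true point by a small quadratic perturbation.
rng = np.random.default_rng(0)
true_pts = rng.uniform(-1.0, 1.0, size=(50, 3))
distort = lambda p: p + 0.1 * np.array([p[0]**2, p[1]*p[2], -p[2]**2])
measured = np.array([distort(p) for p in true_pts])

C = fit_correction(measured, true_pts)
corrected = np.array([correct(C, p) for p in measured])
err_before = np.linalg.norm(measured - true_pts, axis=1).mean()
err_after = np.linalg.norm(corrected - true_pts, axis=1).mean()
print(err_after < err_before)  # the fitted correction reduces mean position error
```

In practice the measured calibration grid, the polynomial degree, and the handling of orientation errors (which the article also compensates) are where the real design decisions lie; this sketch covers position only.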
| Foreword: Applications of virtual environments and wearable computers for medicine | | BIB | Full-Text | 151 | |
| The Wearable Motherboard™: The first generation of adaptive and responsive textile structures (ARTS) for medical applications | | BIBAK | Full-Text | 152-168 | |
| C. Gopalsamy; S. Park; R. Rajamanickam; S. Jayaraman | |||
| Virtual reality (VR) has been making inroads into medicine in a broad
spectrum of applications, including medical education, surgical training,
telemedicine, surgery and the treatment of phobias and eating disorders. The
extensive and innovative applications of VR in medicine, made possible by the
rapid advancements in information technology, have been driven by the need to
reduce the cost of healthcare while enhancing the quality of life for human
beings.
In this paper, we discuss the design, development and realisation of an innovative technology known as the Georgia Tech Wearable Motherboard™ (GTWM), or the "Smart Shirt". The principal advantage of GTWM is that it provides, for the first time, a very systematic way of monitoring the vital signs of humans in an unobtrusive manner. The flexible databus integrated into the structure transmits the information to monitoring devices such as an EKG machine, a temperature recorder, a voice recorder, etc. GTWM is lightweight and can be worn easily by anyone, from infants to senior citizens. We present the universal characteristics of the interface pioneered by the Georgia Tech Wearable Motherboard™ and explore the potential applications of the technology in areas ranging from combat to geriatric care. The GTWM is the realisation of a personal information processing system that gives new meaning to the term ubiquitous computing. Just as the spreadsheet pioneered the field of information processing that brought "computing to the masses", it is anticipated that the Georgia Tech Wearable Motherboard™ will bring personalised and affordable healthcare monitoring to the population at large. Keywords: Healthcare; Intelligent clothing; Universal interface; Vital signs; Wearable
information infrastructure; "Wearable Motherboard™" | |||
| Virtual reality in vestibular assessment and rehabilitation | | BIBAK | Full-Text | 169-183 | |
| S. Di Girolamo; W. Di Nardo; P. Picciotti; G. Paludetti; F. Ottaviani | |||
| Previous experiences on vestibular compensation showed that multisensorial
stimulations affect postural unbalance recovery. Virtual Environment (VE)
exposure seems very useful in vestibular rehabilitation, since the experience
gained during VE exposure is transferable to the real world. The rearrangement
of the hierarchy of the postural cues was evaluated in 105 patients affected by
visual, labyrinthine and somatosensory pathology in normal conditions and during
sensorial deprivation. They were divided into five groups according to
pathology and compared to 50 normal controls. Our data show that VE exposure is
a reliable method to identify the deficient subsystem and the level of
substitution. Moreover, Virtual Reality (VR) would accelerate the compensation
of an acute loss of labyrinthine function, related to adaptive modifications of
the vestibulo-ocular and vestibulo-spinal reflexes, overstimulating the
residual labyrinthine function. The residual labyrinthine function is poor in
chronic bilateral vestibular deficit and VE exposure should provide sensory
substitution or sensory motor reorganisation, thereby modulating the external
spatial reference and promoting the reorganisation of the multiple sensory
input. The prospects for VE exposure seem very promising when dealing with the
vestibular system, where different sensory information is continuously
rearranged as a result of environmental and age-related changes. Keywords: Postural control; Virtual Reality; Vestibular assessment; Vestibular
rehabilitation | |||
| Ophthalmoscopic examination training using virtual reality | | BIBAK | Full-Text | 184-191 | |
| D. Lee; M. Woo; D. Vredevoe; J. Kimmick; W. J. Karplus; D. J. Valentino | |||
| Health care professionals perform ophthalmoscopic examinations to detect
pathologies of the eye, as well as to evaluate the effects of other diseases,
such as high-blood pressure and diabetes. The ophthalmoscopic examination is
given using an ophthalmoscope, a hand-held instrument consisting of an
adjustable lens and a focused beam of light. The difficulty of the procedure
lies in positioning the ophthalmoscope accurately and then correctly
identifying the ocular disease symptoms -- skills that improve with experience.
To improve and accelerate the training of the student, we developed a Virtual
Ophthalmoscopic Examination, a three-dimensional real-time computer simulation
of the ophthalmoscopic procedure using virtual reality techniques. By
navigating and manipulating the virtual ophthalmoscope in the simulation
environment, the student learns how to position the instrument properly. Unlike
other training aids that use photographic slides to show the full retina, the
Virtual Ophthalmoscopic Examination programme simulates an accurate view of the
retina. By increasing the realism of the training, the transition from the
training programme to live examination of patients will become less difficult.
The programme was evaluated by graduate nursing students and was shown to be a
promising training aid. Keywords: Computer-assisted instruction; Medical education; Ophthalmoscopy | |||
| Virtual risks: Rich domain risk and technology transfer failure as design criteria in the Sheffield Knee Arthroscopy Trainer (SKATS) | | BIBAK | Full-Text | 192-202 | |
| Professor J. G. Arthur; A. D. McCarthy; C. Baber; P. J. Harley | |||
| In this paper an example of Virtual Reality (VR) system design in a
safety-critical training domain is discussed. In particular, a model for design
is presented. This model seeks to create operational definitions of risk in the
surgical domain. Perhaps more importantly, it also seeks to discover
operational predictors of the risk of technology-transfer failure as a
fundamental requisite for the early design. Typically both of these activities
do take place in some form in most designs, but they are frequently
ill-conceived due to inappropriate timing, low importance, insufficient
methodological rigour and the absence of a pre-existent integration model.
Using examples from the Sheffield Knee Arthroscopy Training System (SKATS), we
will discuss the contention that equal research effort needs to be spent on
core design issues as on the technological VR design. Specifically, we will
propose a set of guidelines for the research and development of risk metrics in
Virtual Environment (VE) design and technology-transfer for safety-critical
training. Keywords: Virtual reality; Surgical training; Simulation; Design; Technology-transfer;
Risk | |||
| Banded matrix approach to Finite Element modelling for soft tissue simulation | | BIBAK | Full-Text | 203-212 | |
| J. Berkley; S. Weghorst; H. Gladstone; G. Raugi; D. Berg; M. Ganter | |||
| Realistic deformation of computer-simulated anatomical structures is
computationally intensive. As a result, simple methodologies not based in
continuum mechanics have been employed for achieving real-time deformation of
virtual anatomy. Since the graphical interpolations and simple spring models
commonly used in these simulations are not based on the biomechanical
properties of tissue structures, these 'quick and dirty' methods typically do
not represent accurately the complex deformations and force-feedback
interactions that can take place during surgery. Finite Element (FE) analysis
is widely regarded as the most appropriate alternative to these methods.
Extensive research has been directed toward applying this method to modelling a
wide range of biological structures, and a few simple FE models have been
incorporated into surgical simulations. However, because of the highly
computational nature of the FE method, its direct application to real-time
force-feedback and visualisation of tissue deformation has not been practical
for most simulations. This limitation is due primarily to the overabundance of
information provided by the standard FE approaches. If the mathematics are
optimised through preprocessing to yield only the information essential to the
simulation task, run-time computation requirements can be reduced drastically.
We are currently developing such methodologies, and have created computer
demonstrations that support real-time interaction with soft tissue. To
illustrate the efficacy and utility of these fast "banded matrix" FE methods,
we present results from a skin suturing simulator which we are developing on a
PC-based platform. Keywords: Virtual reality; Surgical simulation; Real-time analysis; Finite element
modelling; Haptic feedback; Soft tissue | |||
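The "banded matrix" idea above can be illustrated on the simplest band structure, a tridiagonal system, which the Thomas algorithm solves in O(n) time rather than the O(n³) of a dense factorisation. This toy sketch (a 1D chain of unit springs standing in for an FE mesh) is an assumption-laden illustration of why band structure pays off, not the authors' formulation:

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system in O(n): a = sub-, b = main, c = super-diagonal."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]           # eliminate the sub-diagonal entry
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):            # back-substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# A 1D chain of unit springs: stiffness matrix is tridiagonal (bandwidth 1).
n = 6
a = np.full(n, -1.0); a[0] = 0.0              # sub-diagonal
c = np.full(n, -1.0); c[-1] = 0.0             # super-diagonal
b = np.full(n, 2.0)                           # main diagonal
d = np.ones(n)                                # applied nodal forces

x = thomas_solve(a, b, c, d)

# Cross-check against a dense solve of the same system.
K = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(x, np.linalg.solve(K, d)))
```

Real FE stiffness matrices from 3D tissue meshes have a wider band than this, but the same principle applies: storing and factoring only the band, plus the preprocessing the authors describe, is what makes real-time force feedback feasible.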
| Using virtual reality techniques in maxillofacial surgery planning | | BIBAK | Full-Text | 213-222 | |
| P. Neumann; D. Siebert; A. Schulz; G. Faulkner; M. Krauss; T. Tolxdorff | |||
| The primary goal of our research has been to implement an entirely
computer-based maxillofacial surgery planning system [1]. An important step
toward this goal is to make virtual tools available to the surgeon in order to
carry out a three-dimensional (3D) cephalometrical analysis and to
interactively define bone segments from skull and jaw bones. An easy-to-handle
user interface employs visual and force-feedback devices to define subvolumes
of a patient's volume dataset [2]. The defined subvolumes, together with their
spatial arrangements based on the cephalometrical results, eventually lead to
an operation plan. We have evaluated modern low-cost, force-feedback devices
with regard to their ability to emulate the surgeon's working procedure. Once
the planning of the procedure is complete, the planning results are transferred
to the operating room. In our intra-operative concept the visualisation of
planning data is speech-controlled by the surgeon and correlated with the
patient's position by an electromagnetic 3D sensor system. Keywords: Surgery planning; Volume segmentation; Virtual tools; Force feedback;
Intra-operative navigation | |||
| Application of virtual reality in hospital facilities design | | BIBAK | Full-Text | 223-234 | |
| V. Giallorenzo; P. Banerjee; L. Conroy; J. Franke | |||
| The airborne particles present in certain hospital environments, such as the
tuberculosis isolation or operating rooms, can be extremely harmful for
patients and/or hospital personnel. An important issue during the design of
hospital facilities is an efficient airborne particle removal system. A
near-optimal setup of the parameters that affect the airflow, and consequently
the airborne particle trajectories within the room is desirable. Computational
Fluid Dynamics (CFD) is an alternative to tedious and time-consuming
experimental investigations during the design phase, when a large number of
alternatives need to be evaluated. The main limitations of CFD application in
building design are the high level of skill required, the complexity of the
setup phase, and the difficulty of output data interpretation using common 2D
(two-dimensional) display devices. A virtual reality (VR) environment can help
in overcoming some of these limitations. A CFD/VR procedure for design of
contaminant-free hospital facilities is presented in this paper. By means of a
VR preprocessing step, inferior solutions can be discarded to drastically
reduce the number of configurations to investigate. Then, a CFD/VR tool is used
to explore the restricted set of room layouts. The 3D (three-dimensional),
immersive visualisation of an indoor space and of the particle motion inside it
allows the user to really see the particle flows and consequently understand
the effects of room parameters on particle motion throughout the room. In this
way a close-to-optimal configuration of the room layout and of the ventilation
system can be achieved more speedily and more conveniently compared to
traditional CFD investigations. Keywords: Virtual reality; CFD; Building layout; Hospital facilities design | |||
| Simulation of frontal orbital advancement | | BIBAK | Full-Text | 235-240 | |
| H. A. Grabowski; S. Hassfeld; R. Krempien; J. Münchenberg; J. Brief | |||
| In this paper, we present a system for performing a complex surgical
intervention using virtual reality (VR) technology. With the aid of the system,
the intervention can be planned and simulated exactly before performing it in
reality and important additional information can be achieved during the
simulation. Before working in VR, finite element models of the patient's head
are generated from CT images. Based on these models, additional work is done in
VR, where the patient's skull is cut into several pieces, which are then
re-positioned. Based on moving and shifting the obtained pieces, the goal is to
increase the volume inside the skull, which is called intracranial volume.
Until now, it was not possible to measure the achieved increase of the
intracranial volume. However, using our system it is now possible to
calculate this volume online during each step of our virtual intervention. The
obtained results are used for the surgical intervention in reality. Keywords: Mesh generation; Geometric modelling; Finite element models; Virtual
cutting; VR-based surgery; Volume measurement | |||
| Editorial | | BIB | Full-Text | 241-242 | |
| The reality of medical work: The case for a new perspective on telemedicine | | BIBAK | Full-Text | 243-249 | |
| R. Rajani; M. Perry | |||
| This paper considers the nature of medical work and how new telemedicine
technologies can be developed to support that work. Telemedicine developers
attempt to increase communication and collaboration between medical
practitioners and between patients and medics, with the goal being to make
medical care and information more easily accessible. However, the focus of
telemedicine systems appears to have so far been technology centred, and the
work they are trying to support is often ignored. We argue that to develop
appropriate telemedicine technologies, it is important to understand the nature
of medical work, and to examine the manner in which medical practice actually
occurs. Only then will we be in a position to design appropriate telemedicine
technologies to support these activities. Unless designers have an insight into
the work itself, new technologies will continue to fail to support what
telemedicine effectively aims to promote -- collaboration and access to
distributed knowledge. Keywords: Communication; Cooperation; Medical work; Telemedicine | |||
| Evaluating the effectiveness of augmented reality displays for a manual assembly task | | BIBAK | Full-Text | 250-259 | |
| K. M. Baird; W. Barfield | |||
| The focus of this research was to examine how effectively augmented reality
displays, generated with a wearable computer, could be used for aiding an
operator performing a manual assembly task. Fifteen subjects were asked to
assemble a computer motherboard using four types of instructional media: paper
manual, computer-aided, opaque augmented reality display, and see-through
augmented reality display. The time of assembly and assembly errors were
measured for each type of instructional media, and a questionnaire focusing on
usability was administered to each subject at the end of each condition. The
results of the experiment indicated that the augmented reality conditions were
more effective instructional aids for the assembly task than either the paper
instruction manual or the computer-aided instruction. The see-through augmented
reality display resulted in the fastest assembly times, followed by the opaque
augmented reality display, the computer-aided instruction, and the paper
instructions respectively. In addition, subjects made fewer errors using the
augmented reality conditions compared to the computer-aided and paper
instructional media. However, while the two augmented reality conditions were a
more effective instructional media when time for assembly was the response
measure, there were still some important usability issues associated with the
augmented reality technology that were not present in the non-augmented reality
conditions. Keywords: Augmented reality; Manual assembly; Wearable computer | |||
| Peltier Haptic Interface (PHI) for improved sensation of touch in virtual environments | | BIBAK | Full-Text | 260-264 | |
| P. Sines; B. Das | |||
| In this report we describe an advanced virtual reality glove that we are
developing, called the Peltier Haptic Interface (PHI), which will provide
improved sensation of touch in virtual environments. PHI will provide
force/pressure feedback that can be varied independently on each finger, as
well as temperature sensation that can be varied non-uniformly over the whole
hand. The combination of these sensations will provide a more realistic sense
of touch and significantly increase the realism of virtual environments. PHI
will find extensive applications in biomedical simulations, teaching,
industrial line training, and many other areas. Keywords: Haptic; Peltier; Tactile; Virtual reality | |||
| Virtual marketplaces: Building Management Information Systems for internet brokerage | | BIBAK | Full-Text | 275-290 | |
| T. Kollmann | |||
| On the World Wide Web (WWW), an increasing number of new trading forms for
brokerage of business transactions are emerging. Almost inevitably, central
contact-points on the WWW are being formed, so-called virtual marketplaces,
where supply and demand meet. The organisation they require is carried out by a
central operator, who offers his brokerage services on a business footing. The
aim of this paper is the generation of practical components of a Management
Information System (MIS) for such marketplaces that are only accessible online.
To do this, the theoretical assumptions of virtual marketplaces are combined
with a case study of a German internet-broker for used cars. Keywords: Internet-broker; Management Information System; Virtual marketplaces;
Virtual transactions | |||