Evaluating the Importance of Multi-Sensory Input on Learning and the Sense of Presence in Virtual Environments
To date, attempts to promote presence in virtual reality have focused on improvements in visual realism and the addition of haptic and auditory cues. At Georgia Tech, we have created a virtual environment that provides the user with four sensory modalities: visual, auditory, tactile, and olfactory. We are now conducting experiments with 288 subjects to examine how different combinations of sensory cues affect the users' sense of presence and their memory of their experience in the virtual environment. By systematically varying the types and amounts of sensory input, we will be able to build a preliminary model of how information from different modalities affects the users' experience in the virtual environment.
The motivation for using additional auditory, tactile, and olfactory cues comes from the suggestion that access to a broader range of sensory cues promotes a greater sense of presence (Fontaine, 1992). Recent work by Hendrix and Barfield (1996) has shown that the addition of spatialized sound significantly increases presence. In an environment with spatialized sound, the user should be able to identify the direction of a sound source, thus externalizing the sound to a position in the virtual space. An olfactory channel for virtual environments is also a recent concept. Barfield and Danas (1996) introduced olfactory parameters, such as smells, smell intensity, smelling points, field of smell, and spatial resolution, that are analogous to visual parameters. In implementing our multi-sensory environment, we have taken several of these parameters into consideration; in particular, we have attempted to include smelling points and spatial resolution of the olfactory and auditory cues.
The theme of our virtual environment is a corporate office, which includes a reception area, a bathroom, an employee office, a copier room, an executive office, and a balcony. Each modality has two levels: two levels of visual detail, presence or absence of spatialized sounds, presence or absence of tactile cues, and presence or absence of smells. High visual detail uses lighting and high-resolution texture maps; low visual detail uses flat shading and low-resolution texture maps. The computational gain at the low visual detail level is evidenced by an approximately 100% increase in frame rate. To prevent this difference from becoming a factor in the subjects' sense of presence, we have clamped the frame rate of the low visual detail version to be approximately equal to the average frame rate of the high visual detail version.
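As a rough illustration of the clamping idea, the sketch below delays the end of each frame until a fixed frame budget has elapsed, so the low-detail condition cannot run faster than the high-detail condition's average rate. The target rate of 20 fps, the function names, and the loop structure are illustrative assumptions, not the actual rendering code.

```python
import time

# Assumed target: the average frame rate measured for the
# high-visual-detail version of the walkthrough (hypothetical value).
TARGET_FPS = 20.0
FRAME_TIME = 1.0 / TARGET_FPS   # seconds allotted per frame

def run_clamped(render_frame, n_frames):
    """Render n_frames, holding the rate at (at most) TARGET_FPS.

    `render_frame` stands in for the application's per-frame draw call.
    In the low-detail condition it finishes well under FRAME_TIME, so the
    surplus time is slept away and both conditions display at a similar rate.
    """
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)  # burn the leftover headroom
```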
A script portraying a real-estate agent is played for all subjects in the environment. The additional spatialized sounds include a fan in the reception area, a flushing toilet as the user passes the bathroom door, the sound of a copier, and city noise on the balcony. The copier sound is varied according to how close the subject is to the copier room. The two tactile cues are provided by a fan and a heat lamp: a real fan in the physical world switches on as the subject passes the fan in the reception area, and the heat lamp switches on when the user steps onto the balcony into the bright sunlight. The smells of fresh coffee and flowers are the olfactory cues, delivered to the user through a small oxygen mask connected to the smell sources and small pumps. To prevent the smells from mixing, we have distributed them so that the coffee scent occurs in the reception area, where there is a coffee machine, and the flower scent occurs in the executive office, where there is a vase of lilies on the conference table. This distribution should create a spatial sense of smell, enabling the user to associate each scent with a location in the virtual space. To ensure that all users are exposed to the cues at the same times and for identical durations, we have automated both the navigation through the office environment and the switching on/off of all tactile and olfactory sensory inputs.
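To make the automated cue switching concrete, here is a minimal sketch of location-triggered cues along the scripted path: the copier volume falls off with distance, and the physical fan and heat lamp toggle when the user enters their trigger zones. All positions, radii, and the `devices` interface are hypothetical placeholders, not the actual device-control code.

```python
import math

# Hypothetical trigger positions in the virtual office (x, y in metres).
CUE_POSITIONS = {
    "copier": (4.0, 7.5),
    "fan": (1.0, 2.0),        # reception-area fan (seen, heard, and felt)
    "balcony": (10.0, 3.0),   # heat lamp switches on here
}
TRIGGER_RADIUS = 1.5          # assumed activation distance (metres)
COPIER_AUDIBLE_RANGE = 8.0    # copier sound fades to zero beyond this (metres)

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def update_cues(user_pos, devices):
    """Update cue state for one step of the automated walkthrough.

    `devices` is a placeholder with set_copier_volume(), set_fan(on), and
    set_heat_lamp(on) methods standing in for the real sound and relay
    controllers.
    """
    # Copier loudness falls off linearly with distance to the copier room.
    d = distance(user_pos, CUE_POSITIONS["copier"])
    devices.set_copier_volume(max(0.0, 1.0 - d / COPIER_AUDIBLE_RANGE))

    # The physical fan and heat lamp are simple on/off proximity triggers.
    devices.set_fan(distance(user_pos, CUE_POSITIONS["fan"]) < TRIGGER_RADIUS)
    devices.set_heat_lamp(distance(user_pos, CUE_POSITIONS["balcony"]) < TRIGGER_RADIUS)
```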
According to Fontaine (1992), a greater sense of presence is associated with recall of more details of an encounter. Hence, the questionnaires given to subjects after their immersion were designed to test their recall of objects in the environment. A visual cue is associated with each auditory, tactile, and olfactory cue; for example, users can see the fan in the reception area as well as feel and hear it. The results of the survey will reveal whether the additional cues were necessary and which combinations of cues result in optimal recall and sense of presence.
In an initial pilot test with six subjects, we found users attempting to interact with the tactile cues and identifying scents appropriate to the office theme; at least two subjects indicated that they smelled white-out or paint. One subject searched for the boundaries of the fan's air flow by stooping down and maneuvering from side to side. These observations support the conclusion advocated by Hendrix and Barfield (1996) that presence in a virtual environment is highly correlated with the sense of realism, and that realism is not just visual but also includes the fidelity with which the user can interact with the environment.
Current work on this project consists of continued subject testing and analysis of the results. The two levels for each of the four modalities result in 16 possible combinations of cues. Eighteen subjects will be tested for each combination, and a between-subject analysis will be conducted.
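As a sanity check on this design, the short sketch below enumerates the 2 x 2 x 2 x 2 factorial cells and confirms that 18 subjects per cell gives the 288-subject total mentioned above. The modality names come from the text; the level labels and variable names are illustrative.

```python
from itertools import product

MODALITIES = ["visual_detail", "spatial_sound", "tactile", "olfactory"]
LEVELS = ["low", "high"]      # read as absent/present for the non-visual cues
SUBJECTS_PER_CELL = 18

# Every combination of two levels across four modalities: 2**4 = 16 cells.
conditions = [dict(zip(MODALITIES, combo))
              for combo in product(LEVELS, repeat=len(MODALITIES))]

assert len(conditions) == 16
print(len(conditions) * SUBJECTS_PER_CELL)   # 288 subjects in total
```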