Thomas A. DeFanti, Principal Investigator
Daniel J. Sandin, Co-Principal Investigator
Robert V. Kenyon, Co-Principal Investigator

Electronic Visualization Laboratory
University of Illinois at Chicago
Chicago, IL.


Electronic Visualization Laboratory, University of Illinois at Chicago, 851 S. Morgan St., Chicago, IL. 60607-7053

(312) 996-3002

(312) 413-7585 fax





Virtual Environments.


Virtual Reality, Human-computer interface, Grand Challenges, Computational Science, Transfer of training, Scientific visualization, vBNS national network


The work described here continues the current NSF-supported effort entitled "Prototyping and Quantitative Assessment of an Intuitive Virtual Reality Environment and its Application to Grand Challenges to Computational Science." The first two years of the three-year effort are now complete; the CAVE Virtual Reality Theater [DeFanti et al. 1993] has been built and applied to dozens of Grand (and National) Challenge projects [Cruz et al. 1994; Roy et al., in press; Leigh et al. 1994; Das et al. 1994], and quantitative assessment of its human-computer interface engineering is underway [Kenyon and Afenya 1994; Ghazisaedy et al., accepted; Vasilakis 1994].

The prototyping of the actual CAVE hardware and software was very successful and was greatly accelerated by an NSF CISE Institutional Infrastructure award (FY'94-98), with additional funding from ARPA for deployment of the technology, including its interfacing to high-performance computing and communication technologies. There are now three fully operational, standardized CAVEs: at the Electronic Visualization Laboratory (EVL) in Chicago, at the National Center for Supercomputing Applications (NCSA) in Champaign-Urbana, and at Argonne National Laboratory (ANL). A fourth CAVE is under construction at the ARPA Enterprise in Arlington, Virginia. Several other institutions and commercial manufacturing companies are partners in CAVE applications and technology transfer.

During the second year of the current grant, the CAVE became stable enough to begin quantitative experiments in the human-computer interface domain. CAVE Grand Challenge work with scientists made it clear that understanding and insight were achievable through visual presentation; this is not surprising, since virtual reality (VR) in this effort is an extension of scientific visualization. The literature in the field, however, is discouraging regarding transfer of training for tasks requiring hand-eye coordination, at least when head-mounted displays are used [Kozak et al. 1992]. If virtual reality is to be used to train humans in critical performance tasks, the key question becomes which kinds of training transfer between the virtual and physical worlds.

Kozak's experiments were adapted to the CAVE and showed significant performance improvement in VR-trained subjects compared to an untrained population performing the same task. These positive transfer-of-training results support an early hypothesis that the CAVE's differing approach to VR (namely, that one can always see one's hand in the scene) would give better results. New experiments are now proposed to quantitatively assess perhaps the most important aspect of virtual reality research: the relationship between the technology and the training task that yields maximum transfer of training. Early results indicate that latency is a severe problem and that restricting VR solely to the visual domain limits its training applications. This proposal addresses the quantitative assessment of this line of inquiry, including the testing procedures and the hardware/software improvements needed. The results will have broad applicability in the National Challenge areas, particularly manufacturing and education.

It is proposed here to extend the study of latency measurement to wide area networks, using CAVE-to-CAVE experiments over the vBNS national network as the model. Several corporate manufacturing partners are keenly interested in this technology for cooperative design and evaluation of complex products. EVL, NCSA and ANL will collaborate in this effort, with EVL as technical lead, providing the graduate students who will assess and reduce latency, create software tools, and work with companies. The cost of the vBNS connection, associated equipment and local loop is the subject of an EVL MetaCenter Regional Alliance proposal now recommended for FY'94-96 funding.
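The wide-area latency measurements described above can be illustrated with a minimal sketch: one site echoes timestamped tracker-state packets back to the other, and the round-trip time is sampled repeatedly. This is an assumption-laden illustration, not the project's actual instrumentation; the echo server, packet format, and loopback demo are hypothetical stand-ins for a real CAVE-to-CAVE link over the vBNS.

```python
import socket
import struct
import threading
import time

def echo_server(sock, count):
    """Echo each received packet back to its sender.

    Stands in for the remote CAVE site in this loopback sketch."""
    for _ in range(count):
        data, addr = sock.recvfrom(64)
        sock.sendto(data, addr)

def measure_rtt(host, port, trials=20):
    """Median round-trip time, in seconds, for small timestamped UDP packets."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(2.0)
    samples = []
    for _ in range(trials):
        t0 = time.monotonic()
        sock.sendto(struct.pack("!d", t0), (host, port))  # timestamped packet
        sock.recvfrom(64)                                 # wait for the echo
        samples.append(time.monotonic() - t0)
    sock.close()
    samples.sort()
    return samples[len(samples) // 2]                     # median sample

# Loopback demo standing in for a CAVE-to-CAVE network link.
srv = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
srv.bind(("127.0.0.1", 0))                                # OS picks a free port
port = srv.getsockname()[1]
threading.Thread(target=echo_server, args=(srv, 20), daemon=True).start()
rtt = measure_rtt("127.0.0.1", port, trials=20)
print("median RTT: %.3f ms" % (rtt * 1e3))
```

The median is used rather than the mean so that occasional packet-scheduling outliers do not dominate the estimate.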

This proposal asks for continued summer support for three research faculty and full-year research assistantships for four graduate students. Although many CAVE applications are now supported through other grants, this proposal is the only one that addresses testing and evaluation of the human-computer interface and transfer of training. The quantitative assessment of complex VR systems is extremely important work, especially to validate the application of this technology to manufacturing. In addition, computer science and engineering students will gain much-needed skills in developing measurement strategies for real-time interactive systems.


Kenyon, R.V. and Afenya, M. Training in real and virtual environments. Submitted to Annals of Biomedical Engineering.

Kenyon, R. and Kneller, E. Effects of field-of-view on control of roll motion. IEEE Trans. Systems, Man and Cybern., Vol. 23, 183-193, 1993.

Previc, F., Kenyon, R., Boer, E., and Johnson, B. The effects of visual roll stimulation on postural and manual control and self-motion perception. Perception and Psychophysics, Vol. 54, 93-107, 1993.

Cruz-Neira, C., Sandin, D., DeFanti, T., Kenyon, R., and Hart, J. The CAVE Audio-Visual Environment. Communications of the ACM, Vol. 35, No. 6, pp. 65-72, 1992.

Boer, E. and Kenyon, R. Identification of Time Varying Systems. IEEE Conf. on BioMed. Eng., Paris, 1992.


Virtual reality may best be defined as the wide-field presentation of computer-generated, multi-sensory information that tracks a user in real time. In addition to the better-known modes of virtual reality -- head-mounted displays and boom-mounted displays -- the Electronic Visualization Laboratory at the University of Illinois at Chicago recently introduced a third mode: a room constructed from large screens onto which graphics are projected on three walls and the floor.

The CAVE is a multi-person, room-sized, high-resolution, 3D video and audio environment. Graphics are rear-projected in stereo onto three walls and the floor, and viewed with stereo glasses. As a viewer wearing a location sensor moves within its display boundaries, the correct perspective and stereo projections of the environment are updated, and the image moves with and surrounds the viewer. The other viewers in the CAVE are like passengers in a bus, along for the ride!
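The perspective update described above can be sketched as an off-axis projection computed from the tracked head position: each eye gets its own frustum through the fixed wall, so the imagery stays geometrically correct as the viewer moves. This is a simplified illustration for a single axis-aligned wall, not the CAVE library's actual code; the function names, coordinate conventions, and dimensions are hypothetical.

```python
def off_axis_frustum(eye, wall_left, wall_right, wall_bottom, wall_top,
                     wall_z, near):
    """Frustum extents at the near plane for a tracked eye position.

    eye     -- (x, y, z) of the eye in room coordinates
    wall_*  -- extents of the fixed projection wall (same units)
    wall_z  -- z coordinate of the wall plane (viewer looks toward -z)
    near    -- near-clip distance
    """
    ex, ey, ez = eye
    dist = ez - wall_z          # perpendicular eye-to-wall distance
    scale = near / dist         # similar triangles: wall plane -> near plane
    left = (wall_left - ex) * scale
    right = (wall_right - ex) * scale
    bottom = (wall_bottom - ey) * scale
    top = (wall_top - ey) * scale
    return left, right, bottom, top

def stereo_eyes(head, iod=0.2):
    """Left/right eye positions from the tracked head (roll ignored).

    iod is the interocular distance in room units (here, feet)."""
    hx, hy, hz = head
    half = iod / 2.0
    return (hx - half, hy, hz), (hx + half, hy, hz)

# Example: a 10x10 ft front wall at z = -5 ft, viewer standing off-center.
left_eye, right_eye = stereo_eyes((1.0, 5.5, 3.0))
print(off_axis_frustum(left_eye, -5, 5, 0, 10, -5.0, near=0.1))
```

Because the walls are fixed and only the eyes move, the frustum becomes asymmetric as the viewer walks off-center; recomputing it every frame from the sensor reading is what keeps the projected scene stationary in the room.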

"CAVE," the name selected for the virtual reality theater, is both a recursive acronym (Cave Automatic Virtual Environment) and a reference to "The Simile of the Cave" found in Plato's "Republic," in that the philosopher compares one's perceptions of living in and outside a physical cave with two states of mind: illusion and belief.

The CAVE premiered at the ACM SIGGRAPH 92 conference. It is achieving national recognition as an excellent virtual reality prototype and a compelling display environment for computational science and engineering data.


Kenyon, R.V. and Afenya, M. Transfer-of-training in real and virtual environments. Presented at the First Annual Conference on Movement and Control in Man, Berkeley, CA, June 14-15, 1994.

Kenyon, R.V. and Kneller, E.W. Human performance and field-of-view. Soc. for Inform. Display Intern. Sympos., Vol. 23, 290-293, 1992.

Ghazisaedy, M., Adamczyk, D., Sandin, D., Kenyon, R., and DeFanti, T. Ultrasonic Calibration of a Magnetic Tracker in a Virtual Reality Space. Accepted: Conference on Virtual Environments, Dec. 1994.

Kenyon, R.V. and Young, L.R. MIT Canadian vestibular experiments on Spacelab-1 mission: 5. Postural responses following exposure to weightlessness. Exp. Brain Res., Vol. 64, 335-346, 1986.

Kenyon, R.V. Effects of enhanced disparity on manual control. Final Report, AF Office of Scientific Research, 1981.


2. Speech and Natural Language Understanding. 3. Other Communication Modalities. 4. Adaptive Human Interfaces. 5. Usability and User-Centered Design. 6. Intelligent Interactive Systems for Persons with Disabilities.