Eye Tracking and Gaze Based Interaction within Immersive Virtual Environments
ICCS 2009 Proceedings of the 9th International Conference on Computational Science
In collaborative situations, eye gaze is a critical element of behavior that supports and fulfills many activities and roles. Current computer-supported collaboration systems support eye gaze poorly: even in a state-of-the-art video conferencing system such as the Access Grid, one can see a user's face, but much of the communicative power of eye gaze is lost. This article gives an overview of preliminary work towards integrating eye gaze into an immersive collaborative virtual environment and assessing the impact this has on interaction between users of such a system. Three experiments were conducted to assess the efficacy of eye gaze within immersive virtual environments. In each, subjects observed on a large screen the eye-gaze behavior of an avatar; that behavior had previously been recorded from a user wearing a head-mounted eye tracker. The first experiment assessed how well subjects could judge which objects an avatar was looking at when shown head gaze alone versus combined head- and eye-gaze data. The results show that eye gaze is of vital importance to subjects' ability to correctly identify what a person is looking at in an immersive virtual environment. The second experiment examined whether a monocular or a binocular eye tracker would be required, by testing subjects' ability to identify where an avatar was looking from eye direction alone or from eye direction combined with convergence. This experiment showed that convergence had a significant impact on the subjects' ability to identify where the avatar was looking. The final experiment compared stereo and mono viewing of the scene, again asking subjects to identify where the avatar was looking; it showed no difference in subjects' ability to detect where the avatar was gazing. The article concludes with a description of how the eye-tracking system has been integrated into an immersive collaborative virtual environment, along with some preliminary results from the use of such a system.
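The binocular condition in the second experiment implies recovering a 3D gaze point from the convergence of the two eye rays. A minimal sketch of that geometry (illustrative only, not the article's implementation; the function name and parameters are assumptions) treats each eye as a ray from its position along its gaze direction and takes the converged gaze point to be the midpoint of the shortest segment between the two rays:

```python
# Illustrative sketch (not from the article): binocular gaze-point
# estimation from two eye rays via their closest approach.

def convergence_point(left_pos, left_dir, right_pos, right_dir, eps=1e-9):
    """Return the 3D convergence point of two gaze rays, or None if the
    rays are (nearly) parallel, i.e. the gaze is at infinity."""
    dot = lambda a, b: a[0]*b[0] + a[1]*b[1] + a[2]*b[2]
    a = dot(left_dir, left_dir)
    b = dot(left_dir, right_dir)
    c = dot(right_dir, right_dir)
    w = (left_pos[0] - right_pos[0],
         left_pos[1] - right_pos[1],
         left_pos[2] - right_pos[2])
    d = dot(left_dir, w)
    e = dot(right_dir, w)
    denom = a * c - b * b
    if abs(denom) < eps:
        return None  # parallel rays: no finite convergence point
    t = (b * e - c * d) / denom  # parameter along the left-eye ray
    s = (a * e - b * d) / denom  # parameter along the right-eye ray
    p1 = [left_pos[i] + t * left_dir[i] for i in range(3)]
    p2 = [right_pos[i] + s * right_dir[i] for i in range(3)]
    # Midpoint of the closest points on the two rays
    return tuple((p1[i] + p2[i]) / 2.0 for i in range(3))
```

For example, two eyes 6 cm apart whose rays both pass through a point 1 m ahead converge back on that point, whereas parallel gaze directions (fixation at infinity, as a monocular tracker would effectively assume) yield no finite point.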