Eye gaze in virtual environments: evaluating the need and initial work on implementation

  • Authors:
  • Norman Murray;Dave Roberts;Anthony Steed;Paul Sharkey;Paul Dickerson;John Rae;Robin Wolff

  • Affiliations:
  • Norman Murray, Dave Roberts, Robin Wolff: Centre for Virtual Environments, University of Salford, Salford M5 4WT, U.K.
  • Anthony Steed: Department of Computer Science, University College London, Gower Street, London WC1E 6BT, U.K.
  • Paul Sharkey: Interactive Systems Research Group, Department of Cybernetics, University of Reading, RG6 6UR, U.K.
  • Paul Dickerson, John Rae: School of Human & Life Sciences, Roehampton University, London SW15 5PU, U.K.

  • Venue:
  • Concurrency and Computation: Practice & Experience - Distributed Simulation, Virtual Environments and Real-time Applications
  • Year:
  • 2009

Abstract

For efficient collaboration between participants, eye gaze is critical to interaction. Video conferencing either does not attempt to support eye gaze (e.g. AccessGrid) or only approximates it in round-table conditions (e.g. life-size telepresence). Immersive collaborative virtual environments represent remote participants through avatars that follow their tracked movements. By additionally tracking participants' eyes and representing their movement on the avatars, the line of gaze can be faithfully reproduced rather than approximated. This paper presents the results of initial work that tested whether the focus of gaze could be gauged more accurately when tracked eye movement was added to the head movement of an avatar observed in an immersive virtual environment. An experiment assessed how well users could judge which objects an avatar was looking at when only head movements were displayed, with the eyes remaining static, compared with when both eye gaze and head movement were displayed. The results show that eye gaze is of vital importance to subjects' ability to correctly identify what a person is looking at in an immersive virtual environment. This is followed by a description of the work now being undertaken following these positive results. We discuss the integration of an eye tracker more suitable for immersive mobile use, and the software and techniques developed to translate the user's real-world eye movements into calibrated eye gaze in an immersive virtual world. This will be used to create an immersive collaborative virtual environment supporting eye gaze, and in its ongoing experiments. Copyright © 2009 John Wiley & Sons, Ltd.
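The core idea of reproducing (rather than approximating) the line of gaze is to compose the tracked eye rotation with the tracked head rotation before rendering it on the avatar. The paper does not give its implementation, but the composition can be sketched as follows; the angle convention (yaw/pitch in radians, -Z as "forward") and the function names are illustrative assumptions, not the authors' code.

```python
import math

def rot_y(angle):
    """Rotation matrix about the vertical (yaw) axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]]

def rot_x(angle):
    """Rotation matrix about the horizontal (pitch) axis."""
    c, s = math.cos(angle), math.sin(angle)
    return [[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]]

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def gaze_direction(head_yaw, head_pitch, eye_yaw, eye_pitch):
    """World-space gaze vector for an avatar.

    The eye rotation (from the eye tracker, relative to the head) is
    composed with the head rotation (from the head tracker), then
    applied to a local 'forward' vector. Assumed convention: -Z forward.
    """
    forward = [0.0, 0.0, -1.0]
    head = mat_mul(rot_y(head_yaw), rot_x(head_pitch))
    eye = mat_mul(rot_y(eye_yaw), rot_x(eye_pitch))
    return mat_vec(mat_mul(head, eye), forward)
```

With eyes static (eye angles zero), the gaze direction collapses to the head direction alone; adding the tracked eye angles shifts the resulting vector, which is the extra information the experiment shows observers rely on.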