Dynamic eye convergence for head-mounted displays improves user performance in virtual environments

  • Authors:
  • Andrei Sherstyuk; Arindam Dey; Christian Sandor; Andrei State

  • Affiliations:
  • University of Hawaii; University of South Australia; University of South Australia; InnerOptic Technology Inc. and University of North Carolina at Chapel Hill

  • Venue:
  • I3D '12: Proceedings of the ACM SIGGRAPH Symposium on Interactive 3D Graphics and Games
  • Year:
  • 2012

Abstract

In Virtual Environments (VE), users often face tasks that involve direct manipulation of virtual objects at close distances, such as touching, grabbing, and placing them. In immersive systems that employ head-mounted displays, these tasks can be quite challenging due to the lack of convergence of the virtual cameras. We present a mechanism that dynamically converges the left and right cameras on target objects in the VE. This mechanism automatically simulates the natural convergence process that takes place in real life. As a result, the rendering system maintains optimal conditions for stereoscopic viewing of target objects at varying depths, in real time. Building on our previous work, which introduced the eye convergence algorithm [Sherstyuk and State 2010], we developed a Virtual Reality (VR) system and conducted an experimental study on the effects of eye convergence in immersive VE. This paper gives a full description of the system, the study design, and a detailed analysis of the results obtained.
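
The abstract describes rotating the two virtual cameras so that their optical axes meet at the current target object. The sketch below is only an illustration of that toe-in geometry under simplifying assumptions, not the authors' implementation: it assumes a target located straight ahead of the viewer, and the function names and the 64 mm interpupillary distance are illustrative choices, not values from the paper.

    import math

    def convergence_angle(ipd: float, target_distance: float) -> float:
        """Inward rotation (radians) for each eye camera so that both
        optical axes intersect at a fixation point straight ahead,
        target_distance meters away, given interpupillary distance ipd."""
        return math.atan2(ipd / 2.0, target_distance)

    def eye_yaw_offsets(ipd: float, target_distance: float):
        """Yaw offsets (left eye, right eye) to apply to the two cameras;
        each eye turns inward by the same angle, with opposite signs."""
        a = convergence_angle(ipd, target_distance)
        return (+a, -a)

    if __name__ == "__main__":
        ipd = 0.064  # assumed typical interpupillary distance, in meters
        for d in (0.3, 0.5, 1.0, 5.0):
            a = convergence_angle(ipd, d)
            print(f"target at {d:4.1f} m -> each camera toes in "
                  f"{math.degrees(a):5.2f} deg")

For a fixation point that is not on the midline, the rotation would be computed per eye toward the actual 3D target position; as the abstract notes, the paper's system selects that target in the scene and updates the convergence in real time.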