Perceptual sensitivity to head tracking latency in virtual environments with varying degrees of scene complexity

  • Authors:
  • Katerina Mania; Bernard D. Adelstein; Stephen R. Ellis; Michael I. Hill

  • Affiliations:
  • University of Sussex, UK; NASA Ames Research Center, USA; NASA Ames Research Center, USA; NASA Ames Research Center, USA

  • Venue:
  • APGV '04: Proceedings of the 1st Symposium on Applied Perception in Graphics and Visualization
  • Year:
  • 2004


Abstract

System latency (time delay) and its visible consequences are fundamental virtual environment (VE) deficiencies that can hamper user perception and performance. The aim of this research is to quantify the role of VE scene content and the resultant relative object motion in perceptual sensitivity to VE latency. Latency detection was examined by presenting observers, in a head-tracked, stereoscopic head-mounted display, with environments of differing complexity ranging from simple geometric objects to a radiosity-rendered scene of two interconnected rooms. Latency discrimination was compared with results from a previous study in which only simple geometric objects, without radiosity rendering or a 'real-world' setting, were used. From the results of these two studies, it can be inferred that the Just Noticeable Difference (JND) for latency discrimination by trained observers averages ~15 ms or less, independent of scene complexity and real-world meaning. Such knowledge will help elucidate latency perception mechanisms and, in turn, guide VE designers in the development of latency countermeasures.
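The ~15 ms figure is a just noticeable difference obtained from a latency-discrimination task. As a rough illustration of how such a threshold can be extracted from discrimination data, the sketch below fits a cumulative-Gaussian psychometric function to hypothetical response proportions and reports a JND as half the 25-75% spread. The data values, the choice of psychometric function, and the JND convention are illustrative assumptions only, not the procedure or numbers reported in this paper.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

# Hypothetical data: added latency (ms) vs. proportion of trials on which
# the observer judged that stimulus as laggier than the baseline.
# These values are made up for illustration.
added_latency_ms = np.array([4.0, 8.0, 16.0, 33.0, 66.0, 100.0])
p_judged_laggier = np.array([0.52, 0.58, 0.71, 0.86, 0.95, 0.99])

def psychometric(x, mu, sigma):
    """Cumulative-Gaussian psychometric function."""
    return norm.cdf(x, loc=mu, scale=sigma)

# Fit the point of subjective equality (mu) and spread (sigma).
(mu, sigma), _ = curve_fit(
    psychometric, added_latency_ms, p_judged_laggier, p0=[20.0, 10.0]
)

# One common JND convention: half the interval between the 25% and 75%
# points of the fitted function, i.e. sigma * z(0.75).
jnd_ms = sigma * norm.ppf(0.75)
print(f"PSE: {mu:.1f} ms, estimated latency JND: {jnd_ms:.1f} ms")
```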