Depth perception within peripersonal space using head-mounted display

  • Authors: Abdeldjallil Naceri; Ryad Chellali
  • Affiliations: University of Genova, 16126 Genoa, Italy, and Italian Institute of Technology, 16163 Genoa, Italy; Italian Institute of Technology, 16163 Genoa, Italy
  • Venue: Presence: Teleoperators and Virtual Environments
  • Year: 2011

Abstract

In this paper, we address depth perception in peripersonal space within three virtual environments: a visually poor environment (a dark room), a reduced-cue environment (a wireframe room), and a rich-cue environment (a lit, textured room). Observers binocularly viewed the virtual scenes through a head-mounted display and estimated the egocentric distance to spheres in visually open-loop pointing tasks. We conducted two experiments in all three virtual environments: the apparent size of the sphere was held constant in the first experiment and covaried with distance in the second. The results of the first experiment revealed that observers estimated depth more accurately in the rich virtual environment than in the visually poor and wireframe environments. Specifically, observers' pointing errors were small at distances up to 55 cm and increased with distance once the sphere was farther than 55 cm. Individual differences were found in the second experiment. Our results suggest that the quality of the virtual environment affects distance estimation within reaching space, and that manipulating the targets' size cue leads to individual differences in depth judgments. Finally, our findings confirm the use of vergence as an absolute distance cue in virtual environments within the arm's reaching space.
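
As context for the vergence finding, the relationship between vergence angle and fixation distance follows from simple triangulation. The sketch below is illustrative only; the interpupillary distance value is an assumption and is not a value reported in the paper. With interpupillary distance $a$ and a target fixated binocularly at distance $D$, the vergence angle $\theta$ satisfies $\tan(\theta/2) = \dfrac{a/2}{D}$, so $D = \dfrac{a/2}{\tan(\theta/2)}$. For an assumed $a = 6.3$ cm, a target at $D = 55$ cm corresponds to $\theta = 2\arctan(3.15/55) \approx 6.6^\circ$, while at 70 cm it drops to roughly $5.2^\circ$, which illustrates why vergence provides progressively coarser absolute distance information toward the limit of reaching space.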