Third-Person Navigation of Whole-Planet Terrain in a Head-tracked Stereoscopic Environment

  • Authors:
  • Zachary Wartell; William Ribarsky; Larry Hodges

  • Venue:
  • VR '99: Proceedings of the IEEE Virtual Reality Conference
  • Year:
  • 1999

Abstract

Navigation and interaction in virtual environments that use stereoscopic head-tracked displays and have very large data sets present several challenges beyond those encountered with smaller data sets and simpler displays. First, zooming by approaching or retreating from a target must be augmented by integrating scale as a seventh degree of freedom. Second, to maintain good stereoscopic imagery, the interface must: maintain stereo image pairs that the user perceives as a single 3D image; minimize loss of perceived depth, since stereoscopic imagery cannot properly occlude the screen's frame; provide maximum depth information; and place objects at distances where they are best manipulated. Finally, the navigation interface must work when the environment is displayed at any scale. This paper addresses these problems for god's-eye-view, or third-person, navigation of a specific large-scale virtual environment: a high-resolution terrain database covering an entire planet.
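The "scale as a seventh degree of freedom" idea can be sketched as a uniform scale of the world about a pivot point (e.g., the point under the cursor), so zooming changes the environment's size rather than merely moving the viewpoint. This is a minimal illustrative sketch, not the paper's implementation; the function name and API are hypothetical.

```python
import numpy as np

def scale_about_pivot(points, pivot, factor):
    """Seventh-DOF 'scale' operation (hypothetical sketch, not the
    paper's code): uniformly scale world-space points about a pivot.
    The pivot stays fixed; every other point moves toward the pivot
    (factor < 1, zooming out) or away from it (factor > 1, zooming in)."""
    points = np.asarray(points, dtype=float)
    pivot = np.asarray(pivot, dtype=float)
    # p' = pivot + factor * (p - pivot): distances to the pivot
    # scale by `factor`, so the pivot is invariant.
    return pivot + factor * (points - pivot)

# Example: halve the world's scale about a pivot between two points.
pts = np.array([[0.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
pivot = np.array([1.0, 0.0, 0.0])
scaled = scale_about_pivot(pts, pivot, 0.5)
```

Because the scale factor is part of the navigation state rather than a one-off transform, the interface can adapt stereo parameters (e.g., keeping terrain at comfortable screen-relative depths) as the planet is viewed at vastly different scales.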