Autostereoscopic and Haptic Visualization for Space Exploration and Mission Design

  • Authors:
  • Cagatay Basdogan, Mitchell Lum, Jose Salcedo, Edward Chow, Stephen A. Kupiec, Andrew Kostrewski


  • Venue:
  • HAPTICS '02 Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems
  • Year:
  • 2002


  • Keywords:
  • Visualization

Abstract

We have developed a multi-modal virtual environment setup by fusing visual and haptic images through the use of a new autostereoscopic display and a force-feedback haptic device. Most earlier visualization systems that integrate stereo displays and haptic devices have relied on polarized or shutter glasses for stereo vision (see, for example, Veldkamp et al., 2002, Chen et al., 2002, and Brederson et al., 2000). In this paper, we discuss the development stages and components of our setup, which allows a user to touch, feel, and manipulate virtual objects through a haptic device while viewing them in stereo without any special eyewear. We also discuss the transformations involved in mapping the absolute coordinates of virtual objects into the visual and haptic workspaces, and the synchronization of cursor movements across these workspaces. Future applications of this work include (a) multi-modal visualization of planetary data and (b) planning of space mission operations in virtual environments.
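The workspace mapping described in the abstract can be illustrated with a short sketch. This is not the authors' code: the transform parameters (scales and offsets) and the function names are assumptions chosen for illustration. The idea is that a single world-space cursor position is mapped through separate affine transforms into the visual and haptic workspace coordinates each frame, keeping the two cursors in register.

```python
# Illustrative sketch (not the paper's implementation): mapping absolute
# (world) coordinates of virtual objects into visual and haptic workspace
# coordinates via 4x4 homogeneous affine transforms.
import numpy as np

def make_affine(scale, translation):
    """Build a 4x4 homogeneous transform with uniform scale and translation."""
    T = np.eye(4)
    T[:3, :3] *= scale
    T[:3, 3] = translation
    return T

def transform(T, p):
    """Apply homogeneous transform T to a 3D point p."""
    q = T @ np.append(p, 1.0)
    return q[:3]

# Hypothetical workspace calibrations; the scales and offsets would come
# from measuring the actual display volume and haptic device workspace.
world_to_visual = make_affine(scale=0.5, translation=[0.0, 0.1, -0.3])
world_to_haptic = make_affine(scale=0.2, translation=[0.0, 0.0, 0.0])

# Each frame, the same world-space cursor position is mapped into both
# workspaces, so the visual and haptic cursors stay synchronized.
cursor_world = np.array([0.1, 0.2, 0.3])
cursor_visual = transform(world_to_visual, cursor_world)
cursor_haptic = transform(world_to_haptic, cursor_world)
```

In practice the two transforms must be calibrated together so that a point touched by the haptic stylus appears at the corresponding location on the autostereoscopic display.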