Intuitive and Interactive Modification of Large Finite Element Models
VIS '04 Proceedings of the conference on Visualization '04
Presence: Teleoperators and Virtual Environments
We have developed a multi-modal virtual environment setup that fuses visual and haptic images through a new autostereoscopic display and a force-feedback haptic device. Most earlier visualization systems that integrate stereo displays and haptic devices have relied on polarized or shutter glasses for stereo vision (see, for example, Veldkamp et al., 2002; Chen et al., 2002; and Brederson et al., 2000). In this paper, we discuss the development stages and components of our setup, which allows a user to touch, feel, and manipulate virtual objects through a haptic device while viewing them in stereo without any special eyewear. We also discuss the transformations involved in mapping the absolute coordinates of virtual objects into the visual and haptic workspaces, and the synchronization of cursor movements across these workspaces. Future applications of this work include (a) multi-modal visualization of planetary data and (b) planning of space mission operations in virtual environments.
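The workspace mapping described above can be sketched as an affine transform that carries the scene's world-space bounding box onto each device's workspace. The sketch below is illustrative only: the function name, the bounding-box values, and the uniform-scale choice are assumptions, not the paper's actual implementation.

```python
import numpy as np

def make_workspace_transform(src_min, src_max, dst_min, dst_max):
    """Build an affine map taking the source box onto the destination box.

    A single uniform scale (the smallest per-axis ratio) is used so the
    scene keeps its aspect ratio inside the target workspace.
    """
    src_min, src_max = np.asarray(src_min, float), np.asarray(src_max, float)
    dst_min, dst_max = np.asarray(dst_min, float), np.asarray(dst_max, float)
    scale = np.min((dst_max - dst_min) / (src_max - src_min))
    src_center = (src_min + src_max) / 2.0
    dst_center = (dst_min + dst_max) / 2.0

    def transform(p):
        # Center the point in world space, scale, then re-center in the
        # destination workspace.
        return dst_center + scale * (np.asarray(p, float) - src_center)

    return transform

# Hypothetical world-space bounding box of the virtual scene.
world_min, world_max = [-10.0, -10.0, -10.0], [10.0, 10.0, 10.0]

# Hypothetical device workspaces (metres): display volume and haptic reach.
to_visual = make_workspace_transform(world_min, world_max,
                                     [-0.20, -0.15, -0.10], [0.20, 0.15, 0.10])
to_haptic = make_workspace_transform(world_min, world_max,
                                     [-0.08, -0.06, -0.04], [0.08, 0.06, 0.04])
```

Because both transforms share the same world-space source, a cursor position read from the haptic device can be inverted to world coordinates and re-mapped into the visual workspace, keeping the haptic and graphical cursors synchronized.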