WYSIWYF Display: A Visual/Haptic Interface to Virtual Environment

  • Authors:
  • Yasuyoshi Yokokohji; Ralph L. Hollis; Takeo Kanade

  • Affiliations:
  • Department of Mechanical Engineering, Kyoto University, Kyoto 606-8501, Japan, http://www.cs.cmu.edu/~msl/virtual_desc.html, yokokoji@mech.kyoto-u.ac.jp; The Robotics Institute, Carnegie Mellon University, 5000 Forbes Ave., Pittsburgh, PA 15213; The Robotics Institute, Carnegie Mellon University, 5000 Forbes Ave., Pittsburgh, PA 15213

  • Venue:
  • Presence: Teleoperators and Virtual Environments
  • Year:
  • 1999

Abstract

To build a VR training system for visuomotor skills, the image displayed by the visual interface should be correctly registered to the haptic interface so that the visual and haptic sensations are consistent both spatially and temporally. In other words, it is desirable that what you see is what you feel (WYSIWYF). In this paper, we propose a method that realizes correct visual/haptic registration, namely WYSIWYF, by using a vision-based object-tracking technique and a video-keying technique. Combining an encountered-type haptic device with a motion-command-type haptic rendering algorithm makes it possible to handle two extreme cases, free motion and rigid constraint. This approach provides realistic haptic sensations, such as free-to-touch and move-and-collide: the user encounters the haptic device exactly when his or her hand reaches a virtual object in the display. We describe a first prototype and illustrate its use with several demonstrations. Although this prototype has some remaining technical problems to be solved, it serves well to show the validity of the proposed approach.
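
As a rough illustration of the video-keying step mentioned in the abstract (not the authors' actual implementation), the Python sketch below composites a live camera frame of the user's hand over a rendered virtual-environment frame by keying out a uniform backdrop color. The function name, the key color, and the tolerance are illustrative assumptions.

```python
import numpy as np

def chroma_key_composite(camera_rgb, rendered_rgb, key_color=(0, 0, 255), tol=60):
    """Composite a live camera frame over a rendered VE frame.

    Pixels close to the key color (e.g., a blue backdrop behind the user's
    hand) are replaced by the rendered virtual-environment pixels, so the
    real hand appears embedded in the virtual scene.

    camera_rgb, rendered_rgb : uint8 arrays of shape (H, W, 3)
    key_color                : backdrop color to key out (illustrative choice)
    tol                      : per-channel tolerance for the key match
    """
    cam = camera_rgb.astype(np.int16)
    key = np.array(key_color, dtype=np.int16)
    # A pixel counts as backdrop if every channel is within tol of the key color.
    is_backdrop = np.all(np.abs(cam - key) <= tol, axis=-1)
    out = camera_rgb.copy()
    out[is_backdrop] = rendered_rgb[is_backdrop]
    return out

if __name__ == "__main__":
    # Synthetic frames for a minimal usage example.
    h, w = 240, 320
    camera = np.full((h, w, 3), (0, 0, 255), dtype=np.uint8)   # blue backdrop
    camera[100:140, 150:200] = (200, 160, 130)                  # stand-in "hand"
    rendered = np.zeros((h, w, 3), dtype=np.uint8)               # virtual scene
    rendered[:, :, 1] = 80
    frame = chroma_key_composite(camera, rendered)
    print(frame.shape, frame.dtype)
```

In the actual system, the keyed camera view and the rendered view would first be brought into registration via the vision-based object tracking described in the paper; the sketch only shows the per-pixel compositing.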