Haptic interaction with 2D images

  • Authors:
  • Shahzad Rasool; Alexei Sourin

  • Affiliations:
  • Nanyang Technological University; Nanyang Technological University

  • Venue:
  • Proceedings of the 10th International Conference on Virtual Reality Continuum and Its Applications in Industry
  • Year:
  • 2011

Abstract

Visual and haptic rendering pipelines exist concurrently and compete for computing resources, while the refresh rate of haptic rendering is two orders of magnitude higher than that of visual rendering (1000 Hz vs. 30-50 Hz). In many cases, however, 3D visual rendering can be replaced by merely displaying 2D images, thus releasing resources to image-driven haptic rendering algorithms. These algorithms provide haptic texture rendering in the vicinity of a touch point, but they usually require additional information to be augmented with the image to convey haptic perception of the geometry of the shapes it displays. We propose a framework for making tangible images that allows haptic perception of three features: scene geometry, texture, and physical properties. The haptic geometry rendering technique uses depth information, which can be acquired in a multitude of ways, to provide haptic interaction with images and videos in real time. The presented method neither performs 3D reconstruction nor requires polygonal models. It is based on direct force calculation and allows for smooth haptic interaction even at object boundaries. We also propose dynamic mapping of the haptic workspace in real time to enable the sensation of fine surface details. Alternatively, one of the existing shading-based haptic texture rendering methods can be combined with the proposed haptic geometry rendering algorithm to provide believable interaction. Haptic perception of physical properties is achieved by automatic segmentation of an image into haptic regions and interactive assignment of physical properties to them.
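
The abstract's idea of computing forces directly from depth information, without 3D reconstruction or polygonal models, can be illustrated with a minimal C++ sketch. This is not the authors' algorithm; it only shows one plausible penalty-force scheme driven by a depth image, and all names (DepthMap, depthPenaltyForce, kStiffness) are hypothetical. The surface height under the haptic interaction point is sampled from the depth map with bilinear interpolation, and a spring force along an image-gradient normal is returned when the device tip penetrates the surface, as would be done once per ~1000 Hz haptic frame.

#include <algorithm>
#include <array>
#include <cmath>
#include <vector>

// Illustrative sketch only: a simple penalty-force scheme driven by a
// depth image (not the paper's method).
struct DepthMap {
    int width = 0, height = 0;
    std::vector<float> depth;          // row-major, one value per pixel

    float at(int x, int y) const {
        x = std::clamp(x, 0, width - 1);
        y = std::clamp(y, 0, height - 1);
        return depth[static_cast<size_t>(y) * width + x];
    }

    // Bilinear interpolation keeps the rendered force continuous between pixels.
    float sample(float x, float y) const {
        int x0 = static_cast<int>(std::floor(x));
        int y0 = static_cast<int>(std::floor(y));
        float fx = x - x0, fy = y - y0;
        float top = at(x0, y0) * (1 - fx) + at(x0 + 1, y0) * fx;
        float bot = at(x0, y0 + 1) * (1 - fx) + at(x0 + 1, y0 + 1) * fx;
        return top * (1 - fy) + bot * fy;
    }
};

// Penalty force for a device tip at (x, y, z), with z growing towards the viewer.
std::array<float, 3> depthPenaltyForce(const DepthMap& map,
                                       float x, float y, float z,
                                       float kStiffness = 0.5f) {
    float surface = map.sample(x, y);
    float penetration = surface - z;   // > 0 when the tip is below the surface
    if (penetration <= 0.0f) return {0.0f, 0.0f, 0.0f};

    // Central differences over the depth image give a surface-normal estimate,
    // so the force direction follows the depicted shape rather than the view axis.
    float gx = (map.sample(x + 1.0f, y) - map.sample(x - 1.0f, y)) * 0.5f;
    float gy = (map.sample(x, y + 1.0f) - map.sample(x, y - 1.0f)) * 0.5f;
    float nx = -gx, ny = -gy, nz = 1.0f;
    float len = std::sqrt(nx * nx + ny * ny + nz * nz);

    float magnitude = kStiffness * penetration;
    return {magnitude * nx / len, magnitude * ny / len, magnitude * nz / len};
}

In such a sketch, haptic texture and per-region material properties would modulate kStiffness (and possibly add friction or damping terms) after looking up the segmented haptic region under (x, y); the abstract's dynamic workspace mapping and boundary handling are not reproduced here.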