Feel the "fabric": an audio-haptic interface

  • Authors: G. Huang; D. Metaxas; M. Govindaraj
  • Affiliations: University of Pennsylvania, Philadelphia, PA; Rutgers, The State University of New Jersey, Piscataway, NJ; Philadelphia University, Philadelphia, PA
  • Venue: Proceedings of the 2003 ACM SIGGRAPH/Eurographics Symposium on Computer Animation
  • Year: 2003

Abstract

An objective fabric modeling system should convey not only visual but also haptic and audio sensory feedback to remote/Internet users via an audio-haptic interface. In this paper we develop a fabric surface property modeling system consisting of stylus-based modeling of a fabric's characteristic sound and an audio-haptic interface. Using a stylus, people can perceive a fabric's surface roughness, friction, and softness, though not as precisely as with their bare fingers. The audio-haptic interface is intended to simulate "feeling a virtually fixed fabric through a rigid stylus" by using the PHANToM haptic interface. We develop a DFFT-based correlation-restoration method to model the surface roughness and friction coefficient of a fabric, and a physically based method to model the sound of a fabric when rubbed by a stylus. The audio-haptic interface, which renders synchronized auditory and haptic stimuli as the virtual stylus rubs the surface of a virtual fabric, is developed in VC++ 6.0 using OpenGL and the PHANToM GHOST SDK. Subjects who tested our audio-haptic interface were able to rank the surface properties of the virtual fabrics in the correct order. We show that the virtual fabric is a good model of its real counterpart.
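
As a rough illustration of the spectral modeling idea only: the abstract does not give the internals of the DFFT-based correlation-restoration method, so the spectrum-matching step, the naive DFT routine, and the toy measured trace in the C++ sketch below are assumptions rather than the authors' implementation, and no GHOST SDK calls are used. The sketch resynthesizes a 1-D surface height profile that preserves the amplitude spectrum, and hence the autocorrelation, of a measured stylus trace.

    // Minimal sketch (not the paper's code): resynthesize a 1-D fabric surface
    // height profile with the same amplitude spectrum -- and therefore the same
    // autocorrelation -- as a measured stylus trace, using a naive DFT.
    // The spectrum-matching interpretation of "correlation-restoration" is an
    // illustrative assumption.
    #include <cmath>
    #include <complex>
    #include <cstdlib>
    #include <iostream>
    #include <vector>

    using cd = std::complex<double>;
    const double kPi = 3.14159265358979323846;

    // Naive O(N^2) discrete Fourier transform; sign = -1 forward, +1 inverse.
    std::vector<cd> dft(const std::vector<cd>& x, int sign) {
        const std::size_t n = x.size();
        std::vector<cd> out(n);
        for (std::size_t k = 0; k < n; ++k) {
            cd sum(0.0, 0.0);
            for (std::size_t j = 0; j < n; ++j) {
                const double ang = sign * 2.0 * kPi * static_cast<double>(k * j) / n;
                sum += x[j] * cd(std::cos(ang), std::sin(ang));
            }
            out[k] = (sign > 0) ? sum / static_cast<double>(n) : sum;
        }
        return out;
    }

    // Keep each frequency's magnitude but randomize its phase, preserving
    // Hermitian symmetry so the resynthesized profile stays real-valued.
    std::vector<double> resynthesize(const std::vector<double>& measured) {
        std::vector<cd> x(measured.begin(), measured.end());
        const std::vector<cd> spectrum = dft(x, -1);

        const std::size_t n = spectrum.size();
        std::vector<cd> randomized(n);
        randomized[0] = spectrum[0];                     // keep the mean height
        for (std::size_t k = 1; k < (n + 1) / 2; ++k) {
            const double phase = 2.0 * kPi * std::rand() / RAND_MAX;
            const cd c = std::abs(spectrum[k]) * cd(std::cos(phase), std::sin(phase));
            randomized[k] = c;
            randomized[n - k] = std::conj(c);
        }
        if (n % 2 == 0) randomized[n / 2] = std::abs(spectrum[n / 2]);

        const std::vector<cd> y = dft(randomized, +1);
        std::vector<double> profile(n);
        for (std::size_t i = 0; i < n; ++i) profile[i] = y[i].real();
        return profile;
    }

    int main() {
        // Toy "measured" trace: a coarse weave period plus finer texture.
        std::vector<double> measured(128);
        for (std::size_t i = 0; i < measured.size(); ++i)
            measured[i] = 0.05 * std::sin(2.0 * kPi * i / 16.0) +
                          0.01 * std::sin(2.0 * kPi * i / 4.0);

        const std::vector<double> synth = resynthesize(measured);
        std::cout << "First synthesized heights:";
        for (int i = 0; i < 5; ++i) std::cout << ' ' << synth[i];
        std::cout << '\n';
        return 0;
    }

A profile with matched second-order statistics is one plausible input for the texture and friction forces rendered by the haptic device; the actual roughness/friction model and the physically based sound synthesis described in the paper are richer than this sketch.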