Multiple-contact representation for the real-time volume haptic rendering of a non-rigid object

  • Authors:
  • Sang-Youn Kim;Jinah Park;Dong-Soo Kwon

  • Affiliations:
  • Telerobotics & Control Laboratory, Dept. of Mechanical Engineering, KAIST;Computer Graphics & Visualization Laboratory, School of Engineering, ICU;Telerobotics & Control Laboratory, Dept. of Mechanical Engineering, KAIST

  • Venue:
  • HAPTICS'04: Proceedings of the 12th International Conference on Haptic Interfaces for Virtual Environment and Teleoperator Systems
  • Year:
  • 2004

Abstract

This paper presents a fast haptic rendering method that provides the sense of touch from a virtual volumetric non-rigid object when a human operator interacts with the object at multiple points. Previously, we proposed a fast volume haptic rendering method based on the shape-retaining chain linked model (the S-chain model), which can handle the deformation of a volumetric non-rigid object and its haptic feedback in real time. One of the key differences between the S-chain model and a traditional FEM or mass-spring model is that the computation of the deformation and its reflected force is performed at a local level. When there is more than one interaction point with the object, a modeling framework is needed that can handle all of the human operator's inputs together. In this paper, we propose a modeling framework in which the forces generated at the interaction points are summed vectorially to handle the multiple contact points. Our experiments demonstrate that the proposed method is suitable for real-time volume haptic rendering of a volumetric non-rigid object with multiple contact points.
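
The abstract describes the multi-contact framework as a vectorial sum of the forces computed locally at each interaction point. The C++ sketch below illustrates only that summation step under assumed names: `Contact`, `contactForce`, and the stiffness gain are hypothetical placeholders, and `contactForce` stands in for the S-chain model's local force response, which is not specified here.

```cpp
// Minimal sketch (not the authors' code): vectorial summation of per-contact
// reflected forces, as described in the abstract. contactForce() is a
// hypothetical stand-in for the S-chain model's local force computation.
#include <vector>

struct Vec3 {
    double x{0.0}, y{0.0}, z{0.0};
    Vec3& operator+=(const Vec3& o) { x += o.x; y += o.y; z += o.z; return *this; }
};

struct Contact {
    Vec3 position;      // contact location on the volumetric object
    Vec3 displacement;  // displacement imposed by the haptic tool at this contact
};

// Placeholder for the model's local force response at one contact point.
Vec3 contactForce(const Contact& c) {
    const double k = 200.0;  // assumed stiffness-like gain, for illustration only
    return {k * c.displacement.x, k * c.displacement.y, k * c.displacement.z};
}

// Total reflected force: the per-contact forces are summed vectorially,
// following the framework described in the abstract.
Vec3 totalReflectedForce(const std::vector<Contact>& contacts) {
    Vec3 total;
    for (const Contact& c : contacts) {
        total += contactForce(c);
    }
    return total;
}
```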