Trajectory-Based Grasp Interaction for Virtual Environments

  • Authors:
  • Zhenhua Zhu, Shuming Gao, Huagen Wan, Wenzhen Yang

  • Affiliations:
  • State Key Lab of CAD&CG, Zhejiang University, Hangzhou, P.R. China (all authors)

  • Venue:
  • CGI '06: Proceedings of the 24th International Conference on Advances in Computer Graphics
  • Year:
  • 2006


Abstract

Natural grasp interaction plays an important role in enhancing users' sense of immersion in virtual environments. However, visually distracting artifacts, such as interpenetration between the hand and the grasped objects, often accompany grasp interaction because of simplified whole-hand collision models, the discrete control data used for collision detection, and device noise. In addition, the complicated distribution of forces arising from multi-finger contacts makes natural grasping and manipulation of a virtual object difficult. To address these problems, this paper presents a novel approach to grasp interaction in virtual environments. Drawing on research in neurophysiology, we first construct the fingers' grasp trajectories and detect collisions between objects and these trajectories instead of a whole-hand collision model, then deduce the grasp configuration from the collision detection results, and finally compute feedback forces according to grasp identification conditions. Our approach has been verified in a CAVE-based virtual environment.
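To make the pipeline stated in the abstract concrete, the following is a minimal sketch, not the authors' implementation, of the three steps it names: collision detection against finger trajectories rather than a whole-hand model, deduction of a grasp configuration from the per-finger collision results, and computation of feedback forces once a grasp is identified. The sphere object model, the linearly sampled trajectories, the two-contact grasp test, and the spring-like force model are all illustrative assumptions; the paper's actual trajectory construction, which is grounded in neurophysiological findings, is not reproduced here.

# Hypothetical sketch of the trajectory-based grasp pipeline described in the abstract.
# All names, data layouts, and thresholds below are illustrative assumptions.
from dataclasses import dataclass
from typing import List, Tuple
import math

Point = Tuple[float, float, float]

@dataclass
class Sphere:
    # Grasped object approximated as a bounding sphere (assumption).
    center: Point
    radius: float

def sample_finger_trajectory(open_tip: Point, closed_tip: Point, steps: int = 10) -> List[Point]:
    # Assume each finger's grasp trajectory is sampled as points swept from the
    # open-hand fingertip position toward the closed-hand fingertip position.
    return [
        tuple(a + (b - a) * s / (steps - 1) for a, b in zip(open_tip, closed_tip))
        for s in range(steps)
    ]

def trajectory_hits_object(trajectory: List[Point], obj: Sphere) -> bool:
    # Collisions are tested against trajectory samples instead of a
    # whole-hand collision model (the key idea stated in the abstract).
    return any(math.dist(p, obj.center) <= obj.radius for p in trajectory)

def deduce_grasp(finger_trajectories: List[List[Point]], obj: Sphere):
    # Deduce a grasp configuration from the per-finger collision results.
    # Here we simply require at least two contacting fingers (assumption).
    contacts = [trajectory_hits_object(t, obj) for t in finger_trajectories]
    return contacts, sum(contacts) >= 2

def feedback_forces(contacts: List[bool], stiffness: float = 50.0, penetration: float = 0.005) -> List[float]:
    # Toy force model: a spring-like force per contacting finger; the paper's
    # grasp-identification-based force computation is not reproduced here.
    return [stiffness * penetration if c else 0.0 for c in contacts]

if __name__ == "__main__":
    obj = Sphere(center=(0.0, 0.0, 0.0), radius=0.03)
    # Two fingers (e.g. thumb and index) closing toward the object.
    thumb = sample_finger_trajectory((0.0, -0.08, 0.0), (0.0, -0.02, 0.0))
    index = sample_finger_trajectory((0.0, 0.08, 0.0), (0.0, 0.02, 0.0))
    contacts, grasped = deduce_grasp([thumb, index], obj)
    print("contacts:", contacts, "grasped:", grasped)
    print("forces:", feedback_forces(contacts))

Replacing the whole-hand collision model with per-finger trajectory samples, as in this sketch, is what lets the method avoid visible hand-object interpenetration: contact is resolved along each finger's closing path rather than after the full hand mesh has already penetrated the object.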