Influence of vision and haptics on plausibility of social interaction in virtual reality scenarios

  • Authors:
  • Zheng Wang; Ji Lu; Angelika Peer; Martin Buss

  • Affiliations:
  • Institute of Automatic Control Engineering, Technische Universität München, Munich, Germany (all authors)

  • Venue:
  • EuroHaptics '10: Proceedings of the 2010 International Conference on Haptics - Generating and Perceiving Tangible Sensations, Part II
  • Year:
  • 2010

Abstract

This paper focuses on the effects of visual and haptic feedback on the experienced plausibility of social interaction in a virtual reality scenario in which participants were asked to shake hands with a virtual, visually and haptically rendered partner. A 3D virtual environment was created and integrated with a handshaking robot, enabling participants to see the virtual partner while shaking hands with it. To assess the effect of visual and haptic rendering strategies on plausibility, an experiment with human subjects was carried out. The results indicate that both adding visual feedback and improving the quality of haptic rendering increase the experienced plausibility. Similar effect sizes further suggest that vision and haptics are of comparable importance to the perceived plausibility of a virtual handshaking task.
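
The abstract compares haptic rendering strategies of differing quality but does not specify the controllers used. As a purely illustrative sketch (not the authors' implementation), a handshake force for a haptic device is often rendered with an impedance-type spring-damper law that tracks a reference hand motion of the virtual partner; all parameter values and names below are assumptions for illustration only.

```python
import math

# Illustrative sketch only: a generic impedance-type handshake rendering law.
# Parameters (K, B, AMP, FREQ, DT) are assumed for illustration and are not
# values reported in the paper.

K = 300.0    # virtual stiffness [N/m]
B = 20.0     # virtual damping [N*s/m]
AMP = 0.04   # reference oscillation amplitude [m]
FREQ = 1.5   # handshake frequency [Hz]
DT = 0.001   # control period [s] (1 kHz haptic loop)

def reference(t):
    """Sinusoidal reference position and velocity of the virtual partner's hand."""
    w = 2.0 * math.pi * FREQ
    return AMP * math.sin(w * t), AMP * w * math.cos(w * t)

def handshake_force(x, v, t):
    """Force commanded to the haptic device: spring-damper toward the reference."""
    x_ref, v_ref = reference(t)
    return K * (x_ref - x) + B * (v_ref - v)

if __name__ == "__main__":
    # Example: one simulated step of the haptic loop.
    x, v, t = 0.0, 0.0, 0.0
    f = handshake_force(x, v, t)
    print(f"commanded force at t={t:.3f} s: {f:.2f} N")
```

In such a sketch, the "quality" of the haptic rendering could be varied, for example, by changing the stiffness and damping gains or by replacing the fixed sinusoidal reference with a more responsive partner model; the actual strategies compared in the experiment are described in the full paper.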