A Virtual Reality-Based Framework for Experiments on Perception of Manual Gestures

  • Authors:
  • Sebastian Ullrich, Jakob T. Valvoda, Marc Wolter, Gisela Fehrmann, Isa Werth, Ludwig Jaeger, Torsten Kuhlen

  • Affiliations:
  • Virtual Reality Group, RWTH Aachen University, Germany (Ullrich, Valvoda, Wolter, Kuhlen); Deaf & Sign Language Research Team Aachen, RWTH Aachen University, Germany (Fehrmann, Werth, Jaeger)

  • Venue:
  • Gesture-Based Human-Computer Interaction and Simulation
  • Year:
  • 2009

Abstract

This work contributes an integrated and flexible approach to sign language processing in virtual environments that allows for interactive experimental evaluations with high ecological validity. Initial steps deal with real-time tracking and processing of manual gestures. The motion data are rendered stereoscopically in immersive virtual environments with varying spatial and representational configurations. Beyond flexibility, the most important aspect is the seamless integration with VR-based neuropsychological experiment software. Ongoing studies conducted with this system contribute to the understanding of the cognition of sign language. The system benefits experimenters by providing a controlled, immersive three-dimensional environment that enables experiments on visual depth perception which cannot be realized with video presentations.