In immersive virtual environments (IVEs), users can control their virtual viewpoint by moving their tracked head and by walking through the real world. Usually, movements in the real world are mapped one-to-one to virtual camera motions. With redirection techniques, the virtual camera is manipulated by applying gains to user motion so that the virtual world moves differently than the real world. Thus, users can walk through large-scale IVEs while physically remaining in a reasonably small workspace. In psychophysical experiments with a two-alternative forced-choice task, we quantified how much humans can unknowingly be redirected onto physical paths that differ from the visually perceived paths. We tested 12 subjects in three different experiments: (E1) discrimination between virtual and physical rotations, (E2) discrimination between virtual and physical straightforward movements, and (E3) discrimination of path curvature. In experiment E1, subjects performed rotations with different gains and then had to choose whether the visually perceived rotation was smaller or greater than the physical rotation. In experiment E2, subjects chose whether the physical walk was shorter or longer than the visually perceived scaled travel distance. In experiment E3, subjects estimated the path curvature while walking a curved path in the real world as the visual display showed a straight path in the virtual world. Our results show that users can be turned physically about 49 percent more or 20 percent less than the perceived virtual rotation, distances can be downscaled by 14 percent and upscaled by 26 percent, and users can be redirected onto a circular arc with a radius greater than 22 m while they believe that they are walking straight.
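The three manipulations above can be sketched as per-frame gains applied to tracked motion: a rotation gain scales real head turns (E1), a translation gain scales walked distance (E2), and a curvature gain injects extra virtual rotation proportional to the distance walked, which bends the user's real path onto an arc (E3). The following is a minimal illustrative sketch, not the authors' implementation; the function `redirect` and the convention gain = virtual motion / real motion are assumptions for illustration.

```python
import math

def redirect(real_dtheta_deg, real_dpos, rot_gain=1.0, trans_gain=1.0,
             curvature_radius_m=float("inf")):
    """Map one frame of real user motion to virtual camera motion.

    Hypothetical helper illustrating the three gain types (E1-E3);
    gains are defined as virtual motion divided by real motion.
    """
    dx, dy = real_dpos
    dist = math.hypot(dx, dy)                 # real distance walked this frame
    # E1: rotation gain scales the real head rotation.
    virtual_dtheta = rot_gain * real_dtheta_deg
    # E3: curvature gain injects rotation per meter walked; an infinite
    # radius injects nothing (dist / inf == 0.0 in Python).
    virtual_dtheta += math.degrees(dist / curvature_radius_m)
    # E2: translation gain scales the real travel distance.
    virtual_dist = trans_gain * dist
    return virtual_dtheta, virtual_dist

# One reading of the reported detection thresholds as gain ranges
# (assumption: gain = virtual / real motion):
ROT_GAIN_RANGE = (1 / 1.49, 1 / 0.80)   # physical turn 49% more .. 20% less
TRANS_GAIN_RANGE = (0.86, 1.26)         # distances downscaled 14% / upscaled 26%
MIN_CURVATURE_RADIUS_M = 22.0           # smallest unnoticed circular-arc radius
```

With identity gains the mapping is one-to-one; setting, for example, `curvature_radius_m=22.0` adds roughly 2.6 degrees of virtual rotation per meter walked, which is the steering budget available without users noticing.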