Taking steps: the influence of a walking technique on presence in virtual reality
ACM Transactions on Computer-Human Interaction (TOCHI) - Special issue on virtual reality software and technology
Walking > walking-in-place > flying, in virtual environments
Proceedings of the 26th annual conference on Computer graphics and interactive techniques
Effect of turning strategy on maneuvering ability using the treadport locomotion interface
Presence: Teleoperators and Virtual Environments
Spatial Orientation in Virtual Environments: Background Considerations and Experiments
Spatial Cognition, An Interdisciplinary Approach to Representing and Processing Spatial Knowledge
Updating orientation in large virtual environments using scaled translational gain
APGV '06 Proceedings of the 3rd symposium on Applied perception in graphics and visualization
Virtual Locomotion: Walking in Place through Virtual Environments
Presence: Teleoperators and Virtual Environments
LLCM-WIP: Low-Latency, Continuous-Motion Walking-in-Place
3DUI '08 Proceedings of the 2008 IEEE Symposium on 3D User Interfaces
Do we need to walk for effective virtual reality navigation? physical rotations alone may suffice
SC'10 Proceedings of the 7th international conference on Spatial cognition
Walking improves your cognitive map in environments that are large-scale and large in extent
ACM Transactions on Computer-Human Interaction (TOCHI)
Evaluation of walking in place on a Wii balance board to explore a virtual environment
ACM Transactions on Applied Perception (TAP)
GUD WIP: Gait-Understanding-Driven Walking-In-Place
VR '10 Proceedings of the 2010 IEEE Virtual Reality Conference
In this work, we present a simple method of "walking in place" (WIP) using the Microsoft Kinect to explore a virtual environment (VE) with a head-mounted display (HMD). Other studies have shown that using WIP to explore a VE is equivalent to normal walking in terms of spatial orientation. This suggests that WIP is a promising way to explore a large VE. The Microsoft Kinect sensor is well suited for implementing WIP because it provides real-time skeletal tracking and is relatively inexpensive (150 USD). However, the skeletal information obtained from Kinect sensors can be noisy. Thus, in this work, we discuss how we combined the data from two Kinects to implement a robust WIP algorithm. As part of our analysis of how best to implement WIP with the Kinect, we compare gaze-directed locomotion to torso-directed locomotion. We report that participants' spatial orientation was better when they translated forward in the VE in the direction they were looking.