Many virtual environments require walking interfaces to explore virtual worlds much larger than the available real-world tracked space. We present a model, based on walking biomechanics, for generating virtual locomotion speeds from Walking-In-Place (WIP) inputs. By applying gait principles, our model, called Gait-Understanding-Driven Walking-In-Place (GUD WIP), produces output speeds that better match those observed in Real Walking and that respond better to variations in step frequency, including realistic starting and stopping. The speeds output by our implementation exhibit considerably less within-step fluctuation than those of a good current WIP system, Low-Latency, Continuous-Motion (LLCM) WIP, while remaining responsive to changes in user input. We compared the resulting speeds from Real Walking, GUD WIP, and LLCM-WIP in a user study: the average output speeds for Real Walking and GUD WIP vary consistently with changing step frequency, whereas LLCM-WIP is far less consistent. GUD WIP also produces output speeds that are more locally consistent (smoother) and more consistent in the mapping from step frequency to walking speed than LLCM-WIP.
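To illustrate the core idea of mapping step frequency to a biomechanically plausible walking speed, here is a minimal sketch. It assumes, purely for illustration, that step length grows linearly with step frequency and scales with user height; the constants and the function name are hypothetical and are not the published GUD WIP parameters or algorithm.

```python
def walk_speed_from_step_frequency(step_freq_hz: float,
                                   height_m: float = 1.75,
                                   a: float = 0.2,
                                   b: float = 0.18) -> float:
    """Estimate forward walking speed (m/s) from in-place step frequency.

    Hypothetical sketch: assumes step length is linear in step frequency
    and proportional to user height (a, b are illustrative constants,
    not values from the GUD WIP paper).
    """
    if step_freq_hz <= 0.0:
        return 0.0  # no steps detected: user is standing still
    # Assumed step-length model: taller users and faster stepping
    # both yield longer steps.
    step_length_m = height_m * (a + b * step_freq_hz)
    # Speed is step frequency times step length.
    return step_freq_hz * step_length_m
```

A frequency-driven mapping like this is what lets average output speed vary consistently with step frequency, in contrast to signal-processing approaches that derive speed directly from within-step head or heel motion.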