Robust prediction of auditory step feedback for forward walking

  • Authors:
  • Markus Zank, Thomas Nescher, Andreas Kunz

  • Affiliation (all authors):
  • Innovation Center Virtual Reality (ICVR), Institute of Machine Tools and Manufacturing, ETH Zurich

  • Venue:
  • Proceedings of the 19th ACM Symposium on Virtual Reality Software and Technology
  • Year:
  • 2013


Abstract

Virtual reality systems that support real walking as a navigation interface usually lack auditory step feedback, although such feedback could give the user additional information, e.g. about the ground they are walking on. To add matching auditory step feedback to virtual environments, we propose a calibration-free and easy-to-use system that predicts the occurrence time of step sounds based on human gait data. Our system relies on the timing of reliably occurring characteristic events in the gait cycle, which are detected using foot-mounted accelerometers and gyroscopes. This approach allows us not only to detect but also to predict the time of an upcoming step sound in real time. Based on data gathered in an experiment, we compare different suitable events that allow a trade-off between the precision of the prediction and the maximum time by which the sound can be predicted.
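The abstract's core idea (detect a characteristic gait-cycle event early, then predict the later step-sound time) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the threshold detector on gyroscope pitch rate, the hysteresis re-arming, and the running event-to-impact lead-time estimate are all assumptions chosen for clarity.

```python
class StepSoundPredictor:
    """Hypothetical sketch: predict an upcoming step-sound time from a
    characteristic event detected in foot-mounted gyroscope data."""

    def __init__(self, threshold=2.0, initial_lead=0.15):
        self.threshold = threshold    # gyro pitch rate (rad/s) marking the event (assumed value)
        self.lead = initial_lead      # current event-to-impact delay estimate (s)
        self.last_event_t = None
        self.armed = True             # detector is re-armed between steps

    def update(self, t, gyro_pitch):
        """Feed one IMU sample; return a predicted sound time or None."""
        if self.armed and gyro_pitch > self.threshold:
            self.armed = False
            self.last_event_t = t
            return t + self.lead      # predicted time of the step sound
        if gyro_pitch < 0.5 * self.threshold:
            self.armed = True         # hysteresis: re-arm once rate drops
        return None

    def observe_impact(self, t_impact, alpha=0.3):
        """After the actual heel strike, refine the lead-time estimate."""
        if self.last_event_t is not None:
            measured = t_impact - self.last_event_t
            self.lead = (1 - alpha) * self.lead + alpha * measured


# Usage: stream (time, gyro pitch rate) samples through the predictor.
p = StepSoundPredictor()
for t, w in [(0.00, 0.1), (0.05, 2.5), (0.10, 0.2)]:
    pred = p.update(t, w)
    if pred is not None:
        print(f"event at {t:.2f}s -> sound predicted at {pred:.2f}s")
```

Choosing an earlier gait event buys more prediction lead time at the cost of precision; a later event is more precise but leaves less time to schedule the sound, which is exactly the trade-off the paper evaluates.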