Multimodal floor for immersive environments

  • Authors:
  • Alvin W. Law; Jessica W. Ip; Benjamin V. Peck; Yon Visell; Paul G. Kry; Jeremy R. Cooperstock

  • Affiliation:
  • McGill University, Montreal, Canada (all authors)

  • Venue:
  • ACM SIGGRAPH 2009 Emerging Technologies
  • Year:
  • 2009

Abstract

We have developed an interactive system that allows untethered users to experience walking on virtual ground surfaces resembling natural materials. The demonstration consists of a multimodal floor interface that provides auditory, tactile, and visual feedback in response to users' steps. It is intended for immersive virtual and augmented reality environments (VEs) that aim to convey the impression of walking over natural ground surfaces, such as snow and ice. To date, immersive environments with interactive floor surfaces have largely focused on visual and auditory feedback linked to a VE simulation (e.g., [Gronbaek 2007]; see also the comparative review in [Miranda and Wanderley 2006]). However, while walking in natural environments, we receive continuous, multisensory information about the nature of the ground we walk on -- the crush of dry leaves, the soft compression of grass. The static floor surfaces of existing VEs typically bear little resemblance to a given natural ground material, creating a perceptual conflict with the dynamic visual and/or auditory feedback provided to users in the VE. This project illustrates a novel approach to reconciling such perceptual conflicts, based on multisensory feedback provided through a floor surface in response to users' steps.
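
The abstract leaves the sensing and rendering pipeline unspecified, but the basic control loop it describes (detect a footstep through the floor, then render material-appropriate feedback) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the per-tile force sensor, the heel-strike threshold, and the decaying-noise "crunch" model are all hypothetical placeholders.

    import numpy as np

    SAMPLE_RATE = 44100        # audio sample rate in Hz (assumed)
    FORCE_THRESHOLD = 20.0     # Newtons; hypothetical heel-strike threshold

    def detect_steps(force, threshold=FORCE_THRESHOLD):
        """Return sample indices where the force signal rises past the
        threshold, i.e. candidate heel strikes. `force` is a 1-D array of
        readings from a (hypothetical) force sensor under one floor tile."""
        above = force >= threshold
        # A rising edge is a sample above the threshold whose predecessor
        # was below it.
        return np.flatnonzero(~above[:-1] & above[1:]) + 1

    def crunch_burst(peak_force, duration=0.15, rate=SAMPLE_RATE):
        """Synthesize a short, exponentially decaying noise burst whose
        amplitude scales with peak force -- a crude stand-in for the crunch
        of snow. This is an illustrative placeholder, not the authors'
        synthesis method."""
        n = int(duration * rate)
        envelope = np.exp(-np.linspace(0.0, 8.0, n))   # fast decay
        noise = np.random.uniform(-1.0, 1.0, n)
        gain = min(peak_force / 100.0, 1.0)            # normalize and clamp
        return gain * envelope * noise

    if __name__ == "__main__":
        # Simulate one footstep: force ramps past the threshold, then releases.
        t = np.linspace(0.0, 1.0, 1000)
        force = 60.0 * np.clip(np.sin(np.pi * t), 0.0, None)
        for i in detect_steps(force):
            burst = crunch_burst(peak_force=force.max())
            print(f"step at sample {i}: rendered {burst.size}-sample burst")

In a system like the one demonstrated, the amplitude-scaled noise would presumably be replaced by a synthesis model matched to each ground material, and the same rendered signal could drive vibrotactile actuators beneath each floor tile as well as loudspeakers.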