Spatiotemporal Visuotactile Interaction

  • Authors:
  • Ju-Hwan Lee;Charles Spence

  • Affiliations:
  • Crossmodal Research Laboratory, University of Oxford, UK OX1 3UD (both authors)

  • Venue:
  • EuroHaptics '08: Proceedings of the 6th International Conference on Haptics: Perception, Devices and Scenarios
  • Year:
  • 2008

Abstract

Over the last few years, a growing number of IT devices have incorporated touch-screen technology in order to create more effective multimodal user interfaces. The use of such technology has opened up the possibility of presenting different kinds of tactile feedback (i.e., active vs. passive) to users. Here, we report two experiments designed to investigate the spatiotemporal constraints on the multisensory interaction between vision and touch as they relate to a user's active vs. passive interaction with a touch-screen device. Our results demonstrate that when touch is active, tactile perception is less influenced by irrelevant visual stimulation than when the screen is touched passively. Our results also show that vision has to lead touch by approximately 40 ms in order for optimal simultaneity to be perceived, regardless of whether touch is active or passive. These findings provide constraints for the future design of enhanced multimodal interfaces.
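
The ~40 ms visual lead reported in the abstract translates into a simple scheduling rule for interface designers. The following is a minimal sketch of how a touch-screen application might offset its visual and tactile feedback accordingly; the `present_feedback` helper, the `show_visual` / `fire_tactile` callbacks, and the asyncio-based timing are all illustrative assumptions, not anything described in the paper.

```python
import asyncio

# Illustrative constant based on the paper's finding that vision should
# lead touch by roughly 40 ms for the two events to feel simultaneous.
VISUAL_LEAD_MS = 40


async def _fire_after(delay_s: float, callback) -> None:
    """Run a callback after the given delay (seconds)."""
    await asyncio.sleep(delay_s)
    callback()


async def present_feedback(show_visual, fire_tactile, onset_s: float) -> None:
    """Schedule visual and tactile feedback so they are *perceived* together.

    The visual callback fires VISUAL_LEAD_MS before the tactile one,
    relative to the requested tactile onset time (seconds from now).
    """
    visual_delay = max(0.0, onset_s - VISUAL_LEAD_MS / 1000.0)
    await asyncio.gather(
        _fire_after(visual_delay, show_visual),
        _fire_after(onset_s, fire_tactile),
    )


if __name__ == "__main__":
    # Stand-in callbacks for a real display flash and actuator pulse.
    asyncio.run(
        present_feedback(
            show_visual=lambda: print("visual flash"),
            fire_tactile=lambda: print("tactile pulse"),
            onset_s=0.1,
        )
    )
```

In a real interface the callbacks would drive the display and the haptic actuator, and the lead time would likely be tuned per device, since rendering and actuator latencies add their own offsets on top of the perceptual asynchrony reported here.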