Over the last few years, a growing number of IT devices have incorporated touch-screen technology in order to create more effective multimodal user interfaces. Such technology opens up the possibility of presenting different kinds of tactile feedback (i.e., active vs. passive) to users. Here, we report two experiments designed to investigate the spatiotemporal constraints on the multisensory interaction between vision and touch as they relate to a user's active vs. passive interaction with a touch-screen device. Our results demonstrate that when touch is active, tactile perception is less influenced by irrelevant visual stimulation than when the screen is touched passively. They also show that vision has to lead touch by approximately 40 ms for optimal simultaneity to be perceived, regardless of whether touch is active or passive. These findings provide constraints for the future design of enhanced multimodal interfaces.
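As a rough illustration of how the reported ~40 ms constraint might be applied in an interface, the sketch below computes when to trigger visual feedback relative to a planned tactile pulse. This is a hypothetical example only (the names and function are not from the paper, which reports perceptual findings rather than an implementation):

```python
# Hypothetical sketch: schedule visual feedback to lead tactile feedback
# by ~40 ms, per the finding that vision must lead touch by roughly
# 40 ms for optimal perceived simultaneity (active or passive touch).

VISION_LEAD_MS = 40  # approximate lead time reported in the abstract

def schedule_feedback(tactile_onset_ms):
    """Given a planned tactile pulse time (ms), return the pair
    (visual_onset_ms, tactile_onset_ms) with vision leading by ~40 ms."""
    visual_onset_ms = tactile_onset_ms - VISION_LEAD_MS
    return visual_onset_ms, tactile_onset_ms

# Example: a tactile pulse planned at t = 1000 ms gets its visual
# counterpart scheduled at t = 960 ms.
visual_t, tactile_t = schedule_feedback(1000)
print(visual_t, tactile_t)  # 960 1000
```

In a real system the scheduler would also need to account for display and actuator latencies, which typically differ; the constant here stands in only for the perceptual offset itself.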