Using an event-based approach to improve the multimodal rendering of 6DOF virtual contact. Proceedings of the 2007 ACM Symposium on Virtual Reality Software and Technology.
Multimodal Design for Enactive Toys. Computer Music Modeling and Retrieval: Sense of Sounds.
Gesture Recognition with Hidden Markov Models to Enable Multi-modal Haptic Feedback. EuroHaptics '08: Proceedings of the 6th International Conference on Haptics: Perception, Devices and Scenarios.
Preliminary experiment combining virtual reality haptic shoes and audio synthesis. EuroHaptics '10: Proceedings of the 2010 International Conference on Haptics: Generating and Perceiving Tangible Sensations, Part II.
Influence of auditory and haptic feedback on a balancing task. International Journal of Autonomous and Adaptive Communications Systems.
This paper presents a multimodal rendering architecture that integrates physically based sound models with haptic and visual rendering. The proposed sound modeling approach is compared with other existing techniques. An example implementation of the architecture is presented, which realizes bimodal (auditory and haptic) rendering of contact stiffness. The proposed rendering scheme is shown to allow tight synchronization of the two modalities, as well as a high degree of interactivity and responsiveness of the sound models to a user's gestures and actions. Finally, an experiment on the relative contributions of haptic and auditory information to bimodal judgments of contact stiffness is presented. The experimental results support the effectiveness of auditory feedback in modulating the haptic perception of stiffness. Copyright © 2006 John Wiley & Sons, Ltd.
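To make the idea of physically based, stiffness-responsive contact sound concrete, the following is a minimal sketch in the spirit of the architecture described above (not the authors' actual implementation): a hammer mass strikes a surface modeled as a small bank of damped modal oscillators, and the contact force follows a nonlinear penalty law whose stiffness parameter shapes the resulting transient. All masses, frequencies, decays, and stiffness values are illustrative assumptions.

```python
# Hedged sketch of impact-sound synthesis via modal resonators excited by a
# penalty-based contact force. Parameter values are assumptions for
# illustration only, not taken from the paper.
import math

def impact_sound(stiffness, sr=44100, dur=0.05):
    """Synthesize one impact on a modal surface; returns a list of samples."""
    dt = 1.0 / sr
    # Hammer state: height above the surface (m) and velocity (m/s).
    x, v = 0.001, -1.0
    mass = 0.01  # kg, assumed hammer mass
    # Modal resonators as (frequency Hz, decay 1/s, gain) triples (assumed).
    modes = [(800.0, 40.0, 1.0), (1700.0, 60.0, 0.5), (3100.0, 90.0, 0.25)]
    states = [(0.0, 0.0) for _ in modes]  # (displacement, velocity) per mode
    out = []
    for _ in range(int(sr * dur)):
        # Penalty contact: force acts only while the hammer penetrates (x < 0),
        # following a Hertzian-style power law in the penetration depth.
        f = stiffness * (-x) ** 1.5 if x < 0 else 0.0
        # Integrate hammer motion (semi-implicit Euler).
        v += (f / mass) * dt
        x += v * dt
        # Drive each modal oscillator with the contact force and sum outputs.
        sample = 0.0
        for i, (freq, decay, gain) in enumerate(modes):
            y, yv = states[i]
            w = 2.0 * math.pi * freq
            yv += (f * gain - 2.0 * decay * yv - w * w * y) * dt
            y += yv * dt
            states[i] = (y, yv)
            sample += y
        out.append(sample)
    return out

# A stiffer surface yields a shorter contact and a sharper transient, which is
# the kind of cue the bimodal stiffness experiment manipulates:
soft = impact_sound(stiffness=1e6)
hard = impact_sound(stiffness=1e9)
```

Because the same contact force can simultaneously drive the haptic device and the sound model, this style of synthesis keeps the two modalities tightly synchronized by construction, which is the property the abstract emphasizes.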