In this paper, we propose AudioHaptics, a physically based method for synthesizing haptic and auditory sensations. We developed a haptic environment that incorporates auditory sensation by fitting a speaker to the end effector of a haptic interface. The finite element method (FEM) is used to calculate the vibration of a virtual object when an impact occurs, and the sound pressure at the speaker position is then computed in real time from the 2D complex amplitude of the object surface. The AudioHaptics system can generate sounds originating from virtual objects of arbitrary shape, material attributes, and inner structure. Evaluation experiments with real users demonstrated that this method is effective for rendering combined audio and haptic sensations.
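To illustrate the real-time sound-generation stage, the sketch below shows one common way such FEM-derived vibration data is rendered as audio: an offline modal analysis yields per-mode frequencies and decay rates, and the runtime synthesizer sums exponentially damped sinusoids excited by the impact. This is a minimal illustration under those assumptions, not the paper's exact algorithm; the function name and parameters are hypothetical.

```python
import numpy as np

def synthesize_impact_sound(freqs, dampings, gains, duration=1.0, sr=44100):
    """Sum of exponentially damped sinusoids, one per vibration mode.

    freqs    -- modal frequencies in Hz (hypothetically obtained from an
                offline FEM modal analysis of the virtual object)
    dampings -- per-mode exponential decay rates in 1/s
    gains    -- per-mode excitation amplitudes (depend on the impact point)
    """
    t = np.arange(int(duration * sr)) / sr
    sound = np.zeros_like(t)
    for f, d, g in zip(freqs, dampings, gains):
        # each mode rings at its own frequency and decays independently
        sound += g * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    # normalize to [-1, 1] for playback through the end-effector speaker
    peak = np.max(np.abs(sound))
    return sound / peak if peak > 0 else sound
```

In a real-time setting this per-sample loop would typically be replaced by a bank of resonant filters driven by an impulse, which has the same impulse response but constant cost per audio sample.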