Communicating graphical information to blind users using music: the role of context. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems.
Implementation and evaluation of "just follow me": an immersive, VR-based, motion-training system. Presence: Teleoperators and Virtual Environments.
Effects of Navigation and Position on Task When Presenting Diagrams. In DIAGRAMS '02: Proceedings of the Second International Conference on Diagrammatic Representation and Inference.
A Robotic Teacher of Chinese Handwriting. In HAPTICS '02: Proceedings of the 10th Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems.
Multisensory Perception: Beyond the Visual in Visualization. Computing in Science and Engineering.
Real-Time Finite Element Modeling for Surgery Simulation: An Application to Virtual Suturing. IEEE Transactions on Visualization and Computer Graphics.
The virtual haptic back for palpatory training. In Proceedings of the 6th International Conference on Multimodal Interfaces.
Annotating 3D electronic books. In CHI '05 Extended Abstracts on Human Factors in Computing Systems.
SIGGRAPH '05: ACM SIGGRAPH 2005 Emerging Technologies.
Tactical audio and acoustic rendering in biomedical applications. IEEE Transactions on Information Technology in Biomedicine.
This paper describes a novel guidance method for force-exertion tasks performed with a single-contact-point haptic interface, using visual and auditory cues in a virtual environment. To teach accurate force exertion, the proposed method displays a pre-recorded example force-magnitude curve that the user tries to follow. The orientation of manipulation can be presented either as a visual cue or as an auditory cue, obtained by mapping the 3D environment onto a 2D plane so that the guidance is given from the user's perspective. The method was evaluated on an aortic palpation task and was shown to guide users to exert force more accurately than the traditional position-tracking approach.
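The abstract mentions mapping the 3D environment onto a 2D plane in the user's perspective so that direction can be rendered as an auditory cue. The paper does not give the mapping, but the idea can be sketched as follows: project the desired manipulation direction onto the user's view plane, then map the horizontal component to stereo pan and the vertical component to a pitch offset. The frame vectors, panning range, and 12-semitone pitch span below are illustrative assumptions, not details from the paper.

```python
import math

def project_to_view_plane(direction, forward, up):
    """Project a 3D direction vector onto the user's 2D view plane.

    `forward` and `up` define the user's viewpoint (assumed orthonormal);
    the component of `direction` along `forward` is discarded, leaving
    (right, up) coordinates in the plane facing the user.
    """
    # right = forward x up (right-handed frame)
    right = (
        forward[1] * up[2] - forward[2] * up[1],
        forward[2] * up[0] - forward[0] * up[2],
        forward[0] * up[1] - forward[1] * up[0],
    )
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return dot(direction, right), dot(direction, up)

def audio_cue(plane_xy):
    """Map 2D view-plane coordinates to a stereo pan and a pitch offset.

    Horizontal offset drives left/right panning in [-1, 1]; vertical
    offset drives a pitch shift in semitones. Both ranges are
    illustrative choices, not taken from the paper.
    """
    x, y = plane_xy
    norm = math.hypot(x, y) or 1.0  # avoid division by zero
    pan = max(-1.0, min(1.0, x / norm))
    semitones = 12.0 * (y / norm)
    return pan, semitones
```

For example, a direction pointing to the user's right (with the viewer looking down -z and +y up) projects to (1, 0) on the plane and yields a fully right-panned cue with no pitch offset.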