The quality of realism in virtual environments (VEs) is typically considered to be a function of visual and audio fidelity mutually exclusive of each other. However, the VE participant, being human, is multimodal by nature. Therefore, in order to validate more accurately the levels of auditory and visual fidelity that are required in a virtual environment, a better understanding is needed of the intersensory, or cross-modal, effects between the auditory and visual sense modalities. To identify whether any pertinent auditory-visual cross-modal perception phenomena exist, 108 subjects participated in three experiments that were completely automated using the HTML, Java, and JavaScript programming languages. Visual and auditory display quality perceptions were measured intra- and intermodally by manipulating the pixel resolution of the visual display and the Gaussian white noise level, and by manipulating the sampling frequency of the auditory display and the Gaussian white noise level. Statistically significant results indicate that high-quality auditory displays coupled with high-quality visual displays increase the perceived quality of the visual displays relative to evaluation of the visual display alone, and that low-quality auditory displays coupled with high-quality visual displays decrease the perceived quality of the auditory displays relative to evaluation of the auditory display alone. These findings strongly suggest that the quality of realism in VEs must be a function of both auditory and visual display fidelities, inclusive of each other.
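As a rough illustration of the kind of stimulus degradation the abstract describes, the sketch below (hypothetical Java, not the authors' actual experiment code; class and method names are invented) mixes zero-mean Gaussian white noise into an audio sample buffer at a chosen amplitude, and crudely reduces the effective sampling frequency by an integer factor using sample-and-hold:

```java
import java.util.Random;

/**
 * Illustrative sketch of two audio-degradation manipulations similar to
 * those described in the abstract: Gaussian white noise injection and a
 * crude sampling-frequency reduction. Names are assumptions, not the
 * authors' code.
 */
public class AudioDegrader {

    /** Returns a copy of the samples with zero-mean Gaussian noise added
     *  at the given amplitude; a fixed seed keeps stimuli reproducible. */
    public static double[] addGaussianNoise(double[] samples, double noiseLevel, long seed) {
        Random rng = new Random(seed);
        double[] degraded = new double[samples.length];
        for (int i = 0; i < samples.length; i++) {
            degraded[i] = samples[i] + noiseLevel * rng.nextGaussian();
        }
        return degraded;
    }

    /** Simulates a lower sampling frequency by holding every factor-th
     *  sample (sample-and-hold decimation without reconstruction filtering). */
    public static double[] downsampleHold(double[] samples, int factor) {
        double[] out = new double[samples.length];
        for (int i = 0; i < samples.length; i++) {
            out[i] = samples[(i / factor) * factor];
        }
        return out;
    }
}
```

With `noiseLevel` set to zero the signal passes through unchanged, so the same code path can generate both the reference and the degraded stimuli in an automated trial.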