Detail to attention: exploiting visual tasks for selective rendering
EGRW '03 Proceedings of the 14th Eurographics workshop on Rendering
Physically large displays improve path integration in 3D virtual navigation tasks
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Selective rendering using task-importance maps
APGV '04 Proceedings of the 1st Symposium on Applied perception in graphics and visualization
Towards Lean and Elegant Self-Motion Simulation in Virtual Reality
VR '05 Proceedings of the 2005 IEEE Conference on Virtual Reality
Physically large displays improve performance on spatial tasks
ACM Transactions on Computer-Human Interaction (TOCHI)
Cognitive factors can influence self-motion perception (vection) in virtual reality
ACM Transactions on Applied Perception (TAP)
Progressive perceptual audio rendering of complex scenes
Proceedings of the 2007 symposium on Interactive 3D graphics and games
The Experience of Presence: Factor Analytic Insights
Presence: Teleoperators and Virtual Environments
Sound representing self-motion in virtual environments enhances linear vection
Presence: Teleoperators and Virtual Environments
Spatialized sound influences biomechanical self-motion illusion ("vection")
Proceedings of the 7th Symposium on Applied Perception in Graphics and Visualization
Spatialized sound enhances biomechanically-induced self-motion illusion (vection)
Proceedings of the SIGCHI Conference on Human Factors in Computing Systems
Bimodal task-facilitation in a virtual traffic scenario through spatialized sound rendering
ACM Transactions on Applied Perception (TAP)
Immersion with robots in large virtual environments
HRI '12 Proceedings of the seventh annual ACM/IEEE international conference on Human-Robot Interaction
Simulator sickness in mobile spatial sound spaces
CMMR/ICAD'09 Proceedings of the 6th international conference on Auditory Display
Parametric time-frequency representation of spatial sound in virtual worlds
ACM Transactions on Applied Perception (TAP)
While rotating visual and auditory stimuli have long been known to elicit self-motion illusions (“circular vection”), audiovisual interactions have hardly been investigated. Here, two experiments investigated whether visually induced circular vection can be enhanced by concurrently rotating auditory cues that match visual landmarks (e.g., a fountain sound). Participants sat behind a curved projection screen displaying rotating panoramic renderings of a market place. Apart from a no-sound condition, headphone-based auditory stimuli consisted of mono sound, ambient sound, or low-/high-spatial resolution auralizations using generic head-related transfer functions (HRTFs). While merely adding nonrotating (mono or ambient) sound showed no effects, moving sound stimuli facilitated both vection and presence in the virtual environment. This spatialization benefit was maximal for a medium (20° × 15°) FOV, reduced for a larger (54° × 45°) FOV and unexpectedly absent for the smallest (10° × 7.5°) FOV. Increasing auralization spatial fidelity (from low, comparable to five-channel home theatre systems, to high, 5° resolution) provided no further benefit, suggesting a ceiling effect. In conclusion, both self-motion perception and presence can benefit from adding moving auditory stimuli. This has important implications both for multimodal cue integration theories and the applied challenge of building affordable yet effective motion simulators.