The developers and users of real-time graphics applications, such as games and virtual reality, are demanding ever more realistic computer-generated images. Despite the availability of modern graphics hardware, such real-time high-fidelity rendering is still not feasible on a single PC. Research on visual perception has shown that the perceived quality of rendered graphics depends not only on the fidelity of the generated imagery but also on the characteristics of visual attention and the limitations of the human visual system. The findings of this research have been used to define perceptually driven criteria for rendering, with the aim of reducing rendering times. Furthermore, in reality there are strong cross-modal interactions between auditory and visual stimuli, and a number of studies have shown that stimuli reaching the various senses are not, in general, processed independently. In this paper we investigate whether auditory stimuli, and more specifically sound effects with abrupt onsets, affect a viewer's perceived quality of rendered images while watching computer-generated animations. We show how the rendering of animations can potentially be accelerated by directing the viewer's attention towards the source of a sound and selectively rendering only the sound-emitting object at high quality. For this purpose, a renderer was implemented which renders the sound-emitting objects and the surrounding pixels at high quality while the rest of the scene is rendered at a significantly lower quality. A psychophysical experiment with 120 participants revealed a significant effect of sound effects on perceived rendering quality. Our results show that audio stimuli, and in particular sound effects, can be exploited when rendering animations to significantly reduce rendering time without any loss in the user's perception of the delivered quality.
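The selective rendering strategy described above can be viewed as a per-pixel quality map: pixels within a region around the projected screen position of the sound-emitting object receive a high sample count, while the rest of the frame receives a low one. The sketch below is a minimal illustration of this idea, not the paper's actual implementation; the sample counts, the circular region, and all function and parameter names are assumptions for the example.

```python
import math

def selective_sample_map(width, height, src_x, src_y, radius,
                         high_spp=64, low_spp=4):
    """Build a per-pixel samples-per-pixel (spp) map.

    Pixels within `radius` of the sound source's projected screen
    position (src_x, src_y) get `high_spp`; all others get `low_spp`.
    The specific values are hypothetical, chosen only to illustrate
    the high/low quality split described in the abstract.
    """
    spp = [[low_spp] * width for _ in range(height)]
    for y in range(height):
        for x in range(width):
            if math.hypot(x - src_x, y - src_y) <= radius:
                spp[y][x] = high_spp
    return spp

def cost_fraction(spp, high_spp=64):
    """Fraction of rendering cost relative to rendering every pixel
    at high quality (assuming cost scales linearly with spp)."""
    total = sum(v for row in spp for v in row)
    full = high_spp * len(spp) * len(spp[0])
    return total / full
```

For example, on an 8x8 frame with the source at (4, 4) and a radius of 2 pixels, only the 13 pixels inside the disc are rendered at 64 spp; `cost_fraction` then gives the approximate share of the full-quality rendering cost, illustrating where the time savings come from.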