Non-Photorealistic Rendering
The lit sphere: a model for capturing NPR shading from art
GRIN '01 Proceedings of Graphics Interface 2001
Using Texture Synthesis for Non-Photorealistic Shading from Paint Samples
PG '03 Proceedings of the 11th Pacific Conference on Computer Graphics and Applications
X-toon: an extended toon shader
Proceedings of the 4th international symposium on Non-photorealistic animation and rendering
Tweakable light and shade for cartoon animation
Proceedings of the 4th international symposium on Non-photorealistic animation and rendering
Illustrative rendering in Team Fortress 2
Proceedings of the 5th international symposium on Non-photorealistic animation and rendering
Locally controllable stylized shading
ACM SIGGRAPH 2007 papers
Stylized Rendering Using Samples of a Painted Image
IEEE Transactions on Visualization and Computer Graphics
Interactive on-surface signal deformation
ACM SIGGRAPH 2010 papers
Dynamic stylized shading primitives
Proceedings of the ACM SIGGRAPH/Eurographics Symposium on Non-Photorealistic Animation and Rendering
Sketch and paint-based interface for highlight modeling
SBM'08 Proceedings of the Fifth Eurographics conference on Sketch-Based Interfaces and Modeling
The Lit-Sphere model proposed by Sloan et al. (Proceedings of Graphics Interface 2001, pp. 143–150) emulates expressive artistic shading styles for 3D scenes. By assuming that an artistic shading style can be described as a function of view-space normals, the model produces a variety of stylized shading results beyond traditional 3D lighting control. However, it is limited to static lighting: the shading depends only on the camera view. In addition, it cannot support small-scale brush-stroke styles. In this paper, we propose a scheme that extends the Lit-Sphere model to light-space normals rather than view-space normals. Owing to the light-space representation, our shading model addresses these limitations of the original Lit-Sphere approach and allows artists to use a light source to obtain dynamic diffuse and specular shading. The shading appearance can then be refined with stylization effects, including highlight shape control, sub-lighting effects, and brush-stroke styles. Our algorithms are easy to implement on the GPU, so our system supports interactive shading design.
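To make the core idea concrete, the sketch below shows the standard Lit-Sphere texture lookup: a unit surface normal is expressed in some shading space and its x/y components index a 2D sphere-map texture. Passing no basis reproduces the original view-space lookup; passing a basis built from the light direction illustrates the light-space variant the abstract describes. This is a minimal illustration with assumed function names and an assumed frame construction, not the paper's exact formulation.

```python
import math

def _normalize(v):
    """Return v scaled to unit length."""
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def lit_sphere_uv(normal, basis=None):
    """Map a surface normal to [0,1]^2 lit-sphere texture coordinates.

    basis: three orthonormal axis vectors (rows) defining the shading
    space. None means the view-space lookup of the original Lit-Sphere
    model; a light-space basis gives the dynamic-lighting variant.
    """
    if basis is not None:
        # Rotate the normal into the chosen shading space.
        normal = [sum(a * b for a, b in zip(row, normal)) for row in basis]
    n = _normalize(normal)
    # Orthographic projection of the front hemisphere onto the texture.
    return 0.5 * n[0] + 0.5, 0.5 * n[1] + 0.5

def light_space_basis(light_dir):
    """Build an orthonormal frame whose z-axis points toward the light
    (an assumed construction; the paper's exact frame may differ)."""
    z = _normalize(light_dir)
    up = [0.0, 1.0, 0.0]
    if abs(z[1]) > 0.99:          # light nearly parallel to up vector
        up = [1.0, 0.0, 0.0]
    x = _normalize([up[1] * z[2] - up[2] * z[1],
                    up[2] * z[0] - up[0] * z[2],
                    up[0] * z[1] - up[1] * z[0]])
    y = [z[1] * x[2] - z[2] * x[1],
         z[2] * x[0] - z[0] * x[2],
         z[0] * x[1] - z[1] * x[0]]
    return [x, y, z]
```

A normal facing the camera (or the light, in the light-space variant) lands at the texture center (0.5, 0.5), where an artist typically paints the brightest shading; moving the light rotates the basis and therefore slides every lookup across the painted sphere, which is what makes the shading dynamic.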