We present a novel patch-based algorithm for synthesizing a moving 2D texture, i.e., a sequence of frame-coherent 2D textures. The inputs to our method are a sample texture and a 2D flow field. We first synthesize a 2D directional texture according to the direction information of the flow field, and then let the texture move along the flow. Iteratively, the texture T_{i+1} of the (i+1)-th frame is first obtained by moving the texture T_i forward in a piecewise manner. Hole-filling and blending are then applied to make T_{i+1} coherent with T_i. In addition, to maintain good visual quality throughout the sequence of textures, best-matching patches from the sample texture are pasted at selected locations of T_{i+1} to prevent the cumulative blurring caused by blending. Our test examples show that the method generates high-quality moving textures with attractive visual effects that are useful for flow visualization.
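The per-frame loop described in the abstract (advect the current texture along the flow, fill the holes the warp leaves behind, and blend for coherence) can be sketched as follows. This is a hedged toy illustration, not the paper's implementation: the paper moves texture forward piecewise in patches and fills holes with best-matching sample patches, whereas this sketch forward-warps per pixel and fills holes with random sample pixels; the function name `advance_frame` and the blend weight are assumptions for illustration.

```python
import numpy as np

def advance_frame(T_i, flow, sample, alpha=0.8, rng=None):
    """One simplified iteration of the moving-texture loop:
    1. forward-warp T_i along the flow field to get a draft T_{i+1},
    2. fill pixels left uncovered by the warp from the sample texture
       (the paper uses best-matching patches; random pixels here),
    3. blend with T_i to keep the frame sequence coherent.
    T_i: (h, w) float array; flow: (h, w, 2) array of (dx, dy) vectors;
    sample: (hs, ws) float array.
    """
    rng = rng or np.random.default_rng(0)
    h, w = T_i.shape
    T_next = np.full_like(T_i, np.nan)  # NaN marks "not yet covered"

    # Forward-warp: move each pixel by its (rounded) flow vector,
    # clipping targets to the image bounds.
    ys, xs = np.mgrid[0:h, 0:w]
    ty = np.clip(ys + np.round(flow[..., 1]).astype(int), 0, h - 1)
    tx = np.clip(xs + np.round(flow[..., 0]).astype(int), 0, w - 1)
    T_next[ty, tx] = T_i[ys, xs]

    # Hole filling: pixels no source pixel landed on are taken from
    # the sample texture.
    hy, hx = np.where(np.isnan(T_next))
    for y, x in zip(hy, hx):
        sy = rng.integers(0, sample.shape[0])
        sx = rng.integers(0, sample.shape[1])
        T_next[y, x] = sample[sy, sx]

    # Blend with the previous frame for temporal coherence.  The paper
    # additionally pastes best-matching sample patches at selected
    # locations to counter the cumulative blurring this step causes;
    # that refinement is omitted here.
    return alpha * T_next + (1 - alpha) * T_i
```

With a zero flow field the warp is the identity, no holes appear, and the blend returns the input frame unchanged; a uniform rightward flow uncovers the left column, which is then filled from the sample.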