The synthesis of dynamic flowing water has high practical value in virtual reality, computer games, digital movies, and scientific computing. On the one hand, physically based models make it difficult to produce photorealistic, easily edited flowing scenes; on the other hand, sample-based digital techniques can reproduce real-world flowing scenes with little effort. This paper presents a novel algorithm for synthesizing dynamic water scenes from a sample video. To obtain video textons, we analyze the sample video automatically using the dynamic textures model, and we use a linear dynamic system (LDS) to represent the characteristics of each texton. By further imposing hard constraints, we synthesize a new, prolonged video of dynamic water flow that remains visually sharp. We provide test examples that demonstrate the effectiveness and efficiency of the proposed method.
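The core pipeline described above—fitting an LDS to frames of a sample video and rolling it forward to produce a prolonged sequence—can be illustrated with the standard SVD-based dynamic-texture identification procedure. This is a minimal sketch of that common approximation, not the authors' exact algorithm; the function names and parameters are illustrative assumptions.

```python
import numpy as np

def fit_dynamic_texture(Y, n_states):
    """Fit a linear dynamic system (LDS) to a frame sequence.

    Y        : (pixels, T) matrix, one flattened frame per column.
    n_states : dimension of the hidden state.
    Returns (A, C, X): transition matrix, observation matrix, and
    hidden states, following the usual SVD-based identification
    (an assumption here, not necessarily the paper's procedure).
    """
    U, s, Vt = np.linalg.svd(Y, full_matrices=False)
    C = U[:, :n_states]                        # observation matrix
    X = np.diag(s[:n_states]) @ Vt[:n_states]  # hidden states, (n, T)
    # Least-squares estimate of A from X[:, 1:] ≈ A @ X[:, :-1]
    A = X[:, 1:] @ np.linalg.pinv(X[:, :-1])
    return A, C, X

def synthesize(A, C, x0, T, noise_std=0.0, rng=None):
    """Roll the LDS forward to generate a prolonged frame sequence."""
    rng = rng if rng is not None else np.random.default_rng(0)
    x, frames = x0, []
    for _ in range(T):
        frames.append(C @ x)
        x = A @ x + noise_std * rng.standard_normal(x.shape)
    return np.stack(frames, axis=1)  # (pixels, T)
```

Setting `noise_std` to zero gives a deterministic extrapolation; a small nonzero value injects the driving noise that keeps the synthesized flow from looking repetitive.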