In this paper, a novel motion editing tool, called the state feedback dynamic model, is proposed and demonstrated for editing pre-existing motion capture data. The model is based on a linear time-invariant (LTI) system. Compared with previous work, the animator need only modify a few keyframes manually; the remaining frames are adjusted automatically while preserving as much of the original quality as possible, yielding a global modification of the motion sequence. More importantly, the LTI model derives an explicit mapping between the high-dimensional motion capture data and low-dimensional hidden state variables: it transforms a number of possibly correlated joint-angle variables into a smaller number of uncorrelated state variables. The motion sequence is then edited in state space, which accounts for the correlation among joints, in contrast to traditional methods that treat each joint as independent. Finally, an effective algorithm is developed to estimate the model parameters. Experimental results show that animations generated with this method are natural and smooth.
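The idea of mapping high-dimensional joint angles to low-dimensional hidden states through an LTI model can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: the matrix sizes, the synthetic model parameters `A` and `C`, and the least-squares state recovery are all assumptions chosen for the sake of a runnable example; in the paper the model parameters are learned from the motion data.

```python
import numpy as np

# Hypothetical dimensions: d observed joint-angle variables,
# k hidden state variables (k << d), T frames.
d, k, T = 60, 8, 120

rng = np.random.default_rng(0)

# Assumed LTI model (in practice these parameters are learned):
#   x_{t+1} = A x_t          (hidden state dynamics)
#   y_t     = C x_t          (state -> joint angles)
A = 0.95 * np.linalg.qr(rng.standard_normal((k, k)))[0]  # stable dynamics
C = rng.standard_normal((d, k))                          # observation matrix

def synthesize(x0, T):
    """Roll the LTI model forward to produce a motion sequence (T x d)."""
    x, frames = x0, []
    for _ in range(T):
        frames.append(C @ x)
        x = A @ x
    return np.array(frames)

x0 = rng.standard_normal(k)
original = synthesize(x0, T)

# The animator edits one keyframe in joint-angle space; we recover the
# corresponding low-dimensional state by least squares, then re-synthesize,
# so the change propagates globally through the whole sequence.
edited_pose = original[0] + 0.1
x0_edited, *_ = np.linalg.lstsq(C, edited_pose, rcond=None)
edited = synthesize(x0_edited, T)
```

Editing in the k-dimensional state space rather than per joint is what lets a single keyframe tweak influence all frames coherently, since every joint angle is reconstructed from the same shared state trajectory.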