We present a new method for identifying a set of movement types from unlabelled human motion data. A typical approach first segments the input motion into a series of intervals and then clusters those intervals into groups; unfortunately, the interdependence between segmentation and clustering forces the user into tedious alternating tuning of the parameters for each stage. Instead, we unify the two tasks in a single optimization framework that searches for the segmentation maximizing the quality of the resulting clustering. A genetic algorithm, with our own genetic representation and fitness function, is employed to solve this combinatorial problem. As the primary benefit, the user obtains a repertoire of major movements simply by selecting the number of classes to be identified. We demonstrate the usefulness of our approach by providing visual descriptions of motion data and an intuitive animation authoring interface based on movement collections.
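The idea of searching over segmentations with a genetic algorithm, scoring each candidate by the quality of the clustering it induces, can be illustrated with a minimal sketch. Everything here is an assumption for illustration, not the paper's actual representation or fitness function: the "motion" is a synthetic 1-D signal, the genome is a list of interior segment boundaries, each segment is summarized by its mean, and fitness is the negative within-class variance after a tiny deterministic 1-D k-means over segment means.

```python
import random

random.seed(0)

# Hypothetical 1-D "motion" signal: two movement types alternating.
signal = [0.0] * 20 + [5.0] * 20 + [0.0] * 20 + [5.0] * 20

N_SEG = 4        # segments encoded by the genome (N_SEG - 1 interior boundaries)
N_CLASS = 2      # number of movement classes requested by the user

def segments(bounds):
    """Split the signal at the interior boundaries encoded in the genome."""
    cuts = [0] + sorted(bounds) + [len(signal)]
    return [signal[a:b] for a, b in zip(cuts, cuts[1:])]

def fitness(bounds):
    """Clustering quality of a segmentation: negative within-class variance
    after assigning each segment's mean to the nearest of N_CLASS centroids
    (deterministic 1-D k-means, centroids initialized at min/max)."""
    segs = segments(bounds)
    if any(len(s) == 0 for s in segs):
        return float('-inf')          # degenerate genome: duplicate boundary
    means = [sum(s) / len(s) for s in segs]
    cents = [min(means), max(means)]
    for _ in range(10):               # a few Lloyd iterations
        groups = [[] for _ in cents]
        for m in means:
            groups[min(range(N_CLASS), key=lambda i: (m - cents[i]) ** 2)].append(m)
        cents = [sum(g) / len(g) if g else c for g, c in zip(groups, cents)]
    return -sum((m - c) ** 2 for g, c in zip(groups, cents) for m in g)

def mutate(bounds):
    """Shift one boundary by a small random offset, kept inside the signal."""
    i = random.randrange(len(bounds))
    out = list(bounds)
    out[i] = min(len(signal) - 1, max(1, out[i] + random.choice([-3, -2, -1, 1, 2, 3])))
    return out

# Genetic search: elitism plus mutation of the survivors.
pop = [sorted(random.sample(range(1, len(signal)), N_SEG - 1)) for _ in range(30)]
for gen in range(200):
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(20)]
best = max(pop, key=fitness)
print(sorted(best))   # typically converges near the true change points 20, 40, 60
```

Note the key property the sketch shares with the unified formulation: no separate segmentation parameters are tuned; boundary placement is driven entirely by the clustering objective, and the user supplies only the number of classes.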