Efficient and robust annotation of motion capture data
Proceedings of the 2009 ACM SIGGRAPH/Eurographics Symposium on Computer Animation
This paper presents an automatic turn detection and annotation technique that works on unlabeled captured locomotion. Motion annotation is required by several motion-capture editing techniques, and detecting turns is difficult because of the oscillatory nature of human locomotion. Our contribution is to address this problem by analyzing the trajectory of the center of mass of the human body in a velocity-curvature space. The approach is based on experimental observations of carefully captured human motions, and we demonstrate its efficiency and accuracy.
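To illustrate the core idea of a velocity-curvature analysis of the center-of-mass trajectory, here is a minimal sketch: it estimates speed and signed curvature of a sampled planar trajectory by finite differences and flags samples as turns when curvature is high while the body is actually moving. The function names and thresholds are illustrative assumptions, not the paper's actual detector.

```python
import numpy as np

def velocity_curvature(traj, dt):
    """Estimate speed and signed curvature along a sampled 2D trajectory.

    traj: (N, 2) array of center-of-mass positions on the ground plane.
    dt: sampling interval in seconds.
    """
    vx, vy = np.gradient(traj[:, 0], dt), np.gradient(traj[:, 1], dt)
    ax, ay = np.gradient(vx, dt), np.gradient(vy, dt)
    speed = np.hypot(vx, vy)
    # Signed curvature of a planar curve: (x'y'' - y'x'') / (x'^2 + y'^2)^(3/2)
    curvature = (vx * ay - vy * ax) / np.maximum(speed, 1e-8) ** 3
    return speed, curvature

def annotate_turns(traj, dt, curv_thresh=0.5, min_speed=0.2):
    """Label each sample as a turn when curvature is high and speed is nonzero.

    The thresholds here are placeholder values, not taken from the paper;
    requiring a minimum speed suppresses spurious curvature spikes that occur
    when the center of mass oscillates nearly in place.
    """
    speed, curvature = velocity_curvature(traj, dt)
    return (np.abs(curvature) > curv_thresh) & (speed > min_speed)
```

For example, a circular-arc trajectory of radius 1 m walked at 1 m/s yields curvature near 1 everywhere and is labeled entirely as a turn, while a straight-line trajectory is not.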