We present a novel real-time motion puppetry system that drives the motion of non-human characters using human motion input. We aim to control a variety of creatures whose body structures and motion patterns differ greatly from a human's. A combination of direct feature mapping and motion coupling enables the generation of natural creature motion, along with intuitive and expressive control for puppetry. First, in the design phase, direct feature mappings and motion classifications are computed efficiently and intuitively from crude motion mimicking provided as input. Later, during the puppetry phase, the user's body motions control the target character in real time through the combination of feature mappings generated in the design phase. We demonstrate the effectiveness of our approach with several examples of natural puppetry, in which a variety of non-human creatures are controlled in real time using human motion input from a commodity motion-sensing device.
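The design-phase idea of fitting a direct feature mapping from paired mimicking data can be sketched as a least-squares problem: record human and creature pose features frame by frame while the user mimics the creature, fit an affine map between them, then apply that map to live input during puppetry. The function names and the choice of a plain linear map are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def learn_feature_mapping(human_feats, creature_feats):
    """Design phase (sketch): fit an affine map from human pose features
    to creature pose features via least squares, using paired frames
    recorded while the user crudely mimics the creature's motion.
    human_feats:    (n_frames, d_human) array
    creature_feats: (n_frames, d_creature) array
    Returns an (d_human + 1, d_creature) matrix whose last row is the bias.
    """
    # Append a constant column so the fit includes a bias term.
    H = np.hstack([human_feats, np.ones((human_feats.shape[0], 1))])
    M, *_ = np.linalg.lstsq(H, creature_feats, rcond=None)
    return M

def apply_mapping(M, human_feat):
    """Puppetry phase (sketch): map one live human feature vector
    to creature pose features in real time."""
    return np.append(human_feat, 1.0) @ M
```

In the full system this per-feature mapping would be combined with motion classification and coupling so that different classes of user motion drive different creature behaviors; the snippet only shows the direct-mapping component.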