Pen-to-mime: Pen-based interactive control of a human figure
Computers and Graphics
This paper presents an intuitive pen-based interface for interactively controlling a virtual human figure. Recent commercial pen devices detect not only the pen's position but also its pressure and tilt. We use this information to make a human figure perform various types of motion in response to the user's pen movements: the figure walks, runs, turns, and steps along the trajectory and speed of the pen; it bends, stretches, and tilts in response to the tilt of the pen; and it ducks and jumps in response to pen pressure. With this interface, the user controls a virtual human figure intuitively, as if holding and playing with a virtual puppet. In addition to the interface design, this paper describes a motion generation engine that produces the various motions from the parameters supplied by the pen interface. We take a motion blending approach, constructing a motion blending module from a small set of motion capture clips for each type of motion. Finally, we discuss the effectiveness and limitations of the interface based on preliminary experiments.
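The two stages the abstract describes — mapping pen position, pressure, and tilt to motion parameters, and blending a small set of example clips — can be sketched as follows. This is a minimal illustration, not the paper's actual system: the `PenState` fields, the thresholds, and the function names are all assumptions for the sake of the example.

```python
import math

# Hypothetical pen sample; field names and ranges are illustrative
# assumptions, not the paper's API.
class PenState:
    def __init__(self, x, y, pressure, tilt_x, tilt_y):
        self.x, self.y = x, y                       # tablet position
        self.pressure = pressure                    # normalized 0.0 .. 1.0
        self.tilt_x, self.tilt_y = tilt_x, tilt_y   # tilt from vertical

def motion_parameters(prev, curr, dt):
    """Map raw pen input to control parameters: trajectory/speed drive
    locomotion, tilt drives posture, pressure triggers duck/jump."""
    dx, dy = curr.x - prev.x, curr.y - prev.y
    speed = math.hypot(dx, dy) / dt
    heading = math.atan2(dy, dx) if speed > 0 else 0.0
    # Thresholds below are guesses chosen only to make the sketch concrete.
    if curr.pressure > 0.9:
        action = "duck"                             # strong press: crouch
    elif prev.pressure > 0.9 and curr.pressure < 0.2:
        action = "jump"                             # quick release: jump
    elif speed > 300.0:
        action = "run"
    elif speed > 20.0:
        action = "walk"
    else:
        action = "stand"
    return {
        "action": action,
        "speed": speed,
        "heading": heading,
        "lean_forward": curr.tilt_y,                # bend/stretch with tilt
        "lean_side": curr.tilt_x,                   # sideways tilt
    }

def blend_poses(poses, weights):
    """Motion blending core: a weighted linear combination of joint-angle
    vectors from example clips; weights are assumed to sum to 1."""
    n = len(poses[0])
    return [sum(w * p[i] for p, w in zip(poses, weights)) for i in range(n)]
```

In a full system the blend weights would be derived per frame from the parameters above (e.g. speed selecting between walk and run examples), with quaternion interpolation rather than this linear joint-angle blend.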