Individualized reaction movements for virtual humans
Proceedings of the 4th international conference on Computer graphics and interactive techniques in Australasia and Southeast Asia
Motion-capture-based avatar control framework in third-person view virtual environments
Proceedings of the 2006 ACM SIGCHI international conference on Advances in computer entertainment technology
Interactive dynamic response for games
Proceedings of the 2007 ACM SIGGRAPH symposium on Video games
Simulating competitive interactions using singly captured motions
Proceedings of the 2007 ACM symposium on Virtual reality software and technology
Psychologically Inspired Anticipation and Dynamic Response for Impacts to the Head and Upper Body
IEEE Transactions on Visualization and Computer Graphics
Simulation of individual spontaneous reactive behavior
Proceedings of the 7th international joint conference on Autonomous agents and multiagent systems - Volume 1
Animating responsive characters with dynamic constraints in near-unactuated coordinates
ACM SIGGRAPH Asia 2008 papers
ACM SIGGRAPH 2009 papers
Perception based real-time dynamic adaptation of human motions
MIG'10 Proceedings of the Third international conference on Motion in games
A mobile environment for sketching-based skeleton generation
World Wide Web
Push it real: perceiving causality in virtual interactions
ACM Transactions on Graphics (TOG) - SIGGRAPH 2012 Conference Proceedings
Interactive generation of reactive motions for virtual humans as they are hit, pushed and pulled is very important to many applications, such as computer games. In this paper, we propose a new method to simulate reactive motions during arbitrary bipedal activities, such as standing, walking or running. It is based on momentum-based inverse kinematics and motion blending. When generating the animation, the user first imports the primary motion to which the perturbation is to be applied. According to the condition of the impact, the system selects a reactive motion from a database of pre-captured stepping and reactive motions. It then blends the selected motion into the primary motion using momentum-based inverse kinematics. Since the reactive motions can be edited in real time, the criteria for motion search can be much more relaxed than in previous methods, which reduces the computational cost of the search. Using our method, it is possible to generate reactive motions by applying external perturbations to the characters at arbitrary moments while they are performing other actions. Copyright © 2005 John Wiley & Sons, Ltd.
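The select-then-blend pipeline described in the abstract can be sketched in miniature as follows. This is a hypothetical illustration, not the paper's implementation: the database entries, the scalar `impact_momentum` matching metric, and the linear joint-angle blend are all assumptions, and the momentum-based inverse-kinematics correction applied after blending is omitted.

```python
def select_reactive_motion(database, impact_momentum):
    """Pick the pre-captured reactive motion whose recorded impact
    momentum is closest to the applied perturbation. Because the
    selected motion is later edited in real time, this matching
    criterion can be loose (a coarse nearest-neighbour search)."""
    return min(database,
               key=lambda m: abs(m["impact_momentum"] - impact_momentum))

def blend(primary_pose, reactive_pose, weight):
    """Linear joint-angle blend of the selected reactive pose into the
    primary motion's pose. The paper additionally corrects the result
    with momentum-based inverse kinematics, which is not shown here."""
    return [p * (1.0 - weight) + r * weight
            for p, r in zip(primary_pose, reactive_pose)]

# Toy database: two reactive motions, each reduced to one pose of
# three joint angles (a real database stores full captured clips).
database = [
    {"impact_momentum": 2.0, "pose": [0.1, 0.2, 0.3]},
    {"impact_momentum": 8.0, "pose": [0.9, 0.8, 0.7]},
]

reaction = select_reactive_motion(database, impact_momentum=7.0)
blended = blend([0.0, 0.0, 0.0], reaction["pose"], weight=0.5)
```

Because the blend weight can be varied per frame, the reactive motion can be faded in at the arbitrary moment the perturbation occurs and faded back out as the character recovers.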