In virtual human (VH) applications, and in games in particular, motions with different functions must be synthesized: communicative and manipulative hand gestures, locomotion, and the expression of emotions or of the character's identity. In bodily behavior, the primary motions define the function, while the more subtle secondary motions contribute to realism and variability. From a technological point of view, several methods are at our disposal for motion synthesis: motion capture and retargeting, procedural kinematic animation, force-driven dynamic simulation, or the application of Perlin noise. Which method should be used to generate primary and secondary motions, and how can the information needed to define them be gathered? In this paper we elaborate on informed usage, in its two meanings. First we discuss, based on our own ongoing work, how motion capture data can be used to identify the joints involved in primary and secondary motions, and to provide a basis for specifying the essential parameters of the methods used to synthesize primary and secondary motion. Then we explore the possibility of using different methods for primary and secondary motion in parallel, in such a way that one method informs the other. We introduce our mixed use of kinematic and dynamic control of different body parts to animate a character in real time. Finally, we discuss the motion Turing test as a methodology for evaluating mixed motion paradigms.
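To illustrate the Perlin-noise approach to secondary motion mentioned above, the sketch below layers octaves of 1D gradient noise to perturb a base joint angle, producing subtle, non-repeating idle jitter. This is a minimal illustration, not the paper's implementation; all function names and parameters (`make_noise1d`, `joint_offset`, the octave count, the amplitude in degrees) are hypothetical choices for the example.

```python
import math
import random

def _fade(t):
    # Perlin's fade curve 6t^5 - 15t^4 + 10t^3: zero first/second
    # derivative at t = 0 and t = 1, so octaves blend smoothly.
    return t * t * t * (t * (t * 6 - 15) + 10)

def make_noise1d(seed=0):
    """Return a deterministic 1D gradient-noise function (hypothetical helper)."""
    rng = random.Random(seed)
    grads = [rng.uniform(-1.0, 1.0) for _ in range(256)]

    def noise(x):
        i = math.floor(x)
        f = x - i
        # Contributions of the two surrounding lattice gradients.
        v0 = grads[i % 256] * f
        v1 = grads[(i + 1) % 256] * (f - 1.0)
        # Smoothly interpolate between them.
        return v0 + (v1 - v0) * _fade(f)

    return noise

def joint_offset(noise, t, amplitude=2.0, octaves=3):
    """Sum octaves of noise at time t: an offset (in degrees, say)
    to add on top of a kinematically controlled base joint angle."""
    total, freq, amp = 0.0, 1.0, amplitude
    for _ in range(octaves):
        total += amp * noise(t * freq)
        freq *= 2.0   # each octave is finer ...
        amp *= 0.5    # ... and weaker than the last
    return total
```

In a mixed scheme of the kind discussed in the paper, such an offset could be added each frame to the joints identified as carrying only secondary motion, while the primary-motion joints remain under kinematic or dynamic control.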