Performance-based control interface for character animation

  • Authors:
  • Satoru Ishigaki; Timothy White; Victor B. Zordan; C. Karen Liu

  • Affiliations:
  • Georgia Tech; Georgia Tech; UC Riverside; Georgia Tech

  • Venue:
  • ACM SIGGRAPH 2009 papers
  • Year:
  • 2009

Abstract

Most game interfaces today are largely symbolic, translating simplified input such as keystrokes into the choreography of full-body character movement. In this paper, we describe a system that directly uses human motion performance to provide a radically different and much more expressive interface for controlling virtual characters. Our system takes a data feed from a motion capture system as input and, in real time, translates the performance into corresponding actions in a virtual world. The difficulty with such an approach arises from the need to manage the discrepancy between the real and virtual worlds, leading to two important subproblems: 1) recognizing the user's intention, and 2) simulating the appropriate action based on that intention and the virtual context. We solve this issue by first enabling the virtual world's designer to specify possible activities in terms of prominent features of the world, along with associated motion clips depicting interactions. We then integrate the prerecorded motions with the online performance and dynamic simulation to synthesize seamless interaction of the virtual character in the simulated world. The result is a flexible interface through which a user can make freeform control choices while the resulting character motion maintains both physical realism and the user's personal style.
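
The control loop the abstract outlines (capture a live frame, recognize the intended activity against designer-specified world features and example clips, then combine the prerecorded motion with the live performance) can be sketched roughly as follows. This is a minimal illustration under broad assumptions, not the authors' implementation: every name here (Pose, Activity, recognize_intent, blend) is hypothetical, and the nearest-pose intent classifier and linear blend are simple stand-ins for the paper's actual recognition and physics-based synthesis machinery.

```python
# Hypothetical sketch of a performance-driven control loop; all types and
# functions are illustrative, not the paper's actual API or algorithm.
from dataclasses import dataclass

@dataclass
class Pose:
    """Joint angles for one captured or synthesized frame."""
    joints: list[float]

@dataclass
class Activity:
    """A designer-specified interaction: a prominent world feature
    plus a prerecorded clip depicting the interaction."""
    name: str
    feature_position: tuple[float, float, float]
    clip: list[Pose]

def distance(p: Pose, q: Pose) -> float:
    """Sum of absolute joint-angle differences between two poses."""
    return sum(abs(a - b) for a, b in zip(p.joints, q.joints))

def recognize_intent(live: Pose, activities: list[Activity]) -> Activity:
    """Stand-in for intention recognition: pick the activity whose
    clip's first pose best matches the live performance."""
    return min(activities, key=lambda a: distance(live, a.clip[0]))

def blend(live: Pose, example: Pose, w: float) -> Pose:
    """Linearly blend the live performance with the prerecorded example;
    in the full system a dynamic simulation would further correct the
    result against the virtual environment."""
    return Pose([w * e + (1 - w) * l
                 for l, e in zip(live.joints, example.joints)])

# One tick of the interface: map a captured frame into a virtual action.
activities = [
    Activity("push_button", (1.0, 0.5, 0.0), [Pose([0.1, 0.2, 0.3])]),
    Activity("climb_ledge", (0.0, 2.0, 0.0), [Pose([0.9, 0.8, 0.7])]),
]
frame = Pose([0.15, 0.25, 0.35])   # one frame from the motion capture feed
intent = recognize_intent(frame, activities)
output = blend(frame, intent.clip[0], w=0.5)
print(intent.name, output.joints)
```

The key design point the abstract emphasizes survives even in this toy version: the live feed is never played back verbatim, nor is the clip; the output is a compromise between the two, which is what lets the character satisfy virtual-world constraints while retaining the performer's personal style.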