Interactive partner control in close interactions for real-time applications

  • Authors: Edmond S. L. Ho; Jacky C. P. Chan; Taku Komura; Howard Leung

  • Affiliations: Hong Kong Baptist University, Kowloon, Hong Kong; City University of Hong Kong, Kowloon, Hong Kong; University of Edinburgh, UK; City University of Hong Kong, Kowloon, Hong Kong

  • Venue: ACM Transactions on Multimedia Computing, Communications, and Applications (TOMCCAP)
  • Year: 2013

Abstract

This article presents a new framework for synthesizing the motion of a virtual character in real time in response to the actions of a user-controlled character. In particular, the proposed method handles scenes in which the characters interact closely, such as partner dancing and fighting. In such interactions, automatically coordinating the virtual character with the human player is extremely difficult because the system has to predict the intention of the player character. In addition, style variations across users reduce the accuracy of recognizing the player character's movements when determining the virtual character's responses. To solve these problems, our framework makes use of a spatial-relationship-based representation of the body parts called the interaction mesh, which has proven effective for motion adaptation. The method is computationally efficient, enabling real-time character control for interactive applications. We demonstrate its effectiveness and versatility by synthesizing a wide variety of motions involving close interactions.
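
In the original interaction-mesh formulation, the mesh is built by Delaunay tetrahedralization over the joint positions of both characters, and the spatial relationships are encoded as the mesh's Laplacian coordinates, whose deformation is then minimized during adaptation. The sketch below illustrates that construction under simplifying assumptions; the function names, the toy five-joint characters, and the uniform Laplacian weights are illustrative choices, not the authors' implementation.

```python
# Minimal sketch: build an interaction mesh over the combined joints of
# two closely interacting characters and compute uniform-weight Laplacian
# coordinates. Joint data here is hypothetical placeholder input.
import numpy as np
from scipy.spatial import Delaunay

def build_interaction_mesh(joints_a, joints_b):
    """Tetrahedralize the union of both characters' joint positions."""
    points = np.vstack([joints_a, joints_b])  # (N, 3) joint positions
    return points, Delaunay(points)

def laplacian_coordinates(points, mesh):
    """Laplacian of each vertex w.r.t. its mesh neighbours (uniform weights)."""
    neighbors = [set() for _ in range(len(points))]
    for simplex in mesh.simplices:          # each simplex is a tetrahedron
        for i in simplex:
            neighbors[i].update(j for j in simplex if j != i)
    lap = np.empty_like(points)
    for i, nbrs in enumerate(neighbors):
        lap[i] = points[i] - points[list(nbrs)].mean(axis=0)
    return lap

# Toy example: two 5-joint "characters" standing close together.
rng = np.random.default_rng(0)
char_a = rng.normal(loc=(0.0, 1.0, 0.0), scale=0.3, size=(5, 3))
char_b = rng.normal(loc=(0.5, 1.0, 0.0), scale=0.3, size=(5, 3))
pts, mesh = build_interaction_mesh(char_a, char_b)
delta = laplacian_coordinates(pts, mesh)
print(delta.shape)  # (10, 3): one Laplacian coordinate per joint
```

Because the Laplacian coordinates capture each joint's position relative to its spatial neighbours on both bodies, keeping them close to their reference values as the player moves preserves the characters' relative arrangement, which is what makes the representation suited to close-interaction adaptation.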