Interactive generation of dancing animation with music synchronization

  • Authors:
  • Jianfeng Xu;Koichi Takagi;Shigeyuki Sakazawa

  • Affiliations:
  • KDDI R&D Laboratories Inc.;KDDI R&D Laboratories Inc.;KDDI R&D Laboratories Inc.

  • Venue:
  • SIGGRAPH Asia 2011 Posters
  • Year:
  • 2011

Abstract

With the explosive growth of user-generated content (UGC) on sites such as YouTube®, computer animation synchronized with music is in great demand as a new type of UGC. For example, in Japan it is popular to create such animation manually with free software called MikuMikuDance. However, producing even a short piece of animation is time-consuming and requires specialized knowledge, which is a serious obstacle for UGC. On the other hand, several automatic systems have been reported, such as [Xu et al. 2011], in which dancing animation is synchronized with the input music using rhythm and intensity features. However, automatically generated animation cannot reflect a user's intention to assign a specific dance motion (called a performance motion) to a specific part of the music; thus the user's unique aesthetic preferences, which are essential for UGC, are not satisfied.
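To make the idea of synchronization by rhythm and intensity features concrete, the following is a minimal illustrative sketch, not the authors' implementation: it assumes a constant-tempo beat grid, a hypothetical `MotionClip` record carrying a per-clip beat count and an intensity score, and a greedy scheduler that picks the clip whose intensity best matches a target and time-scales it so its motion beats land on the music beats. All names and data are invented for illustration.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class MotionClip:
    name: str
    duration_s: float   # native playback length in seconds
    beats: int          # number of motion beats (pose accents) in the clip
    intensity: float    # average motion intensity, normalized to 0..1

def music_beat_times(bpm: float, n_beats: int) -> List[float]:
    """Synthesize evenly spaced beat times for a constant-tempo track.
    A real system would extract these from the audio instead."""
    period = 60.0 / bpm
    return [i * period for i in range(n_beats)]

def schedule(clips: List[MotionClip], beats: List[float],
             target_intensity: float) -> List[Tuple[float, str, float]]:
    """Greedily fill the beat grid: at each position pick the clip whose
    intensity best matches the target, then compute the playback-rate
    factor that makes its motion beats coincide with the music beats."""
    timeline, i = [], 0
    while i + 1 < len(beats):
        clip = min(clips, key=lambda c: abs(c.intensity - target_intensity))
        end = min(i + clip.beats, len(beats) - 1)
        span = beats[end] - beats[i]          # music time the clip must fill
        scale = span / clip.duration_s        # time-scaling for sync
        timeline.append((beats[i], clip.name, round(scale, 3)))
        i = end
    return timeline
```

For example, with a 120 BPM beat grid and a high target intensity, the scheduler repeatedly selects the most energetic clip and stretches or compresses each instance to the available beat span; interactive control of the kind the poster proposes could then be modeled by letting the user pin a chosen performance motion to particular beat positions before the greedy fill runs.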