Motion synthesis for synchronizing with streaming music by segment-based search on metadata motion graphs

  • Authors:
  • Jianfeng Xu; Koichi Takagi; Shigeyuki Sakazawa

  • Affiliations:
  • Media Solutions Laboratory, KDDI R&D Laboratories Inc., Japan (all authors)

  • Venue:
  • ICME '11: Proceedings of the 2011 IEEE International Conference on Multimedia and Expo
  • Year:
  • 2011

Abstract

Music and dance are two major forms of entertainment in daily life, and the fact that people dance to music suggests the possibility of synchronizing human motion with music. In this paper, we present a novel system that automatically synthesizes human motion synchronized with streaming music, using both rhythm and intensity features. Beforehand, our system re-organizes a motion capture database into a novel graph-based representation with metadata (called a metadata motion graph), designed specifically for the streaming application. On receiving a certain amount of music data as a segment, the system searches for the best path for that segment on the metadata motion graph. This approach, whose effectiveness is demonstrated in a user study, composes motion segment by segment such that each segment (1) is synchronized with the music at the beat level within a sufficiently short period, (2) connects seamlessly with the previous segment, and (3) retains enough synchronization capacity for the remaining music, however long it is.
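
To make the segment-based search concrete, the following is a minimal Python sketch, not the authors' implementation: the node fields, the cost function (beat-count plus intensity mismatch), and all names (MotionNode, segment_cost, best_path) are hypothetical stand-ins for the paper's metadata motion graph and path search, and the real metadata additionally encodes whatever is needed to guarantee synchronization capacity for the remaining music.

```python
# Minimal sketch of segment-based best-path search on a motion graph.
# All names and cost terms are hypothetical illustrations; the paper's
# metadata motion graph stores precomputed motion features (e.g. beats,
# intensity) and connectivity between motion clips.

from dataclasses import dataclass, field
from typing import Dict, List, Tuple
import heapq
import math

@dataclass
class MotionNode:
    """A motion clip annotated with metadata (hypothetical fields)."""
    name: str
    beats: int          # number of motion beats in the clip
    intensity: float    # average motion intensity
    edges: List[str] = field(default_factory=list)  # clips it can connect to

def segment_cost(node: MotionNode, music_beats: int, music_intensity: float) -> float:
    """Penalize beat-count and intensity mismatch (illustrative cost only)."""
    return abs(node.beats - music_beats) + abs(node.intensity - music_intensity)

def best_path(graph: Dict[str, MotionNode],
              start: str,
              segment_features: List[Tuple[int, float]]) -> List[str]:
    """Dijkstra-style search for the cheapest clip sequence that covers one
    music segment per step, starting from the clip that ended the previously
    synthesized segment (so consecutive segments connect seamlessly)."""
    # State: (accumulated cost, current clip, index of next music segment, path)
    frontier = [(0.0, start, 0, [start])]
    best: Dict[Tuple[str, int], float] = {}
    while frontier:
        cost, name, i, path = heapq.heappop(frontier)
        if i == len(segment_features):
            return path
        if best.get((name, i), math.inf) <= cost:
            continue
        best[(name, i)] = cost
        beats, intensity = segment_features[i]
        for nxt in graph[name].edges:
            step = segment_cost(graph[nxt], beats, intensity)
            heapq.heappush(frontier, (cost + step, nxt, i + 1, path + [nxt]))
    return []  # no path covers all segments

if __name__ == "__main__":
    # Toy graph: three clips that can follow one another.
    graph = {
        "walk": MotionNode("walk", 4, 0.3, edges=["sway", "spin"]),
        "sway": MotionNode("sway", 4, 0.2, edges=["walk", "spin"]),
        "spin": MotionNode("spin", 8, 0.8, edges=["walk"]),
    }
    # Two incoming music segments as (beat count, intensity) features.
    segments = [(4, 0.25), (8, 0.7)]
    print(best_path(graph, "walk", segments))  # -> ['walk', 'sway', 'spin']
```

Starting each search from the clip that closed the previous segment is one plausible way to realize property (2) above; properties (1) and (3) would come from beat-level cost terms and from the graph's metadata, which this sketch only gestures at.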