Interactive generation of dancing animation with music synchronization
SIGGRAPH Asia 2011 Posters
Music and dance are two major forms of entertainment in daily life, and the fact that people dance to music suggests the possibility of synchronizing human motion with music. In this paper, we present a novel system that automatically synthesizes human motion synchronized with streaming music using both rhythm and intensity features. Our system first reorganizes a motion capture database into a novel graph-based representation with metadata (called a metadata motion graph), designed specifically for streaming applications. Whenever enough music data has arrived to form a segment, the system searches the metadata motion graph for the best path for that segment. This approach, whose effectiveness is demonstrated in a user study, composes motion segment by segment such that each segment (1) is synchronized with the music at the beat level within a sufficiently short period, (2) connects seamlessly with the previous segment, and (3) retains enough synchronization capacity for the remaining music, however long it is.
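The segment-by-segment search described in the abstract can be sketched as follows. This is a minimal illustrative assumption, not the paper's actual implementation: the toy graph, the node names, and the `best_path` helper are hypothetical, and real metadata motion graphs carry additional metadata (e.g. reachability information guaranteeing synchronization capacity for the remaining music).

```python
# Hypothetical sketch: segment-wise lowest-cost path search on a motion
# graph, where each graph edge is assumed to span one musical beat.
# The graph, costs, and segment lengths below are illustrative only.

def best_path(graph, start, num_beats):
    """Exhaustively search for the lowest-cost path of `num_beats` edges
    starting at `start` (one edge per beat of the music segment)."""
    best = (float("inf"), [start])
    stack = [(0.0, [start])]
    while stack:
        cost, path = stack.pop()
        if len(path) - 1 == num_beats:
            if cost < best[0]:
                best = (cost, path)
            continue
        for nxt, edge_cost in graph.get(path[-1], []):
            stack.append((cost + edge_cost, path + [nxt]))
    return best

# Toy motion graph: node -> [(next_node, transition_cost), ...]
graph = {
    "idle": [("step", 1.0), ("sway", 0.5)],
    "step": [("turn", 0.3), ("idle", 0.8)],
    "sway": [("step", 0.2), ("sway", 0.4)],
    "turn": [("idle", 0.6)],
}

# Process streaming music segment by segment: each incoming segment
# contributes a fixed number of beats, and each new search resumes from
# the node where the previous segment ended, so consecutive segments
# connect seamlessly.
node = "idle"
motion = [node]
for segment_beats in [2, 3]:  # two incoming music segments
    cost, path = best_path(graph, node, segment_beats)
    motion.extend(path[1:])   # append the new segment's motion
    node = path[-1]           # next segment starts here
```

The exhaustive search stands in for whatever optimization the system uses; in a streaming setting, the key property illustrated is that each per-segment search is seeded with the end state of the previous segment.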