Due to the popularity of motion capture data in applications such as games, movies, and virtual environments, huge collections of motion capture data are now available. It is becoming important to store these data in compressed form while still being able to retrieve them without much overhead. However, little work addresses both issues together. In this paper, we address these two issues by proposing a novel database architecture. First, we propose a lossless compression algorithm for motion clips, based on a novel Alpha Parallelogram Predictor (APP) that estimates each degree of freedom (DOF) of a child joint from its immediate neighbors and parents that have already been processed. Second, we propose storing selected eigenvalues and eigenvectors of each motion clip, which require only a small amount of memory overhead, for faster filtering of irrelevant motions. With this architecture, a real-time query becomes a three-step process. In the first two steps, a two-level indexing structure quickly filters the database down to the relevant motion clips. In the third step, only a small number of candidate clips are decompressed and accurately matched with a Dynamic Time Warping (DTW) algorithm. Our results show that users can efficiently search for clips in this losslessly compressed motion database.
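The third step of the query process relies on Dynamic Time Warping. The following is a minimal sketch of the classic DTW algorithm, not the paper's own implementation; it assumes each motion clip is represented as a NumPy array of shape (num_frames, num_dofs), with Euclidean distance between per-frame DOF vectors as the local cost.

```python
import numpy as np

def dtw_distance(a: np.ndarray, b: np.ndarray) -> float:
    """Classic DTW distance between two motion clips (illustrative sketch).

    a, b: arrays of shape (num_frames, num_dofs) holding per-frame
    joint DOF values. These names and the distance choice are
    assumptions for illustration, not from the paper.
    """
    n, m = len(a), len(b)
    # cost[i, j] = minimal accumulated cost aligning a[:i] with b[:j]
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])  # frame-to-frame distance
            cost[i, j] = d + min(cost[i - 1, j],      # skip a frame of a
                                 cost[i, j - 1],      # skip a frame of b
                                 cost[i - 1, j - 1])  # match both frames
    return float(cost[n, m])
```

Because DTW tolerates local timing differences, two clips of the same behavior performed at different speeds can still match closely; this is why it is reserved for the small candidate set that survives the two filtering steps, since its cost is quadratic in clip length.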