In this paper, we propose a novel dynamic texture description method based on spatiotemporal context phrases for general dynamic textures. Unlike existing methods, we exploit spatiotemporal context in both the feature extraction phase and the feature description phase. We present a space-time constraint and a saliency ranking strategy to extract representative interest points. We then propose a novel space-time context phrase method to mine and describe the semantic and spatiotemporal correlations among interest points. Finally, the space-time context phrases are used in a nearest-neighbor classifier to classify dynamic texture scenes. We evaluate our algorithm on dynamic texture classification and human action classification using the DynTex dataset and the KTH dataset, respectively. The results show that our proposed method outperforms state-of-the-art methods on both tasks.
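The pipeline sketched in the abstract (rank candidate interest points by saliency, pair nearby points into space-time "phrases", then classify with a nearest-neighbor rule) can be illustrated with a minimal sketch. The paper's actual saliency measure, phrase construction, and distance are not specified here, so the quantized-offset histogram and L1 distance below are illustrative assumptions, not the authors' method:

```python
import numpy as np

def top_salient_points(points, saliency, k):
    # Assumed ranking step: keep the k most salient candidate interest points.
    order = np.argsort(saliency)[::-1][:k]
    return points[order]

def phrase_histogram(points, n_bins=4, radius=8.0):
    # Stand-in for the context-phrase descriptor: quantize the pairwise
    # (dx, dy, dt) offsets between nearby interest points into a histogram.
    hist = np.zeros(n_bins ** 3)
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = points[j] - points[i]
            if np.all(np.abs(d) <= radius):
                # Map each offset component into one of n_bins bins.
                b = np.clip(((d + radius) / (2 * radius) * n_bins).astype(int),
                            0, n_bins - 1)
                hist[b[0] * n_bins * n_bins + b[1] * n_bins + b[2]] += 1
    total = hist.sum()
    return hist / total if total else hist

def nearest_neighbor(query_hist, train_hists, train_labels):
    # 1-NN classification by L1 distance between phrase histograms.
    dists = [np.abs(query_hist - h).sum() for h in train_hists]
    return train_labels[int(np.argmin(dists))]
```

A sequence with tightly clustered interest points and one with widely spread points produce distinct phrase histograms, so a query resembling the first is assigned its label by the nearest-neighbor rule.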