Local space-time features can be used to detect and characterize motion events in video. Such features also support the recognition of motion patterns: a vocabulary of primitive features is defined, and each video sequence is represented as a histogram over that vocabulary. In this paper, we propose a supervised vocabulary computation technique based on the prior classification of the training events into classes, where each class corresponds to a human action. We compare the performance of our method against the global approach and show that it not only obtains better results but is also computationally less expensive.
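The pipeline described above — clustering the local features of each action class separately into a per-class vocabulary, concatenating the per-class words, and then encoding each sequence as a normalized histogram of nearest-word assignments — can be sketched as follows. This is a minimal NumPy sketch under stated assumptions: the simple k-means routine, the function names, and the parameter choices are illustrative and are not the authors' implementation.

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Basic k-means: returns k cluster centers for feature matrix X (n, d).
    Illustrative stand-in for whatever clustering the paper actually uses."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        # squared Euclidean distance from every feature to every center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = d.argmin(axis=1)
        for j in range(k):
            members = X[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers

def class_vocabulary(features_by_class, words_per_class):
    """Supervised vocabulary: cluster each action class's features
    separately, then stack the resulting words into one codebook."""
    return np.vstack([kmeans(F, words_per_class) for F in features_by_class])

def sequence_histogram(features, vocab):
    """Encode a video sequence (its local features) as a normalized
    histogram of nearest-vocabulary-word assignments."""
    d = ((features[:, None, :] - vocab[None, :, :]) ** 2).sum(axis=-1)
    counts = np.bincount(d.argmin(axis=1), minlength=len(vocab))
    return counts / counts.sum()
```

A global vocabulary would instead run one clustering over the pooled features of all classes; the per-class variant clusters several smaller sets, which is where the computational saving mentioned above comes from.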