Activity recognition using dynamic subspace angles

  • Authors:
  • Binlong Li, M. Ayazoglu, T. Mao, O. I. Camps, M. Sznaier

  • Affiliations:
  • Dept. of Electr. & Comput. Eng., Northeastern Univ., Boston, MA, USA (all authors)

  • Venue:
  • CVPR '11 Proceedings of the 2011 IEEE Conference on Computer Vision and Pattern Recognition
  • Year:
  • 2011


Abstract

Cameras are ubiquitous and hold the promise of significantly changing the way we live and interact with our environment. Human activity recognition is central to understanding dynamic scenes for applications ranging from security surveillance, to assisted living for the elderly, to video gaming without controllers. Most current approaches to this problem rely on local spatio-temporal features, which limits their ability to recognize long and complex actions. In this paper, we propose a new approach that exploits the temporal information encoded in the data. The main idea is to model activities as the outputs of unknown dynamic systems evolving from unknown initial conditions. Under this framework, we show that activity videos can be compared by computing the principal angles between subspaces representing activity types, which are found by a simple SVD of the experimental data. The proposed approach outperforms state-of-the-art methods at classifying activities in the KTH dataset as well as in much more complex scenarios involving interacting actors.
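The core comparison step described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' pipeline: it assumes each activity is summarized by a data matrix (e.g., stacked feature trajectories over time), extracts an orthonormal subspace basis via SVD, and measures similarity through principal angles, whose cosines are the singular values of the product of the two bases. The function and variable names are my own.

```python
import numpy as np

def subspace_basis(data, rank):
    """Orthonormal basis for the dominant `rank`-dimensional column
    subspace of a data matrix, obtained from its SVD."""
    U, _, _ = np.linalg.svd(data, full_matrices=False)
    return U[:, :rank]

def principal_angles(A, B):
    """Principal angles (radians) between the subspaces spanned by the
    orthonormal bases A and B: the singular values of A^T B are the
    cosines of the angles."""
    cosines = np.linalg.svd(A.T @ B, compute_uv=False)
    cosines = np.clip(cosines, -1.0, 1.0)  # guard against round-off
    return np.arccos(cosines)

# Toy check: a subspace has zero angle with itself, and orthogonal
# subspaces meet at 90 degrees.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 8))       # hypothetical activity data matrix
A = subspace_basis(X, 3)
same = principal_angles(A, A)          # all ~0

E1 = np.eye(4)[:, :2]                  # span{e1, e2}
E2 = np.eye(4)[:, 2:]                  # span{e3, e4}
perp = principal_angles(E1, E2)        # all ~pi/2
```

A nearest-neighbor classifier over such angles (e.g., using the largest principal angle as a distance) is one simple way to turn this comparison into an activity label.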