Trajectory-based representation of human actions

  • Authors:
  • Antonios Oikonomopoulos;Ioannis Patras;Maja Pantic;Nikos Paragios

  • Affiliations:
  • Imperial College London, London, UK;Department of Electronic Engineering, Queen Mary University, London, UK;Imperial College London, London, UK;Ecole Centrale de Paris, Chatenay-Malabry, France

  • Venue:
  • ICMI'06/IJCAI'07 Proceedings of the ICMI 2006 and IJCAI 2007 international conference on Artificial intelligence for human computing
  • Year:
  • 2007


Abstract

This work addresses the problem of human action recognition by introducing a representation of a human action as a collection of short trajectories that are extracted in areas of the scene with a significant amount of visual activity. The trajectories are extracted by an auxiliary particle filtering tracking scheme that is initialized at points considered salient both in space and time. The spatiotemporal salient points are detected by measuring the variations in the information content of pixel neighborhoods in space and time. We implement an online background estimation algorithm in order to deal with inadequate localization of the salient points on the moving parts of the scene, and to improve the overall performance of the particle filter tracking scheme. We use a variant of the Longest Common Subsequence (LCSS) algorithm to compare different sets of trajectories corresponding to different actions, and Relevance Vector Machines (RVM) to address the classification problem. We propose new kernels for use by the RVM, specifically tailored to the proposed representation of short trajectories; the basis of these kernels is the modified LCSS distance of the previous step. We present results on real image sequences from a small database depicting people performing 12 aerobic exercises.
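The trajectory comparison described above can be illustrated with a minimal sketch of a normalized LCSS distance between two 2D trajectories. This is not the authors' exact variant: the spatial threshold `eps`, the temporal matching window `delta`, and the normalization by the shorter trajectory's length are common choices from the trajectory-similarity literature, assumed here for illustration.

```python
import numpy as np

def lcss_length(t1, t2, eps=0.1, delta=5):
    """Length of the Longest Common Subsequence between two trajectories
    (arrays of 2D points). Two points match if their time indices are
    within `delta` of each other and both coordinates differ by at most
    `eps`. Standard O(n*m) dynamic program."""
    n, m = len(t1), len(t2)
    dp = np.zeros((n + 1, m + 1), dtype=int)
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if abs(i - j) <= delta and np.all(np.abs(t1[i - 1] - t2[j - 1]) <= eps):
                dp[i, j] = dp[i - 1, j - 1] + 1  # points match: extend the subsequence
            else:
                dp[i, j] = max(dp[i - 1, j], dp[i, j - 1])
    return int(dp[n, m])

def lcss_distance(t1, t2, eps=0.1, delta=5):
    """Normalized LCSS distance in [0, 1]; 0 means the shorter trajectory
    is entirely matched, 1 means no points match."""
    return 1.0 - lcss_length(t1, t2, eps, delta) / min(len(t1), len(t2))
```

A kernel over trajectories for use by an RVM could then be built from such distances, e.g. `k(a, b) = exp(-lcss_distance(a, b))`; the paper's exact kernel construction over *sets* of trajectories is not reproduced here.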