Trajectories based descriptor for dynamic events annotation

  • Authors:
  • Nicolas Ballas;Bertrand Delezoide;Françoise Prêteux

  • Affiliations:
  • CEA, LIST, & CAOR MINES ParisTech, 92263 Fontenay-aux-Roses, France;CEA, LIST, 92263 Fontenay-aux-Roses, France;MINES ParisTech, 75272 Paris, France

  • Venue:
  • J-MRE '11 Proceedings of the 2011 joint ACM workshop on Modeling and representing events
  • Year:
  • 2011

Abstract

Video concept annotation is a challenging problem with strong implications for many applications such as video search and retrieval. In this work, we focus on the recognition of dynamic events (running, walking) in unconstrained videos. Feature trajectories have been shown to be an efficient video representation [23, 16, 17]. Trajectories are extracted by tracking interest points over several video frames, capturing both motion and appearance information. We take advantage of this representation to characterize dynamic concepts in our event recognition system. At the trajectory extraction step, we investigate a new trajectory filtering scheme that retains only trajectories with significant motion, which helps dynamic event recognition. We also propose two new trajectory-based descriptors. The first descriptor captures trajectory motion through its first-order statistics. The second descriptor is built on the derivative of the trajectory motion, making it invariant to uniform camera motion, such as the translation in a traveling shot. We evaluate our proposals on the HOHA dataset, a challenging dataset composed of clips extracted from Hollywood movies with significant camera motion.
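The abstract does not give the exact formulation, but the filtering step and the two descriptors can be sketched as follows. This is a minimal illustration, assuming a trajectory is a sequence of tracked 2D point positions; the function name `motion_descriptor`, the threshold `min_motion`, and the choice of mean and standard deviation as the first-order statistics are assumptions for illustration, not the authors' published formulation.

```python
import numpy as np

def motion_descriptor(traj, min_motion=5.0):
    """Sketch of the trajectory filtering step and the two descriptors.

    traj: (T, 2) array of (x, y) interest-point positions over T frames.
    Returns None when the trajectory is filtered out for lacking
    significant motion.
    """
    traj = np.asarray(traj, dtype=float)

    # Frame-to-frame displacement vectors: the trajectory's motion.
    disp = np.diff(traj, axis=0)                   # shape (T-1, 2)

    # Filtering step: drop near-static trajectories whose accumulated
    # displacement magnitude falls below a threshold.
    if np.sum(np.linalg.norm(disp, axis=1)) < min_motion:
        return None

    # Descriptor 1: first-order statistics (here mean and standard
    # deviation) of the motion vectors.
    desc_motion = np.concatenate([disp.mean(axis=0), disp.std(axis=0)])

    # Descriptor 2: statistics of the motion derivative (acceleration).
    # A uniform camera translation adds the same constant vector to
    # every displacement, so it cancels under differentiation.
    accel = np.diff(disp, axis=0)                  # shape (T-2, 2)
    desc_accel = np.concatenate([accel.mean(axis=0), accel.std(axis=0)])

    return desc_motion, desc_accel
```

As a sanity check of the invariance claim, adding a constant per-frame offset to every point of `traj` (simulating a uniform camera translation) shifts `desc_motion` but leaves `desc_accel` unchanged, since the constant offset disappears in the second difference.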