Multi-agent event recognition by preservation of spatiotemporal relationships between probabilistic models

  • Authors:
  • S. Khokhar; I. Saleemi; M. Shah

  • Venue:
  • Image and Vision Computing
  • Year:
  • 2013

Abstract

We present a new method for multi-agent activity analysis and recognition that uses low-level motion features and exploits the inherent structure and recurrence of motion present in multi-agent activity scenarios. Our representation is motivated by the need to circumvent the difficult problem of tracking in multi-agent scenarios and by the observation that, for many visual multi-agent recognition tasks, the spatiotemporal description of events irrespective of agent identity is sufficient for activity classification. We begin by learning generative models describing the motion induced by individual actors or groups, which are considered to be agents. These models are Gaussian mixture distributions learned by linking clusters of optical flow to obtain contiguous regions of locally coherent motion. These possibly overlapping regions or segments, known as motion patterns, are then used to analyze a scene by estimating their spatial and temporal relationships. The geometric transformation between two patterns is obtained by iteratively warping one pattern onto the other, whereas the temporal relationships are obtained from their relative times of occurrence within videos. The motion segments and their spatiotemporal relationships are represented as a graph, where the nodes are the statistical distributions and the edges carry, as attributes, the geometric transformations between motion patterns mapped to Lie space. Two activity instances are then compared by estimating the cost of attributed inexact graph matching. We demonstrate the application of our framework in the analysis of American football plays, a typical multi-agent activity. The performance analysis of our method shows that it is feasible and easily generalizable.
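
To make the first step concrete, the following is a minimal sketch (not the authors' implementation) of modeling locally coherent motion with a Gaussian mixture over optical-flow features. It assumes two consecutive grayscale frames `prev` and `curr` as NumPy arrays, uses Farneback optical flow as a stand-in flow estimator, and the component count and motion threshold are illustrative parameters.

```python
# Sketch: Gaussian mixture over pixel position and optical-flow features,
# giving candidate regions of locally coherent motion ("motion patterns").
import numpy as np
import cv2
from sklearn.mixture import GaussianMixture

def fit_motion_patterns(prev, curr, n_components=8, mag_thresh=1.0):
    # Dense optical flow between the two frames (Farneback as a stand-in).
    flow = cv2.calcOpticalFlowFarneback(prev, curr, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev.shape
    ys, xs = np.mgrid[0:h, 0:w]
    u, v = flow[..., 0], flow[..., 1]

    # Keep only pixels with significant motion.
    mask = np.hypot(u, v) > mag_thresh
    # One feature vector per moving pixel: (x, y, u, v).
    X = np.stack([xs[mask], ys[mask], u[mask], v[mask]], axis=1).astype(np.float64)

    # Each mixture component is a candidate locally coherent motion region.
    gmm = GaussianMixture(n_components=n_components, covariance_type='full',
                          random_state=0).fit(X)
    return gmm, X
```

The abstract also mentions representing the geometric transformations between motion patterns in Lie space. A common way to do this, shown below as an illustrative sketch rather than the paper's exact formulation, is to write a planar warp as a 3x3 homogeneous matrix, map it to the Lie algebra with the matrix logarithm, and compare transforms with a log-Euclidean (Frobenius) distance. The function names and example transforms are assumptions for illustration.

```python
import numpy as np
from scipy.linalg import logm

def to_lie_algebra(T):
    """Map a 3x3 homogeneous transform to its Lie-algebra element via logm."""
    return np.real(logm(T))

def transform_distance(T1, T2):
    """Log-Euclidean distance between two transforms."""
    return np.linalg.norm(to_lie_algebra(T1) - to_lie_algebra(T2), 'fro')

# Example: a small rotation plus translation vs. a pure translation.
theta = np.deg2rad(10.0)
T_rot = np.array([[np.cos(theta), -np.sin(theta), 2.0],
                  [np.sin(theta),  np.cos(theta), 1.0],
                  [0.0,            0.0,           1.0]])
T_shift = np.array([[1.0, 0.0, 2.0],
                    [0.0, 1.0, 1.0],
                    [0.0, 0.0, 1.0]])
print(transform_distance(T_rot, T_shift))
```

Finally, the comparison of two activity instances is cast as attributed inexact graph matching. The sketch below illustrates one plausible form of such a cost under a candidate node correspondence, mixing node and edge dissimilarities with a fixed penalty for unmatched edges; the data layout, cost functions, weights, and penalty value are all assumptions, not the authors' specification.

```python
import numpy as np

def matching_cost(G1, G2, mapping, node_cost, edge_cost, alpha=0.5):
    """G1, G2: dicts with 'nodes' {id: attr} and 'edges' {(i, j): attr}.
    mapping: dict from G1 node ids to G2 node ids (candidate assignment)."""
    # Node term: dissimilarity between matched motion-pattern distributions.
    c_nodes = sum(node_cost(G1['nodes'][i], G2['nodes'][mapping[i]])
                  for i in mapping)
    # Edge term: dissimilarity between Lie-space transform attributes.
    c_edges = 0.0
    for (i, j), a1 in G1['edges'].items():
        if i in mapping and j in mapping:
            a2 = G2['edges'].get((mapping[i], mapping[j]))
            # Unmatched edges incur a fixed penalty (inexact matching).
            c_edges += edge_cost(a1, a2) if a2 is not None else 1.0
    return alpha * c_nodes + (1.0 - alpha) * c_edges
```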
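In the Lie-space sketch above, `transform_distance` could serve as the edge-attribute dissimilarity `edge_cost` in the matching-cost sketch, which is the role the abstract assigns to the Lie-space edge attributes.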
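Minimizing such a cost over candidate correspondences is what the abstract refers to as estimating the cost of attributed inexact graph matching; the specific search strategy is not described here.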