Joint Recognition of Complex Events and Track Matching

  • Authors:
  • Michael T. Chan, Anthony Hoogs, Rahul Bhotika, Amitha Perera, John Schmiederer, Gianfranco Doretto

  • Affiliations:
  • GE Global Research, One Research Circle, NY (all authors)

  • Venue:
  • CVPR '06 Proceedings of the 2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition - Volume 2
  • Year:
  • 2006

Abstract

We present a novel method for jointly performing recognition of complex events and linking fragmented tracks into coherent, long-duration tracks. Many event recognition methods require highly accurate tracking and may fail when tracks corresponding to event actors are fragmented or partially missing. However, these conditions frequently arise from occlusions, traffic, and tracking errors. Recently, methods have been proposed for linking track fragments from multiple objects under these difficult conditions. Here, we develop a method for solving these two problems jointly. A hypothesized event model, represented as a Dynamic Bayes Net, supplies data-driven constraints on the likelihood of proposed track-fragment matches. These event-guided constraints are combined with the appearance and kinematic constraints used in the previous track-linking formulation. The result is the most likely track-linking solution given the event model, and the highest event score given all of the track fragments. The event model with the highest score is determined to have occurred if its score exceeds a threshold. Results on a busy scene of airplane servicing activities, containing many non-event movers and long fragmented tracks, show the promise of the approach to solving the joint problem.
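The joint idea in the abstract can be sketched in miniature: score each candidate link between two track fragments by combining appearance similarity, kinematic consistency at the gap, and an event-model term, then pick the linking with the highest total score and declare the event if that score clears a threshold. Everything below is an illustrative assumption, not the authors' actual formulation: the fragment fields, the weights, the stand-in `event_model` function (a placeholder for the Dynamic Bayes Net evaluation), and the brute-force matcher (the paper's optimization is not reproduced here).

```python
# Hedged toy sketch: joint track-fragment linking and event scoring.
# All names, fields, and weights are illustrative assumptions.
from itertools import permutations

def link_score(frag_a, frag_b, event_model, w=(1.0, 1.0, 1.0)):
    """Combine appearance, kinematic, and event-driven evidence for
    linking fragment A's end to fragment B's start (higher is better)."""
    # Appearance constraint: penalize appearance mismatch.
    appearance = -abs(frag_a["appearance"] - frag_b["appearance"])
    # Kinematic constraint: penalize velocity mismatch across the gap.
    kinematic = -abs(frag_a["end_velocity"] - frag_b["start_velocity"])
    # Event-model term: how well the linked track fits the hypothesized
    # event (stand-in for evaluating the Dynamic Bayes Net).
    event = event_model(frag_a, frag_b)
    return w[0] * appearance + w[1] * kinematic + w[2] * event

def best_linking(head_frags, tail_frags, event_model, threshold=0.0):
    """Exhaustively match track-fragment heads to tails (toy scale only).
    Returns the best assignment, its total score, and whether the
    hypothesized event is declared to have occurred."""
    best_score, best_assignment = float("-inf"), None
    for perm in permutations(tail_frags):
        total = sum(link_score(h, t, event_model)
                    for h, t in zip(head_frags, perm))
        if total > best_score:
            best_score, best_assignment = total, list(zip(head_frags, perm))
    return best_assignment, best_score, best_score > threshold
```

A usage example with two fragmented tracks and a neutral event model: fragments with matching appearance and continuous velocity should be linked together, and the swapped pairing should lose because both its appearance and kinematic terms are worse.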