People tracking across two distant self-calibrated cameras

  • Authors:
  • Roman Pflugfelder; Horst Bischof

  • Affiliations:
  • Video and Safety Technology, Austrian Research Centers GmbH - ARC, A-1220 Vienna, Austria; Institute for Computer Graphics and Vision, Graz University of Technology, A-8010 Graz, Austria

  • Venue:
  • AVSS '07: Proceedings of the 2007 IEEE Conference on Advanced Video and Signal Based Surveillance
  • Year:
  • 2007

Abstract

People tracking is of fundamental importance in multi-camera surveillance systems, and many approaches to multi-camera tracking have been proposed in recent years. Most methods use image features, the geometric relation between the cameras, or both as a cue. Knowing the geometry is particularly desirable for distant cameras, because geometry is unaffected by, for example, drastic changes in object appearance or scene illumination. However, determining the camera geometry is cumbersome. This paper addresses that problem and contributes in two ways. First, an approach is presented that calibrates two distant cameras automatically. We continue previous work and focus especially on the calibration of the extrinsic parameters, using point correspondences acquired by detecting points on top of people's heads. Second, qualitative experimental results on the PETS 2006 benchmark data show that the self-calibration is accurate enough for a purely geometric tracking of people across distant cameras, a setting in which reliable features for appearance matching are hardly available.
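The extrinsic-calibration step rests on point correspondences between the two views, here obtained from detected head tops. The abstract does not spell out the estimator, so the following is only a minimal sketch of the standard normalized eight-point algorithm for recovering the epipolar geometry from such correspondences; the synthetic "head" points, the camera matrices, and the intrinsics `K` are illustrative assumptions, not the authors' setup.

```python
import numpy as np

def normalize_points(pts):
    """Hartley normalization: shift to centroid, scale mean distance to sqrt(2)."""
    centroid = pts.mean(axis=0)
    mean_dist = np.linalg.norm(pts - centroid, axis=1).mean()
    s = np.sqrt(2) / mean_dist
    T = np.array([[s, 0.0, -s * centroid[0]],
                  [0.0, s, -s * centroid[1]],
                  [0.0, 0.0, 1.0]])
    pts_h = np.column_stack([pts, np.ones(len(pts))])
    return (T @ pts_h.T).T, T

def eight_point(pts1, pts2):
    """Estimate the fundamental matrix F (x2^T F x1 = 0) from >= 8 correspondences."""
    p1, T1 = normalize_points(pts1)
    p2, T2 = normalize_points(pts2)
    # Each correspondence contributes one row of the homogeneous system A f = 0.
    A = np.column_stack([
        p2[:, 0] * p1[:, 0], p2[:, 0] * p1[:, 1], p2[:, 0],
        p2[:, 1] * p1[:, 0], p2[:, 1] * p1[:, 1], p2[:, 1],
        p1[:, 0],            p1[:, 1],            np.ones(len(p1)),
    ])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce rank 2: a valid fundamental matrix is singular.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    F = U @ np.diag(S) @ Vt
    return T2.T @ F @ T1  # undo the normalization

# Synthetic check: two cameras observing random "head top" points in 3D
# (hypothetical stand-in for detections from the two views).
rng = np.random.default_rng(0)
heads = rng.uniform(-1.0, 1.0, size=(20, 3)) + np.array([0.0, 0.0, 5.0])
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                  # reference camera
P2 = K @ np.hstack([np.eye(3), np.array([[1.0], [0.0], [0.0]])])   # translated camera
Xh = np.column_stack([heads, np.ones(len(heads))])
x1 = (P1 @ Xh.T).T
x1 = x1[:, :2] / x1[:, 2:]
x2 = (P2 @ Xh.T).T
x2 = x2[:, :2] / x2[:, 2:]
F = eight_point(x1, x2)
# Epipolar residuals should be ~0 for noise-free correspondences.
x1h = np.column_stack([x1, np.ones(len(x1))])
x2h = np.column_stack([x2, np.ones(len(x2))])
residuals = np.abs(np.einsum('ij,jk,ik->i', x2h, F, x1h)) / np.abs(F).max()
```

With the intrinsics known, the essential matrix and hence the extrinsic rotation and translation (up to scale) can be decomposed from `F`; in the paper's setting the correspondences would come from synchronized head-top detections rather than synthetic points.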