PTZ camera calibration for augmented virtual environments

  • Authors:
  • Lu Wang, Suya You, Ulrich Neumann

  • Affiliations:
  • Computer Science Department, University of Southern California (all authors)

  • Venue:
  • ICME'09: Proceedings of the 2009 IEEE International Conference on Multimedia and Expo
  • Year:
  • 2009

Abstract

Augmented Virtual Environments (AVE) are effective for surveillance applications, in which multiple video streams are projected onto a 3D urban model for better visualization and comprehension of dynamic scenes. One of the key issues in creating such systems is estimating the parameters of each camera, including its intrinsic parameters and its pose relative to the 3D model. PTZ cameras are now popular in such applications, yet how to rapidly calibrate them at an arbitrary pan-tilt-zoom setting is not clear in the literature. We propose an efficient two-step approach. In the first step, panoramic images are generated at a set of zoom levels; the images composing these panoramas are calibrated and stored in a database. In the second step, an image is acquired at an arbitrary PTZ setting, its best-matching image in the database is found using an efficient local-feature recognition technique, and from that match the camera parameters at the new PTZ setting are estimated.
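
The following is a minimal sketch of how the second step could be realized, assuming a rotation-only PTZ model and SIFT feature matching; the class and function names (ReferenceImage, best_match, estimate_parameters), the Lowe ratio threshold of 0.75, and the approximation that the query intrinsics are close to those of the matched reference are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: calibrate a query PTZ view by matching it against a
# database of pre-calibrated panorama images (assumed names and parameters).
import cv2
import numpy as np


class ReferenceImage:
    """One calibrated image from the pre-built panorama database."""

    def __init__(self, image, K, R):
        self.K = K  # 3x3 intrinsic matrix at this zoom level
        self.R = R  # rotation of this view relative to the 3D model
        sift = cv2.SIFT_create()
        self.keypoints, self.descriptors = sift.detectAndCompute(image, None)


def best_match(query_desc, database):
    """Find the reference image sharing the most good feature matches."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    best_ref, best_matches = None, []
    for ref in database:
        knn = matcher.knnMatch(query_desc, ref.descriptors, k=2)
        good = [m[0] for m in knn
                if len(m) == 2 and m[0].distance < 0.75 * m[1].distance]
        if len(good) > len(best_matches):
            best_ref, best_matches = ref, good
    return best_ref, best_matches


def estimate_parameters(query_image, database):
    """Estimate intrinsics and pose of the query view from its best match.

    For a camera rotating about its center, the query and reference views are
    related by a homography H ~ K_query * R_rel * K_ref^{-1}.
    """
    sift = cv2.SIFT_create()
    q_kp, q_desc = sift.detectAndCompute(query_image, None)
    ref, matches = best_match(q_desc, database)

    src = np.float32([ref.keypoints[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([q_kp[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)

    # Assumption: the query zoom is close to the matched reference zoom, so
    # K_query ~ K_ref and the relative rotation is R_rel ~ K_ref^{-1} H K_ref.
    R_rel = np.linalg.inv(ref.K) @ H @ ref.K
    U, _, Vt = np.linalg.svd(R_rel)
    R_rel = U @ Vt  # project onto the nearest rotation matrix

    # Compose with the reference pose (world-to-camera convention assumed).
    return ref.K, R_rel @ ref.R
```

In practice the ratio test and RANSAC threshold would be tuned, and the query intrinsics would be refined rather than copied from the reference, but the sketch captures the match-then-transfer structure described in the abstract.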