Generating synthetic meta-data for georeferenced video management

  • Authors:
  • Sakire Arslan Ay (University of Southern California, Los Angeles, CA); Seon Ho Kim (University of Southern California, Los Angeles, CA); Roger Zimmermann (National University of Singapore, Singapore)

  • Venue:
  • Proceedings of the 18th SIGSPATIAL International Conference on Advances in Geographic Information Systems
  • Year:
  • 2010


Abstract

Various sensors, such as GPS receivers and compass devices, can now be manufactured cost-effectively, which allows their deployment in conjunction with mobile video cameras. Hence, recorded clips can automatically be annotated with geospatial information, and the resulting georeferenced videos may be used in various Geographic Information System (GIS) applications. However, the research community lacks large-scale, realistic test datasets of such sensor-fused information with which to evaluate its techniques, since collecting real-world test data requires considerable time and effort. To fill this void, we propose an approach for generating synthetic video meta-data with realistic geospatial properties for mobile video management research. We highlight the essential aspects of georeferenced video meta-data and present an approach to simulate the behavioral patterns of mobile cameras in the synthetic data. The data generation process can be customized through user parameters for a variety of GIS applications that use mobile videos. We demonstrate the feasibility and applicability of the proposed approach by providing comparisons with real-world data.
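To make the notion of georeferenced video meta-data concrete, the sketch below illustrates one simple way such a synthetic stream might be generated: each sample pairs a timestamp with a GPS fix and a compass heading, and the camera follows a smoothed random-walk trajectory with a bounded turn rate. This is a minimal illustration only, not the paper's actual generator; the function name, parameters, and movement model are assumptions chosen for the example.

```python
import math
import random

def generate_camera_metadata(n_samples, start_lat, start_lon,
                             speed_mps=10.0, interval_s=1.0,
                             max_turn_deg=15.0, seed=42):
    """Hypothetical generator of synthetic georeferenced video meta-data.

    Each sample pairs a timestamp with a GPS fix (lat, lon) and a compass
    heading, simulating a mobile camera that moves at roughly constant
    speed and turns gradually (a smoothed random walk).
    """
    rng = random.Random(seed)
    lat, lon = start_lat, start_lon
    heading = rng.uniform(0.0, 360.0)    # camera viewing direction (degrees)
    meters_per_deg_lat = 111_320.0       # rough meters per degree of latitude

    samples = []
    for i in range(n_samples):
        samples.append({"t": i * interval_s,
                        "lat": round(lat, 6),
                        "lon": round(lon, 6),
                        "heading": round(heading % 360.0, 1)})
        # Advance the position along the current heading.
        step = speed_mps * interval_s
        lat += (step * math.cos(math.radians(heading))) / meters_per_deg_lat
        lon += (step * math.sin(math.radians(heading))) / (
            meters_per_deg_lat * math.cos(math.radians(lat)))
        # Turn gradually, bounded by a maximum per-sample turn rate.
        heading += rng.uniform(-max_turn_deg, max_turn_deg)
    return samples

# Example: a 5-sample track starting near downtown Los Angeles.
track = generate_camera_metadata(5, start_lat=34.0224, start_lon=-118.2851)
```

A real generator along the lines the paper describes would additionally constrain trajectories to a road network and expose user parameters (speed distributions, camera rotation behavior, sampling rate) to match the behavioral patterns of actual mobile cameras.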