Automatic camera control using unobtrusive vision and audio tracking

  • Authors:
  • Abhishek Ranjan, Rorik Henrikson, Jeremy Birnholtz, Ravin Balakrishnan, Dana Lee

  • Affiliations:
  • University of Toronto, Ontario (Ranjan, Henrikson, Balakrishnan); University of Toronto, Ontario and Cornell University, Ithaca, NY (Birnholtz); Ryerson University, Toronto, Ontario (Lee)

  • Venue:
  • Proceedings of Graphics Interface 2010

  • Year:
  • 2010


Abstract

While video can be useful for remotely attending and archiving meetings, the video itself is often dull and difficult to watch. One key reason is that, except in very high-end systems, little attention has been paid to the production quality of the video being captured. The video stream from a meeting often lacks detail, and camera shots rarely change unless a person is tasked with operating the camera. This stands in stark contrast to live television, where a professional director creates engaging video by juggling multiple cameras to provide a variety of interesting views. In this paper, we apply lessons from television production to the problem of using automated camera control and selection to improve the production quality of meeting video. Our extensible and robust system uses off-the-shelf cameras and microphones to unobtrusively track the location and activity of meeting participants, controls three cameras, and cuts between them to create video with a variety of shots and views in real time. Evaluation by users and independent coders suggests promising initial results and directions for future work.
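The abstract does not spell out the camera-selection logic, but the television-production lesson it invokes (follow the current speaker, vary shots, avoid jarringly rapid cuts) can be sketched as a simple director loop. The sketch below is illustrative only, not the paper's actual algorithm; camera names, the event format, and the minimum-shot-length heuristic are all assumptions made for the example.

```python
# Hypothetical director loop: cut to a close-up of the active speaker,
# but enforce a minimum shot length, a common TV-production heuristic
# to avoid jarring rapid cuts. Not the paper's published algorithm.

MIN_SHOT_SEC = 4.0  # assumed minimum time between cuts

def pick_camera(active_speaker, cameras):
    """Prefer the camera covering the active speaker; else the wide shot."""
    for cam in cameras:
        if cam["covers"] == active_speaker:
            return cam
    return next(c for c in cameras if c["covers"] == "wide")

def director(events, cameras):
    """events: list of (time_sec, active_speaker) from audio/vision tracking.
    Returns the resulting cut list as (time_sec, camera_name) pairs."""
    cuts = []
    current, last_cut = None, -MIN_SHOT_SEC
    for t, speaker in events:
        cam = pick_camera(speaker, cameras)
        if cam is not current and t - last_cut >= MIN_SHOT_SEC:
            cuts.append((t, cam["name"]))
            current, last_cut = cam, t
    return cuts

# Example: three cameras (two close-ups, one wide), a short speaker timeline.
cameras = [
    {"name": "cam-left", "covers": "alice"},
    {"name": "cam-right", "covers": "bob"},
    {"name": "cam-wide", "covers": "wide"},
]
events = [(0.0, "alice"), (1.5, "bob"), (6.0, "bob"), (12.0, "alice")]
print(director(events, cameras))
# → [(0.0, 'cam-left'), (6.0, 'cam-right'), (12.0, 'cam-left')]
```

Note that the cut to bob at t = 1.5 is suppressed because it would violate the minimum shot length; the system waits and cuts at t = 6.0 instead. The real system would additionally choose among shot types (close-up, over-the-shoulder, wide) rather than a fixed camera-to-person mapping.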