What Do the Sun and the Sky Tell Us About the Camera?

  • Authors:
  • Jean-François Lalonde, Srinivasa G. Narasimhan, Alexei A. Efros

  • Affiliations:
  • School of Computer Science, Carnegie Mellon University, Pittsburgh, USA 15213 (all authors)

  • Venue:
  • International Journal of Computer Vision
  • Year:
  • 2010

Abstract

As the main observed illuminant outdoors, the sky is a rich source of information about the scene. However, it is yet to be fully explored in computer vision because its appearance in an image depends on the sun position, weather conditions, photometric and geometric parameters of the camera, and the location of capture. In this paper, we analyze two sources of information available within the visible portion of the sky region: the sun position and the sky appearance. By fitting a model of the predicted sun position to an image sequence, we show how to extract camera parameters such as the focal length and the zenith and azimuth angles. Similarly, we show how we can extract the same parameters by fitting a physically-based sky model to the sky appearance. In short, the sun and the sky serve as geometric calibration targets, which can be used to annotate a large database of image sequences. We test our methods on a high-quality image sequence with known camera parameters, and obtain errors of less than 1% for the focal length, 1° for the azimuth angle, and 3° for the zenith angle. We also use our methods to calibrate 22 real, low-quality webcam sequences scattered throughout the continental US, and show deviations below 4% for the focal length, and 3° for the zenith and azimuth angles. Finally, we demonstrate that by combining the information available within the sun position and the sky appearance, we can also estimate the camera geolocation, as well as its geometric parameters. Our method achieves a mean localization error of 110 km on real, low-quality Internet webcams. The estimated viewing and illumination geometry of the scene can be useful for a variety of vision and graphics tasks such as relighting, appearance analysis and scene recovery.
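The predicted sun position that the paper fits against observations comes from standard astronomical relations: given the date, solar time, and latitude, the sun's zenith and azimuth angles follow from the solar declination and hour angle. The sketch below illustrates these textbook formulas (Cooper's declination approximation and the spherical-astronomy zenith/azimuth relations); it is a simplified illustration, not the authors' exact model, and the function name and interface are assumptions for this example.

```python
import math

def sun_position(day_of_year, solar_hour, latitude_deg):
    """Approximate solar zenith and azimuth angles, in degrees.

    Illustrative only: uses Cooper's declination approximation and the
    standard hour-angle formulas, not the paper's fitted model.
    """
    # Solar declination (Cooper's approximation, degrees)
    decl = 23.45 * math.sin(math.radians(360.0 * (284 + day_of_year) / 365.0))
    # Hour angle: 15 degrees per hour away from solar noon
    hour_angle = 15.0 * (solar_hour - 12.0)

    lat = math.radians(latitude_deg)
    d = math.radians(decl)
    h = math.radians(hour_angle)

    # Zenith angle from the spherical-astronomy relation
    cos_zen = (math.sin(lat) * math.sin(d)
               + math.cos(lat) * math.cos(d) * math.cos(h))
    zenith = math.degrees(math.acos(max(-1.0, min(1.0, cos_zen))))

    # Azimuth measured clockwise from north
    sin_zen = math.sin(math.radians(zenith))
    if sin_zen < 1e-9:  # sun at the zenith: azimuth undefined
        return zenith, 0.0
    cos_az = (math.sin(d) - math.sin(lat) * cos_zen) / (math.cos(lat) * sin_zen)
    azimuth = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:  # afternoon: sun is west of the meridian
        azimuth = 360.0 - azimuth
    return zenith, azimuth
```

For instance, at solar noon on the summer solstice (day 172) at latitude 40°N, the zenith angle reduces to roughly |latitude − declination| ≈ 16.5°, with the sun due south (azimuth ≈ 180°). Comparing such predictions against the sun's observed image location over a sequence is what lets the camera's focal length and orientation be recovered.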