Dense disparity estimation from omnidirectional images

  • Authors:
  • Zafer Arican; Pascal Frossard

  • Affiliations:
  • Ecole Polytechnique Fédérale de Lausanne (EPFL), Signal Processing Institute, Switzerland (both authors)

  • Venue:
  • AVSS '07: Proceedings of the 2007 IEEE Conference on Advanced Video and Signal Based Surveillance
  • Year:
  • 2007

Abstract

This paper addresses the problem of dense estimation of disparities between omnidirectional images in a spherical framework. Omnidirectional imaging offers important advantages for the representation and processing of the plenoptic function in 3D scenes, for applications such as localization or depth estimation. In this context, we propose to perform disparity estimation directly in a spherical framework, in order to avoid discrepancies due to inexact projections of omnidirectional images onto planes. We first rectify the omnidirectional images in the spherical domain. We then develop a global energy minimization algorithm based on graph cuts to perform disparity estimation on the sphere. Experimental results show that the proposed algorithm outperforms typical methods such as those based on block matching, for both a simple synthetic scene and complex natural scenes. The proposed method shows promising performance for dense disparity estimation and can be efficiently extended to networks of several camera sensors.
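
For illustration, dense disparity estimation via global energy minimization with graph cuts is usually posed as a labeling problem of the following form; the notation below (spherical pixel set Ω_S, data term D_p, smoothness term V, weight λ) is standard MRF notation assumed for this sketch and is not taken from the abstract itself:

$$
E(d) \;=\; \sum_{p \in \Omega_S} D_p(d_p) \;+\; \lambda \sum_{(p,q) \in \mathcal{N}} V(d_p, d_q)
$$

Here $d$ assigns a disparity label $d_p$ to each pixel $p$ on the spherical grid $\Omega_S$, $D_p(d_p)$ measures photo-consistency between the rectified spherical images along the corresponding epipolar great circles, and $V(d_p, d_q)$ penalizes disparity differences between neighboring pixels $(p,q) \in \mathcal{N}$. An approximate minimizer of such an energy is typically obtained with graph-cut moves such as alpha-expansion.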