Epiflow Based Stereo Fusion

  • Authors:
  • Hongsheng Zhang; Shahriar Negahdaripour

  • Affiliations:
  • Mako Surgical Corp., Ft. Lauderdale, FL 33317, USA; University of Miami, Coral Gables, FL 33124, USA

  • Venue:
  • IbPRIA '07: Proceedings of the 3rd Iberian Conference on Pattern Recognition and Image Analysis, Part I
  • Year:
  • 2007

Abstract

3-D reconstruction from image sequences has long been a central topic in computer vision. Real-time applications call for causal processing of stereo sequences as they are acquired, covering different regions of the scene. The first step is to compute the current stereo disparity, and recursive map building often requires fusing it with the previous estimate. In this paper, the epiflow framework [1], originally proposed for establishing matches among stereo feature pairs, is generalized to devise an iterative causal algorithm for stereo disparity map fusion. In the context of disparity fusion, the quadruplet correspondence of the epiflow tracking algorithm becomes reminiscent of the "closest point" of the 3-D ICP algorithm. Unlike ICP, the 2-D epiflow framework permits incorporating both photometric and geometric constraints, estimating the stereo rig motion as supplementary information, and identifying local inconsistencies between the two disparity maps. Experiments with real data validate the proposed approach and demonstrate improved convergence compared to the ICP algorithm.
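
The abstract only outlines the method, but its ICP-like flavour can be conveyed with a toy sketch: align the previous disparity map to the current one using a combined photometric and geometric cost, flag locally inconsistent pixels, and fuse the rest. The Python below is a minimal illustration under strong simplifying assumptions, not the paper's algorithm: a brute-force integer 2-D shift search stands in for epiflow quadruplet tracking, and all names and parameters (fuse_disparities, photo_w, geo_thresh, etc.) are hypothetical.

```python
import numpy as np

def shift_map(m, dx, dy):
    """Shift a 2-D map by an integer (dx, dy); np.roll wraps around,
    which is adequate for this toy example."""
    return np.roll(np.roll(m, dy, axis=0), dx, axis=1)

def fuse_disparities(disp_prev, disp_curr, img_prev, img_curr,
                     search=3, photo_w=0.1, geo_thresh=1.0):
    """Toy ICP-flavoured fusion of two disparity maps (hypothetical sketch).

    disp_prev, disp_curr : HxW disparity maps from consecutive stereo frames
    img_prev, img_curr   : HxW left-image intensities (photometric cue)
    Returns the fused map, a mask of locally inconsistent pixels,
    and the estimated 2-D image-plane shift.
    """
    best = (0, 0, np.inf)
    # Brute-force search over small shifts, combining geometric (disparity)
    # and photometric residuals; stands in for epiflow quadruplet matching.
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            geo = (disp_curr - shift_map(disp_prev, dx, dy)) ** 2
            photo = (img_curr - shift_map(img_prev, dx, dy)) ** 2
            cost = np.mean(geo + photo_w * photo)
            if cost < best[2]:
                best = (dx, dy, cost)
    dx, dy, _ = best
    d_warp = shift_map(disp_prev, dx, dy)
    # Pixels where the two maps still disagree strongly are flagged as
    # locally inconsistent and excluded from averaging.
    inconsistent = np.abs(disp_curr - d_warp) > geo_thresh
    fused = np.where(inconsistent, disp_curr, 0.5 * (disp_curr + d_warp))
    return fused, inconsistent, (dx, dy)

# Example usage with synthetic data:
# H, W = 64, 64
# disp_prev = np.random.rand(H, W); img_prev = np.random.rand(H, W)
# disp_curr = shift_map(disp_prev, 1, 0); img_curr = shift_map(img_prev, 1, 0)
# fused, bad, shift = fuse_disparities(disp_prev, disp_curr, img_prev, img_curr)
```

Unlike this sketch, the paper's epiflow formulation works with epipolar geometry and iterative refinement, and estimates the stereo rig motion as a by-product; the code only mirrors the high-level idea of coupling photometric and geometric cues while rejecting local inconsistencies.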