Robust feature point matching by preserving local geometric consistency

  • Authors:
  • Ouk Choi; In So Kweon

  • Affiliations:
  • Korea Advanced Institute of Science and Technology, 335 Gwahangno, Yuseong-gu, Daejeon 305-701, Republic of Korea (both authors)

  • Venue:
  • Computer Vision and Image Understanding
  • Year:
  • 2009

Abstract

We present a method for matching feature points robustly across widely separated images. In general, it is difficult to match feature points correctly using only the similarity between local descriptors. In our approach, the correspondence problem is formulated as an optimization problem with one-to-one correspondence constraints. A novel objective function is defined to preserve local image-to-image affine transformations across correspondences. This objective function enables our method to cope with significant viewpoint or scale changes between images, unlike previous methods that rely on the assumption that the distances or orientations between neighboring feature points are preserved across images. A relaxation algorithm that imposes one-to-one correspondence constraints is proposed for maximizing the objective function, in contrast to conventional relaxation labeling algorithms, which impose many-to-one correspondence constraints. Experimental evaluation shows that our method is robust to significant viewpoint changes, scale changes, and nonrigid deformations between images, even in the presence of repeated textures that make feature point matching more ambiguous. The method is also applied to object recognition in cluttered environments, with promising results.
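The abstract does not give the exact form of the objective function or of the relaxation update, but the general idea can be sketched. Below is a minimal, assumption-laden Python sketch: each candidate match (i, a) carries a local affine transform (e.g., estimated from affine-covariant regions), neighboring matches reinforce it according to how well that transform predicts their geometry, and a one-to-one assignment is then extracted. The Gaussian consistency kernel, its 5-pixel scale, the update rule, and the greedy assignment step are illustrative choices, not the authors' method.

```python
# Hypothetical sketch of affine-consistency-based matching -- not the authors' code.
import numpy as np


def affine_consistency(A, p1, p2, q1, q2, sigma=5.0):
    """Score how well the local 2x2 affine map A, estimated at match p1 <-> q1,
    predicts the neighbouring match p2 <-> q2. Returns a value in (0, 1]."""
    predicted_offset = A @ (p2 - p1)           # map the image-1 offset into image 2
    error = np.linalg.norm(predicted_offset - (q2 - q1))
    return np.exp(-(error / sigma) ** 2)       # Gaussian penalty on the residual (assumed)


def relax_one_to_one(points1, points2, affines, similarity, n_iters=10):
    """Reinforce matches that are supported by geometrically consistent neighbours,
    then extract a one-to-one assignment (a greedy stand-in for the paper's
    constrained relaxation)."""
    n, m = similarity.shape
    confidence = similarity.astype(float).copy()
    for _ in range(n_iters):
        support = np.zeros_like(confidence)
        for i in range(n):
            for a in range(m):
                s = 0.0
                for j in range(n):                  # support from all other candidate matches
                    if j == i:
                        continue
                    for b in range(m):
                        s += confidence[j, b] * affine_consistency(
                            affines[i][a], points1[i], points1[j],
                            points2[a], points2[b])
                support[i, a] = s
        confidence *= support                       # relaxation-style update
        confidence /= confidence.sum(axis=1, keepdims=True) + 1e-12
    # Greedy one-to-one selection in decreasing order of confidence.
    matches, used1, used2 = [], set(), set()
    for i, a in sorted(np.ndindex(n, m), key=lambda ia: -confidence[ia]):
        if i not in used1 and a not in used2:
            matches.append((i, a))
            used1.add(i)
            used2.add(a)
    return matches
```

In the paper itself, the one-to-one constraint is enforced within the relaxation rather than by a final greedy pass; the sketch is only meant to convey why affine consistency, unlike fixed distance or orientation assumptions, remains meaningful under large viewpoint and scale changes.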