AnySURF: flexible local features computation

  • Authors:
  • Eran Sadeh-Or; Gal A. Kaminka

  • Affiliations:
  • Computer Science Department, Bar Ilan University, Israel (both authors)

  • Venue:
  • Robot Soccer World Cup XV
  • Year:
  • 2012

Abstract

Many vision-based tasks in autonomous robotics rely on feature matching algorithms, which find point correspondences between two images. Unfortunately, existing algorithms for such tasks require significant computational resources and are designed under the assumption that they will run to completion and only then return a complete result. Since partial results (a subset of all features in the image) are often sufficient, we propose in this paper a computationally flexible algorithm whose results monotonically increase in quality given additional computation time. The proposed algorithm, coined AnySURF (Anytime SURF), is based on the SURF scale- and rotation-invariant interest point detector and descriptor. We achieve flexibility by redesigning several major steps, chiefly the feature search process, so that results of increasing quality can be accumulated. We contrast different design choices for AnySURF and evaluate its use in a series of experiments. The results are promising and show the potential for dynamic anytime performance that is robust to the available computation time.
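
To make the anytime behavior described above concrete, the sketch below (Python with OpenCV) shows one way a feature detector can return monotonically growing partial results under a time budget. It is not the authors' algorithm: AnySURF redesigns SURF's internal feature search, whereas this sketch simply partitions the image into grid cells, detects features in one cell at a time, and stops when the deadline passes. The function name anytime_features, the grid partitioning, and the use of ORB as a stand-in for SURF are all illustrative assumptions.

```python
# Illustrative sketch only (assumed names and structure): an anytime wrapper
# that accumulates local features cell by cell until a time budget expires.
# AnySURF itself reorders SURF's internal scale-space search; here OpenCV's
# ORB detector stands in for SURF to avoid the opencv-contrib dependency.
import time

import cv2
import numpy as np


def anytime_features(image, deadline_s, grid=4, detector=None):
    """Detect features one grid cell at a time; stop when deadline_s expires."""
    if detector is None:
        detector = cv2.ORB_create()
    h, w = image.shape[:2]
    keypoints, descriptor_blocks = [], []
    start = time.monotonic()
    cells = [(gy, gx) for gy in range(grid) for gx in range(grid)]
    for gy, gx in cells:
        if time.monotonic() - start > deadline_s:
            break  # budget exhausted: return whatever has accumulated so far
        # Restrict detection to one cell via a mask; keypoint coordinates
        # stay in full-image coordinates. (Re-running detection per cell is a
        # simplification, not how AnySURF reorders the search internally.)
        mask = np.zeros((h, w), dtype=np.uint8)
        mask[gy * h // grid:(gy + 1) * h // grid,
             gx * w // grid:(gx + 1) * w // grid] = 255
        kps, desc = detector.detectAndCompute(image, mask)
        if desc is not None:
            keypoints.extend(kps)
            descriptor_blocks.append(desc)
    descriptors = np.vstack(descriptor_blocks) if descriptor_blocks else None
    return keypoints, descriptors


if __name__ == "__main__":
    # "frame.png" is a hypothetical grayscale input image.
    img = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
    kps, desc = anytime_features(img, deadline_s=0.05)  # 50 ms budget
    print(len(kps), "features found within the budget")
```

The key property this sketch shares with the paper's goal is that interrupting the loop earlier yields a smaller but still usable feature set, so downstream matching can proceed with whatever has been computed.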