Line Fitting in a Noisy Image

  • Authors: I. Weiss
  • Affiliation: Univ. of Maryland, College Park
  • Venue: IEEE Transactions on Pattern Analysis and Machine Intelligence
  • Year: 1989


Abstract

The conventional least-squared-distance method of fitting a line to a set of data points is unreliable when the amount of random noise in the input (such as an image) is significant compared with the amount of data correlated to the line itself. Points that lie far from the line (outliers) are usually just noise, yet they contribute the most to the distance average, skewing the fitted line away from its correct position. The author presents a statistical method of separating the data of interest from random noise, using a maximum-likelihood principle.
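The outlier-skew problem the abstract describes is easy to demonstrate. The sketch below is not the paper's maximum-likelihood method; it is an illustrative comparison, using NumPy, of a plain least-squares fit against a simple robust alternative (iteratively reweighted least squares with Cauchy weights) on data containing deliberate outliers. All names and parameter choices here are the example's own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Inliers on the line y = 2x + 1 with small Gaussian noise,
# plus outliers scattered far above the line (simulated "random noise").
x_in = np.linspace(0.0, 10.0, 50)
y_in = 2.0 * x_in + 1.0 + rng.normal(0.0, 0.1, x_in.size)
x_out = rng.uniform(0.0, 10.0, 15)
y_out = rng.uniform(30.0, 60.0, 15)
x = np.concatenate([x_in, x_out])
y = np.concatenate([y_in, y_out])

def fit_line(x, y, w=None):
    """Weighted least-squares line fit; returns (slope, intercept).

    np.polyfit's w multiplies residuals directly, so pass sqrt(w)
    to weight each squared residual by w.
    """
    if w is None:
        w = np.ones_like(x)
    return np.polyfit(x, y, 1, w=np.sqrt(w))

# Plain least squares: the outliers dominate the distance average
# and drag the fitted line away from the inliers.
m_ls, b_ls = fit_line(x, y)

# Iteratively reweighted least squares with Cauchy weights:
# large residuals are progressively downweighted.
m, b = m_ls, b_ls
for _ in range(20):
    r = y - (m * x + b)
    scale = np.median(np.abs(r)) / 0.6745 + 1e-12  # robust scale estimate
    w = 1.0 / (1.0 + (r / (2.385 * scale)) ** 2)   # Cauchy weight function
    m, b = fit_line(x, y, w)

print(f"plain LS: slope={m_ls:.2f}, intercept={b_ls:.2f}")
print(f"robust  : slope={m:.2f}, intercept={b:.2f}")
```

Running this shows the plain fit skewed toward the outlier cluster, while the reweighted fit recovers slope and intercept close to the true line (2, 1).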