Variable kernel density estimation based robust regression and its applications

  • Authors:
  • Zhen Zhang; Yanning Zhang

  • Venue:
  • Neurocomputing
  • Year:
  • 2014

Abstract

Robust estimation with a high breakdown point is an important and fundamental topic in computer vision, machine learning and many other areas. Traditional robust estimators with a breakdown point above 50%, for example Random Sample Consensus (RANSAC) and its derivatives, need a user-specified scale of inliers so that inliers can be distinguished from outliers; in many applications, however, there is no a priori knowledge of the inlier scale, so an empirical value is usually specified. In recent years, a group of Kernel Density Estimation (KDE) based robust estimators has been proposed to solve this problem. However, since the most important parameter of KDE, the bandwidth, is highly correlated with the scale of inliers, these methods effectively reduce to estimating the inlier scale, which is itself not an easy task. The authors therefore build a robust estimator based on Variable Kernel Density Estimation (VKDE). Unlike KDE, VKDE estimates the bandwidth from local information of the samples using the K-Nearest-Neighbor method instead of deriving it from the scale of inliers, so estimating the inlier scale can be omitted. Furthermore, because a variable-bandwidth technique is applied, the proposed method uses smaller bandwidths in areas where samples are more densely distributed. As inliers are much more densely distributed than outliers, the proposed method achieves higher resolution for inliers, and the peak of the estimated density lies closer to the point around which samples are most densely concentrated. Finally, the proposed method is compared with two of the most widely used robust estimators, Random Sample Consensus and Least Median of Squares; the results show that it achieves higher precision than both.
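
As a concrete illustration of the variable-bandwidth idea described in the abstract, the sketch below builds a one-dimensional VKDE over model residuals, giving each sample a bandwidth equal to its distance to its k-th nearest neighbour, so the density is sharper where samples (inliers) cluster. This is a minimal sketch under assumed choices (Gaussian kernel, k = 10, NumPy), not the authors' exact formulation; the function name vkde_density and the synthetic residuals are hypothetical.

```python
import numpy as np

def vkde_density(samples, grid, k=10):
    """Variable-bandwidth KDE sketch: each sample's bandwidth is its
    distance to its k-th nearest neighbour (an assumed rule; the paper's
    exact bandwidth choice may differ)."""
    samples = np.asarray(samples, dtype=float)
    n = samples.size
    # Pairwise 1-D distances; after sorting, column 0 is the sample itself.
    d = np.abs(samples[:, None] - samples[None, :])
    d.sort(axis=1)
    h = d[:, min(k, n - 1)]          # k-th nearest-neighbour distance
    h = np.maximum(h, 1e-12)         # guard against zero bandwidths
    # Gaussian kernels with per-sample bandwidths, averaged over samples.
    z = (grid[:, None] - samples[None, :]) / h[None, :]
    return np.mean(np.exp(-0.5 * z**2) / (np.sqrt(2 * np.pi) * h[None, :]),
                   axis=1)

# Usage: residuals of some model hypothesis. Densely clustered inliers near 0
# pull the density peak toward the true fit without a user-specified scale.
rng = np.random.default_rng(0)
residuals = np.concatenate([rng.normal(0.0, 0.05, 80),   # inliers
                            rng.uniform(-5.0, 5.0, 40)]) # outliers
grid = np.linspace(-6.0, 6.0, 1201)
dens = vkde_density(residuals, grid, k=10)
print("density peak at residual ~", grid[np.argmax(dens)])
```

Because the bandwidth shrinks where samples are dense, the estimated density attains its peak near the inlier cluster even with many gross outliers, which is the behaviour the abstract attributes to the VKDE-based estimator.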