Robust estimation with a high breakdown point is a fundamental topic in computer vision, machine learning, and many other areas. Traditional robust estimators with breakdown points above 50%, such as Random Sample Consensus (RANSAC) and its derivatives, require a user-specified inlier scale so that inliers can be distinguished from outliers; in many applications, however, no prior knowledge of the inlier scale is available, so an empirical value is usually chosen. In recent years, a family of robust estimators based on Kernel Density Estimation (KDE) has been proposed to address this problem. However, because the bandwidth, the most important parameter in KDE, is highly correlated with the inlier scale, these methods effectively reduce to inlier-scale estimators, and estimating the inlier scale is itself a difficult task. The authors therefore build a robust estimator on Variable Kernel Density Estimation (VKDE). Unlike KDE, VKDE derives the bandwidth from local sample information via the K-Nearest-Neighbor method rather than from the inlier scale, so the estimation of the inlier scale can be omitted. Furthermore, because a variable bandwidth is applied, the proposed method uses smaller bandwidths in regions where samples are densely distributed. Since inliers are much more densely distributed than outliers, the method achieves a higher resolution for inliers, and the peak of the estimated density lies closer to the point around which samples are most densely concentrated. Finally, the proposed method is compared with two widely used robust estimators, Random Sample Consensus and Least Median of Squares, and the results show that it attains higher precision than both.
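To make the VKDE idea concrete, the following is a minimal one-dimensional sketch, not the authors' implementation: each sample receives its own bandwidth, taken here as the distance to its k-th nearest neighbor (an illustrative rule; the paper's exact bandwidth formula may differ), so densely clustered inliers get small bandwidths and the density peak sharpens near them.

```python
import numpy as np

def vkde(samples, query, k=5):
    """Variable-bandwidth KDE with Gaussian kernels (1-D sketch).

    Each sample's bandwidth is its distance to the k-th nearest
    neighbor, so dense regions (inliers) use small bandwidths and
    sparse regions (outliers) use large ones.
    """
    samples = np.asarray(samples, dtype=float)
    # Pairwise distances; after sorting, column 0 is the self-distance 0,
    # so column k is the distance to the k-th nearest neighbor.
    d = np.abs(samples[:, None] - samples[None, :])
    d.sort(axis=1)
    h = d[:, k]
    # Average of Gaussian kernels, one bandwidth per sample.
    u = (query - samples) / h
    return np.mean(np.exp(-0.5 * u**2) / (h * np.sqrt(2.0 * np.pi)))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Dense inliers near 0, sparse uniform outliers.
    data = np.concatenate([rng.normal(0.0, 0.1, 50),
                           rng.uniform(-10.0, 10.0, 20)])
    print(vkde(data, 0.0), vkde(data, 8.0))
```

With this adaptive bandwidth, the estimated density at the inlier cluster dominates the density in outlier regions, so taking the mode of the VKDE as the estimate requires no user-specified inlier scale.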