The pattern recognition and computer vision communities often employ robust methods for model fitting. In particular, high breakdown-point methods such as least median of squares (LMedS) and least trimmed squares (LTS) have often been used when the data are contaminated with outliers. However, although the breakdown point of these methods can be as high as 50% (i.e., they can tolerate up to 50% contamination), they can break down at unexpectedly low contamination levels when the outliers are clustered. In this paper, we demonstrate the fragility of LMedS and LTS and analyze why these methods fail when the data contain a large percentage of clustered outliers. We adapt the concept of "symmetry distance" to formulate an improved regression method, called the least trimmed symmetry distance (LTSD). Experimental results show that the LTSD outperforms LMedS and LTS when the data contain a large percentage of clustered outliers and the inliers have large variance.
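As a rough illustration of the estimators the abstract discusses, the following is a minimal sketch of LMedS line fitting: candidate lines are hypothesized from random point pairs, and the candidate minimizing the median squared residual is kept. The function name and sampling scheme are illustrative assumptions, not the paper's algorithm; note that with a sufficiently large cluster of outliers, the median criterion itself can be pulled toward the cluster, which is the fragility the paper targets.

```python
import random

def lmeds_line(points, n_trials=500, seed=0):
    """Least-median-of-squares fit of a line y = a*x + b (illustrative sketch).

    Repeatedly hypothesizes a line through two randomly sampled points and
    keeps the hypothesis whose median squared residual over all points is
    smallest.
    """
    rng = random.Random(seed)
    best, best_med = None, float("inf")
    for _ in range(n_trials):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:  # vertical pair; skip this hypothesis
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        residuals = sorted((y - (a * x + b)) ** 2 for x, y in points)
        med = residuals[len(residuals) // 2]  # median squared residual
        if med < best_med:
            best_med, best = med, (a, b)
    return best

# Inliers on y = 2x + 1, plus a small cluster of outliers. With few
# clustered outliers LMedS recovers the line; as the cluster grows toward
# 50% of the data, the median can be captured by the cluster instead.
pts = [(float(x), 2.0 * x + 1.0) for x in range(20)]
pts += [(5.0, 40.0), (5.2, 41.0), (5.1, 40.5)]  # clustered outliers
a, b = lmeds_line(pts)
```

Here 20 of the 23 points lie exactly on the true line, so any hypothesis through two inliers attains a median squared residual of zero and the fit recovers a = 2, b = 1.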