A smoothing principle for M-estimators is proposed. The smoothing depends on the sample size, so that the resulting smoothed M-estimator coincides with the initial M-estimator as n → ∞. The principle is motivated by an analysis of the requirements in the proof of the Cramér-Rao bound, and it can be applied to every M-estimator. In a simulation study, smoothed Huber, ML, and Bisquare M-estimators are compared with their non-smoothed counterparts and with Pitman estimators on data generated from several distributions, with and without estimated scale. The results are encouraging for the smoothed estimators, and particularly for the smoothed Huber estimator, which improves on the initial M-estimators especially in the tail areas of the distributions of the estimators. The results are backed up by small-sample asymptotics.
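To make the idea concrete, here is a minimal sketch of a smoothed Huber location M-estimator. It is an illustration under stated assumptions, not the paper's exact construction: the smoothing is realized by convolving the Huber ψ-function with a normal kernel (which has a closed form in terms of the normal CDF and density), and the bandwidth choice h_n = c/√n is a hypothetical rule chosen only so that the smoothing vanishes and the ordinary Huber estimator is recovered as n → ∞.

```python
import math

def _phi(z):
    """Standard normal density."""
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def _Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def smoothed_huber_psi(x, k=1.345, h=0.5):
    """Huber psi convolved with a N(0, h^2) kernel (closed form).

    As h -> 0 this reduces to the ordinary Huber psi
    psi(x) = max(-k, min(k, x)), so a vanishing bandwidth recovers
    the initial M-estimator.
    """
    if h <= 0.0:
        return max(-k, min(k, x))
    a = (-k - x) / h
    b = (k - x) / h
    return (-k * _Phi(a) + k * (1.0 - _Phi(b))
            + x * (_Phi(b) - _Phi(a)) + h * (_phi(a) - _phi(b)))

def smoothed_huber_location(data, k=1.345, c=1.0, tol=1e-10):
    """Location estimate solving sum_i psi_h(x_i - mu) = 0.

    Assumed bandwidth rule h_n = c / sqrt(n) (illustrative only).
    The score is monotone decreasing in mu, so bisection between
    the sample minimum and maximum finds the unique root.
    """
    n = len(data)
    h = c / math.sqrt(n)
    lo, hi = min(data), max(data)

    def score(mu):
        return sum(smoothed_huber_psi(x - mu, k, h) for x in data)

    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if score(mid) > 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

For example, `smoothed_huber_location([1.0, 2.0, 3.0, 100.0])` stays near the bulk of the data rather than being dragged toward the outlier, which is the behavior the simulation study probes in the tail areas of the estimators' distributions.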