We propose a method for nonparametric density estimation that exhibits robustness to contamination of the training sample. This method achieves robustness by combining a traditional kernel density estimator (KDE) with ideas from classical M-estimation. We interpret the KDE based on a positive semi-definite kernel as a sample mean in the associated reproducing kernel Hilbert space. Since the sample mean is sensitive to outliers, we estimate it robustly via M-estimation, yielding a robust kernel density estimator (RKDE). An RKDE can be computed efficiently via a kernelized iteratively re-weighted least squares (IRWLS) algorithm. Necessary and sufficient conditions are given for kernelized IRWLS to converge to the global minimizer of the M-estimator objective function. The robustness of the RKDE is demonstrated with a representer theorem, the influence function, and experimental results for density estimation and anomaly detection.
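The kernelized IRWLS procedure described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a Gaussian kernel, uses the Huber loss, and (as a heuristic assumption) sets the Huber threshold to the median of the current RKHS distances when none is supplied. The RKDE is represented as f = Σᵢ wᵢ k(·, xᵢ), and the squared RKHS distance of each sample to f is computed via the kernel trick.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def rkde_weights(X, sigma=1.0, a=None, n_iter=100, tol=1e-8):
    """Sketch of kernelized IRWLS for a robust KDE (Huber loss).

    Returns weights w with w >= 0 and sum(w) == 1, so the RKDE is
    f(x) = sum_i w[i] * k(x, x_i) (up to the usual KDE normalization).
    The median-based choice of the Huber threshold `a` is an assumption,
    not part of the original method.
    """
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    w = np.full(n, 1.0 / n)  # initialize at the ordinary KDE (RKHS sample mean)
    for _ in range(n_iter):
        # Squared RKHS distances d_i^2 = ||Phi(x_i) - f||^2 via the kernel trick:
        # k(x_i, x_i) - 2 sum_j w_j k(x_i, x_j) + sum_{j,l} w_j w_l k(x_j, x_l)
        d2 = np.diag(K) - 2.0 * (K @ w) + w @ K @ w
        d = np.sqrt(np.maximum(d2, 1e-12))
        a_t = np.median(d) if a is None else a   # heuristic threshold (assumption)
        psi = np.minimum(d, a_t)                 # Huber psi(d) = min(d, a)
        w_new = (psi / d) / np.sum(psi / d)      # re-weight and normalize
        if np.max(np.abs(w_new - w)) < tol:
            w = w_new
            break
        w = w_new
    return w
```

On clean data the weights stay close to uniform, recovering the standard KDE; a point far from the bulk of the sample receives a large RKHS distance, is clipped by the Huber ψ, and so is down-weighted, which is the source of the estimator's robustness to contamination.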