This paper is concerned with density estimation based on stagewise minimization of the U-divergence. The U-divergence is a general divergence measure defined through a convex function U, and it includes the Kullback-Leibler divergence and the L2 norm as special cases. The algorithm that yields the density estimator is closely related to boosting, and the usual kernel density estimator is shown to arise as a special case of the proposed estimator. Non-asymptotic error bounds for the proposed estimators are derived, and numerical experiments show that they often outperform several existing methods for density estimation.
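To make the stagewise idea concrete, the following is a minimal sketch (not the authors' algorithm) of greedy stagewise density estimation in the Kullback-Leibler case of the U-divergence, where minimizing the empirical divergence amounts to minimizing the negative log-likelihood. Gaussian kernels centered at data points, the bandwidth `h`, and the fixed mixing step `step` are all illustrative assumptions; at each stage the current estimate is mixed with whichever candidate kernel most reduces the empirical risk.

```python
import numpy as np

def gauss(x, mu, h):
    # Gaussian kernel with center mu and bandwidth h (illustrative choice)
    return np.exp(-0.5 * ((x - mu) / h) ** 2) / (h * np.sqrt(2.0 * np.pi))

def stagewise_density(data, n_stages=10, h=0.5, step=0.3):
    """Greedy stagewise mixture fit: at each stage, mix the current
    density with the data-centered kernel that most reduces the
    empirical negative log-likelihood (the KL case of a U-divergence).
    Returns the fitted density evaluated at the sample points."""
    # Start from the kernel centered at the first observation.
    f = gauss(data, data[0], h)
    for _ in range(n_stages):
        best_nll, best_g = np.inf, None
        for mu in data:
            g = gauss(data, mu, h)
            cand = (1.0 - step) * f + step * g  # convex mixture stays a density
            nll = -np.mean(np.log(cand))        # empirical risk under KL
            if nll < best_nll:
                best_nll, best_g = nll, g
        f = (1.0 - step) * f + step * best_g
    return f

# Usage: fit on a small synthetic sample.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, 200)
fx = stagewise_density(x)
print(fx.shape)
```

Because every update is a convex combination of normalized kernels, the estimate remains a proper density throughout; a fixed step size is used here for simplicity, whereas a stagewise procedure would typically also optimize the mixing weight at each stage.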