A course in density estimation.
Hierarchical mixtures of experts and the EM algorithm. Neural Computation.
The nature of statistical learning theory.
Unsupervised learning of finite mixture models. IEEE Transactions on Pattern Analysis and Machine Intelligence.
Estimation of Dependences Based on Empirical Data. Springer Series in Statistics.
A note on density model size testing. IEEE Transactions on Information Theory.
Radial basis function networks and complexity regularization in function learning. IEEE Transactions on Neural Networks.
Density estimation with stagewise optimization of the empirical risk. Machine Learning.
Parametric estimation and tests through divergences and the duality technique. Journal of Multivariate Analysis.
Let f be an unknown multivariate density belonging to a prespecified parametric class of densities, F_k, where k is unknown, but F_k ⊂ F_{k+1} for all k and each F_k has finite Vapnik-Chervonenkis dimension. Given an i.i.d. sample of size n drawn from f, we show that it is possible to select automatically, and without extra restrictions on f, an estimate f_{n,k} with the property that E{∫|f_{n,k} − f|} = O(1/√n). Our method is inspired by the combinatorial tools developed in Devroye and Lugosi (Combinatorial Methods in Density Estimation, Springer, New York, 2001) and it includes a wide range of density models, such as mixture models or exponential families.
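The core of the combinatorial approach in Devroye and Lugosi (2001) is minimum-distance selection over the Yatracos class: for candidate densities f_i, f_j, form the sets A_ij = {x : f_i(x) > f_j(x)} and keep the candidate whose mass on these sets deviates least from the empirical measure. The sketch below illustrates this idea in one dimension; it is not the paper's procedure, and the Gaussian candidate set, grid resolution, and helper names (normal_pdf, select_density) are illustrative assumptions.

```python
import math
import random


def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2) at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))


def select_density(sample, candidates, grid):
    """Minimum-distance selection over the Yatracos class (1-D sketch).

    candidates: list of callables f_i(x) (densities).
    grid: evenly spaced points used for numerical integration.
    For each ordered pair (i, j), the Yatracos set is
    A_ij = {x : f_i(x) > f_j(x)}.  The selected candidate minimizes
    max_{i != j} |integral of f_k over A_ij - empirical measure of A_ij|.
    Returns (index of selected candidate, its worst-case discrepancy).
    """
    dx = grid[1] - grid[0]
    n = len(sample)
    m = len(candidates)
    # Candidate values on the grid, for numerical integration.
    vals = [[f(x) for x in grid] for f in candidates]

    def empirical_measure(i, j):
        # Fraction of sample points falling in A_ij = {f_i > f_j}.
        return sum(1 for x in sample if candidates[i](x) > candidates[j](x)) / n

    best_idx, best_score = None, float("inf")
    for k in range(m):
        score = 0.0
        for i in range(m):
            for j in range(m):
                if i == j:
                    continue
                # Mass assigned by candidate k to the Yatracos set A_ij
                # (Riemann sum over the grid points where f_i > f_j).
                mass = sum(vals[k][g] * dx
                           for g in range(len(grid))
                           if vals[i][g] > vals[j][g])
                score = max(score, abs(mass - empirical_measure(i, j)))
        if score < best_score:
            best_idx, best_score = k, score
    return best_idx, best_score
```

With a sample drawn from N(0, 1) and candidates N(0, 1), N(2, 1), N(0, 9), the true density's worst-case discrepancy stays near the O(1/√n) fluctuation level, while the wrong candidates incur large discrepancies on the sets where they disagree with the truth, so the first candidate is selected.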