Quantization and clustering with Bregman divergences
Journal of Multivariate Analysis
Theory and Use of the EM Algorithm
Foundations and Trends in Signal Processing
Divergence-based vector quantization
Neural Computation
Tighter PAC-Bayes bounds through distribution-dependent priors
Theoretical Computer Science
Bayesian model diagnostics using functional Bregman divergence
Journal of Multivariate Analysis
A class of distortions termed functional Bregman divergences is defined, which includes squared error and relative entropy. A functional Bregman divergence acts on functions or distributions, and generalizes the standard Bregman divergence for vectors as well as a previous pointwise Bregman divergence defined for functions. A recent result showed that the mean minimizes the expected Bregman divergence. The new functional definition enables the extension of this result to the continuous case, showing that the mean minimizes the expected functional Bregman divergence over a set of functions or distributions. It is shown how this theorem applies to the Bayesian estimation of distributions. Estimation of the uniform distribution from independent and identically distributed samples is presented as a case study.
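The mean-minimizer result cited in the abstract can be checked numerically in the finite-dimensional case. The sketch below (an illustration assuming the standard vector Bregman divergence D_phi(x, y) = phi(x) - phi(y) - phi'(y)(x - y), not code from the paper) uses phi(x) = x^2, for which the divergence reduces to squared error, and confirms by grid search that the sample mean minimizes the average divergence to the data:

```python
import numpy as np

def bregman(x, y, phi, dphi):
    # Standard (scalar) Bregman divergence:
    # D_phi(x, y) = phi(x) - phi(y) - phi'(y) * (x - y)
    return phi(x) - phi(y) - dphi(y) * (x - y)

# phi(x) = x^2 makes D_phi the squared error (x - y)^2.
phi = lambda x: x ** 2
dphi = lambda x: 2 * x

rng = np.random.default_rng(0)
samples = rng.uniform(0.0, 5.0, size=1000)

def expected_div(y):
    # Empirical expectation of D_phi(sample, y) over the data.
    return bregman(samples, y, phi, dphi).mean()

# Grid search over candidate centers y.
candidates = np.linspace(samples.min(), samples.max(), 2001)
best = candidates[np.argmin([expected_div(y) for y in candidates])]

# The minimizer agrees with the sample mean, up to grid resolution,
# as the theorem predicts.
print(best, samples.mean())
```

The same check passes for any strictly convex phi (e.g. the negative entropy phi(x) = x log x, giving generalized KL divergence), which is the content of the theorem; the paper extends this from vectors to functions and distributions.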