This paper deals with the problem of quantizing a random variable X taking values in a separable and reflexive Banach space, and with the related question of clustering independent random observations distributed as X. To this end, we consider quantization schemes whose distortion is measured by a Bregman divergence, and we provide conditions ensuring the existence of an optimal quantizer and of an empirically optimal quantizer. Rates of convergence are also discussed.
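As a concrete illustration of the quantization scheme the abstract describes, the following is a minimal sketch of Lloyd-type clustering under a Bregman divergence, here the generalized Kullback-Leibler divergence on positive vectors (the function names `bregman_kl` and `bregman_lloyd`, the number of iterations, and the finite-dimensional setting are all illustrative assumptions, not the paper's construction). It exploits the fact that, for any Bregman divergence, the cluster mean minimizes the expected within-cluster distortion.

```python
import math

def bregman_kl(p, q):
    # Generalized Kullback-Leibler divergence on positive vectors: the
    # Bregman divergence generated by phi(x) = sum_i x_i log x_i.
    return sum(pi * math.log(pi / qi) - pi + qi for pi, qi in zip(p, q))

def bregman_lloyd(points, centers, n_iter=20):
    # Lloyd-type alternation (an illustrative sketch, not the paper's
    # algorithm): assign each point to the center with the smallest
    # Bregman divergence, then replace each center by the mean of its
    # cluster, which minimizes the expected Bregman distortion.
    centers = [list(c) for c in centers]
    for _ in range(n_iter):
        clusters = [[] for _ in centers]
        for x in points:
            j = min(range(len(centers)),
                    key=lambda j: bregman_kl(x, centers[j]))
            clusters[j].append(x)
        for j, cl in enumerate(clusters):
            if cl:  # keep an empty cluster's center unchanged
                centers[j] = [sum(coord) / len(cl) for coord in zip(*cl)]
    return centers
```

Swapping `bregman_kl` for the squared Euclidean distance recovers the classical k-means (Lloyd) algorithm, the special case generated by phi(x) = ||x||^2.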