We obtain minimax lower and upper bounds for the expected distortion redundancy of empirically designed vector quantizers. We show that the mean-squared distortion of a vector quantizer designed from n independent and identically distributed (i.i.d.) data points using any design algorithm is at least Ω(n^{-1/2}) away from the optimal distortion for some distribution on a bounded subset of ℝ^d. Together with existing upper bounds, this result shows that the minimax distortion redundancy for empirical quantizer design, as a function of the size of the training data, is asymptotically of the order n^{-1/2}. We also derive a new upper bound for the performance of the empirically optimal quantizer.
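The setting above can be illustrated numerically: design a k-point quantizer from n i.i.d. training samples and measure its mean-squared distortion on fresh data from the same (bounded) source. The sketch below is a minimal, hedged illustration, not the paper's construction: it uses Lloyd's algorithm as the design algorithm, a uniform source on [-1, 1]^2 as the distribution, and a large held-out sample as a proxy for the true distortion; all names and parameter choices are illustrative assumptions.

```python
import numpy as np

def lloyd(data, k, iters=50, seed=0):
    """Design a k-point codebook from training data via Lloyd's algorithm."""
    rng = np.random.default_rng(seed)
    codebook = data[rng.choice(len(data), size=k, replace=False)].copy()
    for _ in range(iters):
        # Nearest-neighbor assignment: squared distances to every code point.
        d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
        labels = d2.argmin(axis=1)
        # Centroid update: move each code point to the mean of its cell.
        for j in range(k):
            if (labels == j).any():
                codebook[j] = data[labels == j].mean(axis=0)
    return codebook

def distortion(data, codebook):
    """Mean-squared distortion of quantizing `data` with `codebook`."""
    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    return d2.min(axis=1).mean()

rng = np.random.default_rng(42)
source = lambda n: rng.uniform(-1.0, 1.0, size=(n, 2))  # bounded subset of R^2

test_set = source(20000)  # large held-out sample, proxy for the true distortion
for n in (100, 1000, 10000):
    cb = lloyd(source(n), k=8)
    print(n, round(distortion(test_set, cb), 4))
```

As n grows, the test-set distortion of the empirically designed quantizer typically approaches the optimal distortion for this source; the theorem above says that, in the worst case over distributions, the gap cannot shrink faster than order n^{-1/2}.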