Motivated by the problem of effectively executing clustering algorithms on very large data sets, we address a model for large scale distributed clustering methods. To this end, we briefly recall some standard results on the quantization problem and on the almost sure convergence of the competitive learning vector quantization (CLVQ) procedure. A general model for linear distributed asynchronous algorithms, well adapted to several parallel computing architectures, is also discussed. Our approach brings together this scalable model and the CLVQ algorithm, and we call the resulting technique the distributed asynchronous learning vector quantization algorithm (DALVQ). An in-depth analysis of the almost sure convergence of the DALVQ algorithm is performed. A striking result is that the multiple versions of the quantizers, distributed among the processors of the parallel architecture, asymptotically reach a consensus almost surely. Furthermore, we show that these versions converge almost surely towards the same nearly optimal value of the quantization criterion.
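To make the underlying iteration concrete, here is a minimal sketch of the CLVQ procedure on a single processor: an online stochastic-gradient descent on the quantization distortion, where each incoming sample moves its nearest quantizer with a decreasing step size. The function name `clvq`, the step schedule `1/t`, and the initialization at random samples are illustrative choices, not the paper's exact implementation; DALVQ runs such iterations asynchronously on several processors and merges their local versions, which is what yields the consensus result stated above.

```python
import numpy as np

def clvq(samples, n_centroids, seed=0):
    """Sketch of competitive learning vector quantization (CLVQ):
    an online stochastic-gradient descent on the distortion criterion.
    (Illustrative single-processor version; DALVQ distributes these
    iterations across processors and averages the local quantizers.)"""
    rng = np.random.default_rng(seed)
    # Initialize the quantizers at distinct random samples.
    w = samples[rng.choice(len(samples), n_centroids, replace=False)].copy()
    for t, x in enumerate(samples, start=1):
        eps = 1.0 / t                                 # decreasing step size
        i = np.argmin(np.linalg.norm(w - x, axis=1))  # nearest ("winning") quantizer
        w[i] += eps * (x - w[i])                      # move the winner toward x
    return w
```

As is standard in stochastic approximation, the step sizes are chosen so that their sum diverges while the sum of their squares converges (here `eps = 1/t`), which is the usual condition underlying the almost sure convergence results recalled in the paper.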