We propose an algorithm for learning from distributed data on a network of arbitrarily connected machines without exchanging data points. Each machine processes its local part of the dataset, and a consensus communication algorithm then consolidates the results across the network. This iterative two-stage process converges to the same result as if the entire dataset had resided on a single machine. The principal contribution of this paper is a proof of convergence of the distributed learning process in the general case where the learning algorithm is a contraction. In addition, we derive the distributed update equation of a feed-forward neural network trained with back-propagation in order to verify the theoretical results. We validate the approach on a toy classification example and a real-world binary classification dataset.
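The two-stage loop described above can be illustrated with a minimal sketch: each node takes a local learning step on its own data shard, then the network runs average consensus on the parameters. All names here (the consensus matrix `P`, the learning rate, the toy regression task) are illustrative assumptions standing in for the paper's setup, not its actual notation or experiments.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: 1-D linear regression y = 2*x, with the data
# sharded across 4 machines (no data points are ever exchanged).
n_nodes = 4
X = [rng.normal(size=20) for _ in range(n_nodes)]
Y = [2.0 * x for x in X]

# Doubly-stochastic consensus matrix for a ring of 4 machines
# (each node averages with its two neighbors).
P = np.array([[0.50, 0.25, 0.00, 0.25],
              [0.25, 0.50, 0.25, 0.00],
              [0.00, 0.25, 0.50, 0.25],
              [0.25, 0.00, 0.25, 0.50]])

w = np.zeros(n_nodes)  # one scalar weight per node
lr = 0.05
for _ in range(500):
    # Stage 1: local gradient step on each node's own shard.
    grads = np.array([np.mean((w[i] * X[i] - Y[i]) * X[i])
                      for i in range(n_nodes)])
    w = w - lr * grads
    # Stage 2: consensus averaging of the parameters (only model
    # parameters cross the network, never the data).
    w = P @ w

print(w)  # all nodes agree on a weight near the global solution 2.0
```

Because the local gradient step is a contraction toward each shard's optimum and `P` only averages, the coupled iteration drives every node to a common weight close to the centralized solution, which mirrors the convergence argument of the paper.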