In this paper, we propose a distributed parallel support vector machine (DPSVM) training mechanism in a configurable network environment for distributed data mining. The basic idea is to exchange support vectors over a strongly connected network (SCN) so that multiple servers can work concurrently on distributed data sets with limited communication cost and fast training speed. The fraction of servers that work in parallel and the communication overhead can be adjusted through the network configuration. The proposed algorithm is further accelerated through online implementation and synchronization. We prove that the globally optimal classifier can be achieved iteratively over an SCN. Experiments on a real-world data set show that the computing time scales well with the size of the training data for most networks. Numerical results show that a randomly generated SCN can outperform the state-of-the-art method, cascade SVM, in terms of total training time.
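
The support-vector exchange at the heart of this scheme can be illustrated with a short sketch. The Python code below is a minimal illustration, assuming scikit-learn is available; the three-node directed ring, the round-robin data split, and the byte-wise stopping test on each node's support-vector set are illustrative assumptions, not the paper's exact protocol or its convergence proof.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

def dpsvm_sketch(parts, neighbors, C=1.0, max_rounds=20):
    """Each node trains a local linear SVM on its own data plus support
    vectors received from its in-neighbors, then forwards its current
    support vectors along the directed edges of a strongly connected
    network. Iteration stops when no node's support-vector set changes
    (or after max_rounds)."""
    n = len(parts)
    d = parts[0][0].shape[1]
    empty = lambda: (np.empty((0, d)), np.empty(0))
    inbox = [empty() for _ in range(n)]   # messages received this round
    prev = [None] * n                     # last round's SV sets, for the stop test
    models = [None] * n
    for _ in range(max_rounds):
        outbox = [None] * n
        changed = False
        for i, (Xi, yi) in enumerate(parts):
            # local data augmented with support vectors from in-neighbors
            X = np.vstack([Xi, inbox[i][0]])
            y = np.concatenate([yi, inbox[i][1]])
            clf = SVC(kernel="linear", C=C).fit(X, y)
            models[i] = clf
            sv_X, sv_y = X[clf.support_], y[clf.support_]
            key = sv_X.tobytes()
            if key != prev[i]:
                changed = True
            prev[i] = key
            outbox[i] = (sv_X, sv_y)
        # deliver each node's support vectors to its out-neighbors
        inbox = [empty() for _ in range(n)]
        for i in range(n):
            for j in neighbors[i]:
                inbox[j] = (np.vstack([inbox[j][0], outbox[i][0]]),
                            np.concatenate([inbox[j][1], outbox[i][1]]))
        if not changed:
            break
    return models

if __name__ == "__main__":
    X, y = make_classification(n_samples=600, n_features=10, random_state=0)
    parts = [(X[i::3], y[i::3]) for i in range(3)]  # split data over 3 nodes
    ring = {0: [1], 1: [2], 2: [0]}  # a directed ring is strongly connected
    models = dpsvm_sketch(parts, ring)
    print("node-0 accuracy on the full data: %.3f" % models[0].score(X, y))

Because only support vectors travel between nodes, each message is typically far smaller than the local training set, which is the source of the limited communication cost claimed above; denser SCN topologies trade more communication for more nodes working in parallel.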