This paper develops algorithms to train linear support vector machines (SVMs) when training data are distributed across different nodes and cannot be communicated to a centralized node, due, for example, to communication overhead or privacy concerns. To this end, the centralized linear SVM problem is cast as a set of coupled, decentralized convex optimization subproblems with consensus constraints on the parameters defining the classifier. Using the method of multipliers, distributed training algorithms are derived that exchange no training-set elements among nodes. The communication overhead of the novel approach is fixed and fully determined by the network topology, rather than by the size of the training sets, as is the case for existing incremental approaches. An online algorithm in which data arrive sequentially at the nodes is also developed. Simulated tests illustrate the performance of the algorithms.
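To make the mechanism concrete, below is a minimal sketch of the consensus idea in Python. It uses a simplified global-consensus formulation of the alternating direction method of multipliers, with the per-node subproblem approximated by a few subgradient steps, whereas the paper derives fully decentralized updates over an arbitrary network topology. The synthetic data, step sizes, and helper names (e.g., local_step) are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-class data split across J nodes; each node keeps its
# share private, and only classifier parameters are ever exchanged.
J, n_per, d = 4, 50, 2
X = rng.normal(size=(J * n_per, d))
y = np.sign(X @ np.array([1.5, -2.0]) + 0.3 + 0.1 * rng.normal(size=J * n_per))
Xs, ys = np.split(X, J), np.split(y, J)

rho, C, T = 1.0, 1.0, 100      # ADMM penalty, SVM cost, outer iterations
v = np.zeros((J, d + 1))       # local [w; b] estimate at each node
z = np.zeros(d + 1)            # consensus variable
u = np.zeros((J, d + 1))       # scaled dual variables

def local_step(Xj, yj, vj, z, uj, inner=20, lr=0.05):
    """Approximate node j's subproblem
        min_v (1/2J)||w||^2 + C * mean(hinge) + (rho/2)||v - z + u||^2
    with a few subgradient steps (a cheap stand-in for an exact QP solve)."""
    A = np.hstack([Xj, np.ones((len(yj), 1))])     # append a bias column
    for _ in range(inner):
        margins = yj * (A @ vj)
        act = margins < 1                          # points violating the margin
        g = -C * (yj[act, None] * A[act]).sum(axis=0) / len(yj)
        g += np.r_[vj[:-1], 0.0] / J               # node's share of ||w||^2 / 2
        g += rho * (vj - z + uj)                   # pull toward the consensus
        vj = vj - lr * g
    return vj

for _ in range(T):
    for j in range(J):                 # nodes update in parallel; data stay put
        v[j] = local_step(Xs[j], ys[j], v[j], z, u[j])
    z = (v + u).mean(axis=0)           # consensus: average parameters, not data
    u += v - z                         # scaled dual-variable update

w, b = z[:-1], z[-1]
print(f"consensus classifier accuracy: {np.mean(np.sign(X @ w + b) == y):.3f}")
```

Note how this reflects the property the abstract highlights: per iteration each node transmits only its (d + 1)-dimensional parameter vector, so the communication cost is set by the network rather than by the training-set size; the paper's algorithms additionally replace the global average with neighbor-only exchanges.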