Parallel and Distributed Computation: Numerical Methods
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
Detecting Concept Drift with Support Vector Machines
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
Pattern Classification (2nd Edition)
Distributed localization in wireless sensor networks: a quantitative comparison
Computer Networks: The International Journal of Computer and Telecommunications Networking - Special issue: Wireless sensor networks
Support Vector Machines in Handwritten Digits Classification
ISDA '05 Proceedings of the 5th International Conference on Intelligent Systems Design and Applications
Fast Kernel Classifiers with Online and Active Learning
The Journal of Machine Learning Research
Distributed in-network channel decoding
IEEE Transactions on Signal Processing
A collaborative training algorithm for distributed learning
IEEE Transactions on Information Theory
Calibrating noise to sensitivity in private data analysis
TCC'06 Proceedings of the Third conference on Theory of Cryptography
Applications of support vector machines to speech recognition
IEEE Transactions on Signal Processing
Real-Time Detection of Driver Cognitive Distraction Using Support Vector Machines
IEEE Transactions on Intelligent Transportation Systems
Distributed support vector machines
IEEE Transactions on Neural Networks
Distributed Parallel Support Vector Machines in Strongly Connected Networks
IEEE Transactions on Neural Networks
Distributed sparse linear regression
IEEE Transactions on Signal Processing
Foundations and Trends® in Machine Learning
Distributed static linear Gaussian models using consensus
Neural Networks
Distributed customer behavior prediction using multiplex data: A collaborative MK-SVM approach
Knowledge-Based Systems
Computational Optimization and Applications
This paper develops algorithms to train support vector machines when training data are distributed across different nodes and communicating them to a centralized processing unit is prohibited, due to, for example, communication cost, scalability, or privacy constraints. To this end, the centralized linear SVM problem is cast as a set of decentralized convex optimization sub-problems (one per node) with consensus constraints on the desired classifier parameters. Using the alternating direction method of multipliers, fully distributed training algorithms are obtained that require no exchange of training data among nodes. Unlike existing incremental approaches, the overhead associated with inter-node communication is fixed and depends solely on the network topology, not on the size of the training set available at each node. Important generalizations to train nonlinear SVMs in a distributed fashion are also developed, along with sequential variants capable of online processing. Simulated tests illustrate the performance of the novel algorithms.
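The consensus-ADMM idea in the abstract can be sketched in a few lines: each node keeps a local copy of the classifier, alternately solves a small regularized SVM sub-problem on its own data, and then agrees with the others through a consensus step, never sharing raw samples. The toy sketch below is not the paper's algorithm: it uses a squared-hinge loss with inexact (gradient-step) local updates, and a global average in place of neighborhood-only consensus over a network topology; all data, parameters (`J`, `rho`, `C`, step sizes), and the train/test setup are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data, split evenly across J nodes (illustrative setup).
J, n_per, d = 4, 50, 2
X = rng.normal(size=(J * n_per, d))
true_w = np.array([1.5, -2.0])                      # assumed "ground truth" separator
y = np.sign(X @ true_w + 0.1 * rng.normal(size=J * n_per))
parts = [(X[j*n_per:(j+1)*n_per], y[j*n_per:(j+1)*n_per]) for j in range(J)]

def local_grad(w, Xj, yj, C):
    """Gradient of the node-local objective:
    C * sum_i max(0, 1 - y_i x_i.w)^2  +  ||w||^2 / (2J)  (its share of the ridge term)."""
    m = 1.0 - yj * (Xj @ w)
    a = m > 0                                        # only violated margins contribute
    return w / J - 2.0 * C * Xj[a].T @ (m[a] * yj[a])

# Consensus ADMM (scaled form): per-node primal W[j], consensus variable z, duals U[j].
rho, C, outer_iters = 1.0, 1.0, 200
W = np.zeros((J, d)); U = np.zeros((J, d)); z = np.zeros(d)
for _ in range(outer_iters):
    for j, (Xj, yj) in enumerate(parts):
        # Inexact local w-update: a few gradient steps on
        # f_j(w) + (rho/2) ||w - z + u_j||^2, using only node j's data.
        w = W[j].copy()
        for _ in range(10):
            g = local_grad(w, Xj, yj, C) + rho * (w - z + U[j])
            w -= 0.005 * g
        W[j] = w
    z = (W + U).mean(axis=0)                         # consensus (z-)update
    U += W - z                                       # scaled dual update

acc = np.mean(np.sign(X @ z) == y)                   # accuracy of the consensus classifier
spread = np.max(np.abs(W - z))                       # how far nodes are from agreement
```

Replacing the global mean in the z-update with averaging over each node's neighbors is what turns this centralized-coordinator sketch into the fully decentralized scheme the abstract describes, at the cost of extra consensus iterations that depend on the network topology.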