Support vector machines (SVMs) have become a popular method in the machine learning community, but they do not scale easily to large problems because their time and space complexities are roughly quadratic in the number of training samples. To overcome this drawback of conventional SVMs, we propose a new confident majority voting (CMV) strategy for SVMs; we call SVMs using this strategy CMV-SVMs. In CMV-SVMs, a large-scale problem is divided into many smaller and simpler sub-problems in the training phase, and only the confident component classifiers are chosen to vote on the final outcome in the test phase. We compare CMV-SVMs with standard SVMs and with parallel SVMs using plain majority voting (MV-SVMs) on several benchmark problems. The experiments show that the proposed method significantly reduces the overall time consumed in both training and testing. More importantly, it yields classification accuracy that is almost the same as that of standard SVMs and better than that of MV-SVMs.
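The abstract leaves the details of the partitioning and the confidence measure unspecified, so the following is only a minimal sketch of the CMV idea, assuming scikit-learn SVCs as component classifiers, a random equal-size partition of the training set, the signed margin |decision_function| as the confidence score, an illustrative threshold tau, and binary labels in {-1, +1}. None of these choices should be read as the paper's exact method.

```python
# Sketch of confident majority voting (CMV) over component SVMs.
# Assumptions (not from the paper): random partition, RBF-kernel SVCs,
# confidence = |signed margin|, threshold tau, labels in {-1, +1}.
import numpy as np
from sklearn.svm import SVC

def train_cmv(X, y, n_parts=10, seed=0):
    """Split the training set into n_parts subsets and train one SVC each."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    models = []
    for part in np.array_split(idx, n_parts):
        clf = SVC(kernel="rbf", gamma="scale")
        clf.fit(X[part], y[part])
        models.append(clf)
    return models

def predict_cmv(models, X, tau=1.0):
    """Let only 'confident' component classifiers vote on each test point.

    A component is confident on a point when its distance to its own
    separating hyperplane exceeds tau; if no component is confident,
    all components vote (plain majority voting as a fallback).
    """
    # scores: (n_models, n_samples) signed margins of each component
    scores = np.stack([m.decision_function(X) for m in models])
    confident = np.abs(scores) > tau
    votes = np.sign(scores)                  # each component's -1/+1 vote
    votes = np.where(confident, votes, 0.0)  # silence unconfident components
    tally = votes.sum(axis=0)
    fallback = np.sign(scores).sum(axis=0)   # all-component majority vote
    tally = np.where(tally == 0, fallback, tally)
    return np.where(tally >= 0, 1, -1)
```

Training K components on roughly n/K samples each cuts the approximately quadratic training cost by about a factor of K; the confidence filter at test time is what distinguishes CMV from the plain majority voting of MV-SVMs.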