A Confident Majority Voting Strategy for Parallel and Modular Support Vector Machines

  • Authors:
  • Yi-Min Wen; Bao-Liang Lu

  • Affiliations:
  • Department of Computer Science and Engineering, Shanghai Jiao Tong University, 800 Dong Chuan Road, Shanghai 200240, China, and Hunan Industry Polytechnic, Changsha 410208, China
  • Department of Computer Science and Engineering, Shanghai Jiao Tong University, 800 Dong Chuan Road, Shanghai 200240, China

  • Venue:
  • ISNN '07 Proceedings of the 4th international symposium on Neural Networks: Advances in Neural Networks, Part III
  • Year:
  • 2007


Abstract

Support vector machines (SVMs) have become a popular method in the machine learning community, but they cannot easily be scaled to large problems because their time and space complexities are roughly quadratic in the number of training samples. To overcome this drawback of conventional SVMs, we propose a new confident majority voting (CMV) strategy for SVMs; we call SVMs that use this strategy CMV-SVMs. In CMV-SVMs, a large-scale problem is divided into many smaller and simpler sub-problems in the training phase, and a few confident component classifiers are chosen to vote for the final outcome in the test phase. We compare CMV-SVMs with standard SVMs and with parallel SVMs using majority voting (MV-SVMs) on several benchmark problems. The experiments show that the proposed method significantly reduces the overall time consumed in both training and testing. More importantly, it achieves classification accuracy almost the same as that of standard SVMs and better than that of MV-SVMs.
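The combination step in the test phase can be sketched as follows. This is a minimal illustration, assuming the absolute value of each component SVM's decision function |f_i(x)| serves as its confidence measure and that the k most confident classifiers vote; the paper's exact confidence measure and selection rule may differ.

```python
def confident_majority_vote(decision_values, k):
    """Combine component classifier outputs by confident majority voting.

    decision_values: signed decision values f_i(x), one per component SVM
        for a single test sample; |f_i(x)| is taken as the confidence
        (an assumption for illustration).
    k: number of most-confident classifiers allowed to vote.
    Returns the predicted class label, +1 or -1.
    """
    # Keep only the k component classifiers with the largest |f_i(x)|.
    confident = sorted(decision_values, key=abs, reverse=True)[:k]
    # Majority vote among the selected confident predictions.
    votes = sum(1 if f > 0 else -1 for f in confident)
    return 1 if votes > 0 else -1


# Example: one very confident positive classifier vs. three weak negatives.
values = [-0.1, -0.2, -0.3, 5.0]
print(confident_majority_vote(values, k=1))  # confident vote only -> 1
print(confident_majority_vote(values, k=4))  # plain majority vote -> -1
```

The second call reduces to plain majority voting (as in MV-SVMs), showing how restricting the vote to confident components can change the outcome.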