Although SVMs have shown promising performance in classification, their training speed becomes a bottleneck when the training set is large. In this paper, we propose a fast SVM classification algorithm based on the Karush-Kuhn-Tucker (KKT) conditions. The algorithm first removes training points that cannot become support vectors, and then decomposes the remaining points into blocks to accelerate the subsequent training. Theoretical analysis shows that the algorithm markedly reduces computational complexity and speeds up SVM training, and experiments on both artificial and real datasets demonstrate its efficiency.
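The abstract gives no pseudocode, so the following is only a minimal sketch of the two-stage idea it describes: prune points that already satisfy the KKT conditions strictly (and therefore have zero Lagrange multipliers, i.e. cannot be support vectors), then retrain on the survivors block by block. A simple hinge-loss SGD trainer for a linear SVM stands in for the paper's unspecified inner solver; all function names, sizes, and thresholds here are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.001, lr=0.1, steps=5000, w=None, b=0.0, seed=0):
    """Hinge-loss SGD for a linear SVM (a stand-in for the paper's
    unspecified inner solver). Labels y must be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d) if w is None else w.copy()
    for _ in range(steps):
        i = rng.integers(n)
        if y[i] * (X[i] @ w + b) < 1:    # margin-violating point
            w += lr * (y[i] * X[i] - lam * w)
            b += lr * y[i]
        else:                            # only apply regularization
            w -= lr * lam * w
    return w, b

def kkt_keep_mask(X, y, w, b, eps=0.1):
    # At the optimum, a point with y * f(x) > 1 has multiplier alpha = 0,
    # so it cannot be a support vector and may be pruned before retraining.
    return y * (X @ w + b) <= 1.0 + eps

# --- demo on synthetic 2-D data (sizes and thresholds are illustrative) ---
rng = np.random.default_rng(1)
n = 2000
X = rng.normal(size=(n, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)

# Stage 0: rough model from a small random subsample.
idx = rng.choice(n, size=200, replace=False)
w0, b0 = train_linear_svm(X[idx], y[idx])

# Stage 1: discard points that already satisfy the KKT conditions strictly.
keep = kkt_keep_mask(X, y, w0, b0)
Xk, yk = X[keep], y[keep]

# Stage 2: retrain on the survivors block by block, warm-starting each block.
w, b = w0, b0
for block in np.array_split(np.arange(len(yk)), 4):
    w, b = train_linear_svm(Xk[block], yk[block], w=w, b=b)

acc = float(np.mean(np.sign(X @ w + b) == y))
print(f"kept {keep.sum()} of {n} points, accuracy {acc:.3f}")
```

The pruning step is where the claimed speedup comes from: only points near the margin survive, so each subsequent block is far smaller than the original training set.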