Machine Learning
Making large-scale support vector machine learning practical
Advances in kernel methods
Least Squares Support Vector Machine Classifiers
Neural Processing Letters
Sparse Greedy Matrix Approximation for Machine Learning
ICML '00 Proceedings of the Seventeenth International Conference on Machine Learning
A Generalized Representer Theorem
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
Exact simplification of support vector solutions
The Journal of Machine Learning Research
Predictive low-rank decomposition for kernel methods
ICML '05 Proceedings of the 22nd International Conference on Machine Learning
Building Sparse Large Margin Classifiers
ICML '05 Proceedings of the 22nd International Conference on Machine Learning
Working Set Selection Using Second Order Information for Training Support Vector Machines
The Journal of Machine Learning Research
Building Support Vector Machines with Reduced Classifier Complexity
The Journal of Machine Learning Research
Twin Support Vector Machines for Pattern Classification
IEEE Transactions on Pattern Analysis and Machine Intelligence
Simpler core vector machines with enclosing balls
Proceedings of the 24th International Conference on Machine Learning
Computational Geometry: Theory and Applications
Sparse kernel SVMs via cutting-plane training
Machine Learning
Sparse approximation through boosting for learning large scale kernel machines
IEEE Transactions on Neural Networks
A study on reduced support vector machines
IEEE Transactions on Neural Networks
Generalized Core Vector Machines
IEEE Transactions on Neural Networks
Reduced Support Vector Machines: A Statistical Theory
IEEE Transactions on Neural Networks
Fast Sparse Approximation for Least Squares Support Vector Machine
IEEE Transactions on Neural Networks
Pruning Support Vector Machines Without Altering Performances
IEEE Transactions on Neural Networks
Controlling the sparsity of a classifier is key to training SVMs efficiently on very large-scale problems. This paper explores building an SVM classifier on the fitting plane of each class of data, which captures the distribution trend of the corresponding class. The newly developed plane-fitting model can be solved by core-set methods, so the SVM is trained only on the core sets, which are small subsets of the original data. The computational complexity of the proposed algorithm is upper bounded by O(1/ε). Experimental results show that the new algorithm scales better than SVMperf and CVM/BVM, while their prediction accuracies are comparable.
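To make the plane-fitting idea concrete, here is a minimal sketch (not the paper's actual core-set algorithm) of classifying by per-class fitting planes: for each class, fit the plane that lies closest to that class's points in least squares, then assign a new point to the class whose plane is nearest. The function names and the toy data are illustrative assumptions, not from the paper.

```python
import numpy as np

def fit_class_plane(X):
    """Fit a plane w.x + b = 0 closest (least squares) to one class's points:
    w is the direction of least variance (smallest eigenvector of the
    covariance) and b places the plane through the class mean."""
    mu = X.mean(axis=0)
    cov = np.cov(X - mu, rowvar=False)
    _, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    w = eigvecs[:, 0]                  # smallest-variance direction
    b = -w @ mu
    return w, b

def predict(planes, x):
    """Assign x to the class whose fitting plane is nearest."""
    dists = [abs(w @ x + b) for w, b in planes]
    return int(np.argmin(dists))

# toy data: each class is stretched along a different direction,
# so each class's fitting plane captures its distribution trend
rng = np.random.default_rng(0)
X0 = rng.normal([0.0, 0.0], [2.0, 0.1], size=(50, 2))  # flat along x-axis
X1 = rng.normal([0.0, 3.0], [0.1, 2.0], size=(50, 2))  # flat along y-axis
planes = [fit_class_plane(X0), fit_class_plane(X1)]
print(predict(planes, np.array([1.5, 0.0])))  # lies near class 0's plane
```

The paper's contribution is solving this kind of plane-fitting model on small core sets rather than the full data, which is what yields the O(1/ε) bound; the sketch above fits on all points and omits that machinery.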