This paper presents active set support vector regression (ASVR), a new active set strategy for solving a straightforward reformulation of the standard support vector regression problem. The new algorithm is based on the successful ASVM (active support vector machine) algorithm for classification problems, and consists of solving a finite number of linear systems whose dimensionality is typically large, equal to the number of points to be approximated. However, by making use of the Sherman-Morrison-Woodbury formula, a much smaller matrix, of the order of the original input space, is inverted at each step. The algorithm requires no specialized quadratic or linear programming code, merely a publicly available linear equation solver. ASVR is extremely fast, produces generalization error comparable to other popular algorithms, and is available on the web for download.
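As a minimal sketch of the Sherman-Morrison-Woodbury device the abstract describes: instead of inverting the large m×m matrix I/ν + AAᵀ (m = number of training points), one can solve against a small n×n matrix (n = input-space dimension). The function name `smw_solve` and the regularization symbol ν below are illustrative choices, not the authors' actual code.

```python
import numpy as np

def smw_solve(A, nu, b):
    """Solve (I/nu + A A^T) x = b without forming the m x m matrix.

    Uses the SMW identity:
        (I/nu + A A^T)^{-1} = nu * (I - A (I/nu + A^T A)^{-1} A^T),
    so only an n x n system is solved, where A is m x n with m >> n.
    """
    m, n = A.shape
    small = np.eye(n) / nu + A.T @ A          # n x n matrix, cheap to factor
    return nu * (b - A @ np.linalg.solve(small, A.T @ b))

# Illustrative check against the direct (large) solve:
rng = np.random.default_rng(0)
A = rng.standard_normal((200, 5))             # many points, few features
b = rng.standard_normal(200)
x = smw_solve(A, 10.0, b)
x_direct = np.linalg.solve(np.eye(200) / 10.0 + A @ A.T, b)
print(np.allclose(x, x_direct))               # the two solutions agree
```

The cost per iteration thus scales with n³ rather than m³, which is what makes the active set strategy practical when the number of approximated points is large.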