Model selection is critical to the least squares support vector machine (LSSVM). A major problem of existing model selection approaches is that a standard LSSVM must be solved with O(n^3) complexity at each iteration, where n is the number of training examples. In this paper, we propose an approximate approach to model selection of LSSVM. We use the Nyström method to approximate a given kernel matrix by a low-rank representation. With this approximation, we first design an efficient LSSVM algorithm and theoretically analyze the effect of the kernel matrix approximation on the decision function of LSSVM. Based on the matrix approximation error bound of the Nyström method, we derive a model approximation error bound, which provides a theoretical guarantee for approximate model selection. We finally present an approximate model selection scheme whose complexity is lower than that of previous approaches. Experimental results on benchmark datasets demonstrate the effectiveness of approximate model selection.
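To make the complexity argument concrete, the following Python sketch illustrates the general idea (it is not the authors' implementation): the kernel matrix is approximated from m sampled landmark columns via the Nyström method, and the resulting low-rank LSSVM system is solved through the Woodbury identity in O(nm^2 + m^3) instead of O(n^3). The RBF kernel, uniform landmark sampling, and all names (lssvm_nystrom_fit, reg, gamma_k) are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, Z, gamma_k):
    """Gaussian RBF kernel matrix between the rows of X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma_k * d2)

def lssvm_nystrom_fit(X, y, m, gamma_k=1.0, reg=1.0, seed=0):
    """Fit an LSSVM classifier with a rank-m Nystrom approximation.

    The exact LSSVM solve inverts the n x n matrix K + I/reg, which
    costs O(n^3). Approximating K ~= C W^+ C^T from m landmark
    columns and applying the Woodbury identity reduces each solve
    to O(n m^2 + m^3).
    """
    n = X.shape[0]
    idx = np.random.default_rng(seed).choice(n, size=m, replace=False)
    landmarks = X[idx]
    C = rbf_kernel(X, landmarks, gamma_k)          # n x m sampled columns
    W = rbf_kernel(landmarks, landmarks, gamma_k)  # m x m intersection block

    # Factor the approximation as K ~= U U^T with U = C W^{-1/2}.
    s, V = np.linalg.eigh(W)
    keep = s > 1e-10 * s.max()                     # drop near-null directions
    U = C @ (V[:, keep] / np.sqrt(s[keep]))

    lam = 1.0 / reg
    # Woodbury identity:
    # (U U^T + lam I)^{-1} r = (r - U (lam I + U^T U)^{-1} U^T r) / lam
    M = lam * np.eye(U.shape[1]) + U.T @ U

    def solve(r):
        return (r - U @ np.linalg.solve(M, U.T @ r)) / lam

    # Block elimination of the LSSVM system [[0, 1^T], [1, K + I/reg]].
    ones = np.ones(n)
    a, c = solve(y.astype(float)), solve(ones)
    b = (ones @ a) / (ones @ c)                    # bias from 1^T alpha = 0
    alpha = a - b * c
    return alpha, b

def lssvm_predict(X_new, X_train, alpha, b, gamma_k=1.0):
    """Decision values f(x) = sum_i alpha_i k(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, gamma_k) @ alpha + b
```

In a model selection loop, one would sweep candidate hyper-parameters (here gamma_k and reg) and re-fit for each candidate; with the approximation, each fit costs O(nm^2) rather than O(n^3), which is what makes repeated evaluation during model selection affordable.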