Learning SVM with weighted maximum margin criterion for classification of imbalanced data
Mathematical and Computer Modelling: An International Journal
We propose a partially linear version of the SVM, the PL-SVM, whose kernel combines a linear kernel acting on one subset of the variables with a nonlinear kernel acting on the remaining variables. This makes it possible to model a linear component in a subset of the variables. The resulting models are true SVMs, so existing learning algorithms can be used, and the same construction carries over to other kernel methods such as kernel discriminant analysis and kernel principal component analysis. We applied an autoregressive PL-SVM to predict the monthly movements of a mine slope that affect the safety of the mining operation. On this problem, the PL-SVM improves on the results of other autoregressive approaches, including the classical nonparametric partially linear models.
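The composite kernel described above can be sketched in a few lines: a linear kernel on a chosen subset of variables plus a nonlinear (here, RBF) kernel on the rest, plugged into a standard SVM solver as a custom kernel. The feature split `D_LIN`, the RBF choice, and the synthetic data are illustrative assumptions, not details from the paper.

```python
import numpy as np
from sklearn.svm import SVR

# Hypothetical split: the first D_LIN variables enter linearly,
# the remaining variables enter through the nonlinear kernel.
D_LIN = 2

def pl_kernel(X, Z, gamma=1.0):
    """Partially linear kernel sketch: linear part on the first D_LIN
    columns plus an RBF part on the remaining columns."""
    Xl, Xn = X[:, :D_LIN], X[:, D_LIN:]
    Zl, Zn = Z[:, :D_LIN], Z[:, D_LIN:]
    linear = Xl @ Zl.T
    # Squared Euclidean distances between the nonlinear parts.
    sq = (np.sum(Xn**2, axis=1)[:, None]
          + np.sum(Zn**2, axis=1)[None, :]
          - 2.0 * Xn @ Zn.T)
    return linear + np.exp(-gamma * sq)

# Toy regression data: linear in the first two variables,
# nonlinear in the third (an assumed example, not the mine-slope data).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
y = 2 * X[:, 0] - X[:, 1] + np.sin(X[:, 2]) + 0.1 * rng.normal(size=100)

# scikit-learn accepts a callable kernel returning the Gram matrix,
# so the PL-SVM trains with the standard SVM machinery.
model = SVR(kernel=pl_kernel).fit(X, y)
pred = model.predict(X)
```

Because the composite kernel is itself a valid (positive semidefinite) kernel, the resulting model is a true SVM, which is what lets existing solvers be reused unchanged.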