The Bayesian evidence framework has been successfully applied to the design of multilayer perceptrons (MLPs) in the work of MacKay. Nevertheless, the training of MLPs suffers from drawbacks such as the nonconvex optimization problem and the choice of the number of hidden units. In support vector machines (SVMs) for classification, as introduced by Vapnik, a nonlinear decision boundary is obtained by first mapping the input vector nonlinearly into a high-dimensional kernel-induced feature space, in which a linear large-margin classifier is constructed. Practical expressions are formulated in the dual space in terms of the related kernel function, and the solution follows from a convex quadratic programming (QP) problem. In least-squares SVMs (LS-SVMs), the SVM formulation is modified by introducing a least-squares cost function and equality instead of inequality constraints, so that the solution follows from a linear system in the dual space. Implicitly, the least-squares formulation corresponds to a regression formulation and is also related to kernel Fisher discriminant analysis. The least-squares regression formulation has advantages for deriving analytic expressions in a Bayesian evidence framework, in contrast to the classification formulations used, for example, in Gaussian processes (GPs). The LS-SVM formulation has clear primal-dual interpretations, and without the bias term, it explicitly constructs a model that yields the same expressions as have been obtained with GPs for regression. In this article, the Bayesian evidence framework is combined with the LS-SVM classifier formulation. Starting from the feature-space formulation, analytic expressions are obtained in the dual space at the different levels of Bayesian inference, while posterior class probabilities are obtained by marginalizing over the model parameters.
Empirical results obtained on 10 public-domain data sets show that the LS-SVM classifier designed within the Bayesian evidence framework consistently yields good generalization performance.
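The key computational point in the abstract, that LS-SVM training reduces to solving a linear system in the dual space rather than a QP, can be illustrated with a minimal sketch. This is an assumed regression-style formulation with targets y ∈ {-1, +1} and an RBF kernel; the function names, kernel choice, and hyperparameter values are illustrative, not taken from the paper. Training solves the dual system [[0, 1ᵀ], [1, K + I/γ]] [b; α] = [0; y].

```python
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    # Gaussian RBF kernel matrix: K[i, j] = exp(-||x_i - x_j||^2 / (2 sigma^2))
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual linear system (regression-style formulation):
        [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
    The regularization constant gamma plays the role of a ridge parameter."""
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0                      # first row enforces 1^T alpha = 0
    A[1:, 0] = 1.0                      # bias term column
    A[1:, 1:] = K + np.eye(n) / gamma   # ridge-regularized kernel matrix
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)       # one linear solve instead of a QP
    return sol[0], sol[1:]              # bias b, dual variables alpha

def lssvm_predict(X_train, b, alpha, X_new, sigma=1.0):
    # Latent output f(x) = sum_i alpha_i K(x, x_i) + b; predicted class = sign(f)
    K = rbf_kernel(X_new, X_train, sigma)
    return np.sign(K @ alpha + b)

# Toy usage: two well-separated clusters with labels -1 and +1
X = np.array([[0.0, 0.0], [0.1, 0.0], [1.0, 1.0], [1.1, 1.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
b, alpha = lssvm_train(X, y, gamma=10.0, sigma=1.0)
pred = lssvm_predict(X, b, alpha, X, sigma=1.0)
```

Because the dual problem is a single symmetric linear system, the same solve can be reused inside the evidence framework's hyperparameter updates, which is one reason the least-squares formulation lends itself to analytic Bayesian expressions.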