I describe a framework for interpreting Support Vector Machines (SVMs) as maximum a posteriori (MAP) solutions to inference problems with Gaussian Process priors. This probabilistic interpretation provides intuitive guidelines for choosing a 'good' SVM kernel. Beyond this, it allows Bayesian methods to be applied to two of the outstanding challenges in SVM classification: how to tune hyperparameters—the misclassification penalty C, and any parameters specifying the kernel—and how to obtain predictive class probabilities rather than the conventional deterministic class label predictions. Hyperparameters can be set by maximizing the evidence; I explain how the latter can be defined and properly normalized. Both analytical approximations and numerical methods (Monte Carlo chaining) for estimating the evidence are discussed. I also compare different methods of estimating class probabilities, ranging from simple evaluation at the MAP or at the posterior average to full averaging over the posterior. A simple toy application illustrates the various concepts and techniques.
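The contrast between plug-in (MAP) class probabilities and full posterior averaging can be sketched numerically. The snippet below is a minimal illustration under assumed values, not the paper's method: it approximates the posterior over the latent function value at a single test point by a Gaussian (with hypothetical MAP value `f_map` and variance `s2`), then compares the sigmoid evaluated at the MAP against the Monte Carlo average of the sigmoid over posterior samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior over the latent output f(x) at one test point,
# approximated as Gaussian: MAP value f_map, posterior variance s2.
f_map, s2 = 0.8, 2.0


def sigmoid(f):
    """Map a latent value to a class-1 probability."""
    return 1.0 / (1.0 + np.exp(-f))


# (a) Plug-in estimate: evaluate the probability at the MAP solution only.
p_map = sigmoid(f_map)

# (b) Full posterior average: Monte Carlo over samples of the latent value.
samples = rng.normal(f_map, np.sqrt(s2), size=100_000)
p_avg = sigmoid(samples).mean()

print(f"plug-in at MAP: {p_map:.3f}, posterior average: {p_avg:.3f}")
```

Because the sigmoid is nonlinear, the posterior average is pulled toward 0.5 relative to the plug-in estimate: uncertainty in the latent function translates into less confident class probabilities, which is precisely what the deterministic SVM output discards.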