The nature of statistical learning theory
Bayesian Classification With Gaussian Processes
IEEE Transactions on Pattern Analysis and Machine Intelligence
A Bayesian Approach to Joint Feature Selection and Classifier Design
IEEE Transactions on Pattern Analysis and Machine Intelligence
Gaussian Processes for Ordinal Regression
The Journal of Machine Learning Research
Assessing Approximate Inference for Binary Gaussian Process Classification
The Journal of Machine Learning Research
Expectation propagation for approximate inference in dynamic Bayesian networks
UAI'02 Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence
Expectation-propagation for the generative aspect model
UAI'02 Proceedings of the Eighteenth Conference on Uncertainty in Artificial Intelligence
The evidence framework applied to support vector machines
IEEE Transactions on Neural Networks
Improving multiclass pattern recognition with a co-evolutionary RBFNN
Pattern Recognition Letters
Bayes Machines for binary classification
Pattern Recognition Letters
Sparse Bayes Machines for Binary Classification
ICANN '08 Proceedings of the 18th international conference on Artificial Neural Networks, Part I
Expert Systems with Applications: An International Journal
Outlier Robust Gaussian Process Classification
SSPR & SPR '08 Proceedings of the 2008 Joint IAPR International Workshop on Structural, Syntactic, and Statistical Pattern Recognition
Validation-based sparse Gaussian process classifier design
Neural Computation
Learning Preferences with Hidden Common Cause Relations
ECML PKDD '09 Proceedings of the European Conference on Machine Learning and Knowledge Discovery in Databases: Part I
Thyroid disease diagnosis using Artificial Immune Recognition System (AIRS)
Proceedings of the 2nd International Conference on Interaction Sciences: Information Technology, Culture and Human
Preference learning with extreme examples
IJCAI'09 Proceedings of the 21st International Joint Conference on Artificial Intelligence
Designing Model Based Classifiers by Emphasizing Soft Targets
Fundamenta Informaticae - Advances in Artificial Intelligence and Applications
Information Sciences: an International Journal
Variational multinomial logit Gaussian process
The Journal of Machine Learning Research
Nested expectation propagation for Gaussian process classification
The Journal of Machine Learning Research
Gaussian process classifiers (GPCs) are Bayesian probabilistic kernel classifiers. In GPCs, the probability of belonging to a certain class at an input location is monotonically related to the value of some latent function at that location. Starting from a Gaussian process prior over this latent function, data are used to infer both the posterior over the latent function and the values of the hyperparameters that determine various aspects of the function. Recently, the expectation propagation (EP) approach has been proposed to infer the posterior over the latent function. Building on this work, we present an approximate EM algorithm, the EM-EP algorithm, to learn both the latent function and the hyperparameters. This algorithm is found to converge in practice and provides an efficient Bayesian framework for learning the hyperparameters of the kernel. A multiclass extension of the EM-EP algorithm for GPCs is also derived. In the experimental results, the EM-EP algorithms perform as well as or better than other methods for GPCs, and than Support Vector Machines (SVMs) tuned by cross-validation.
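As a rough illustration of the GPC workflow the abstract describes, the sketch below uses scikit-learn's `GaussianProcessClassifier`. Note the hedge: scikit-learn approximates the posterior over the latent function with a Laplace approximation rather than EP, and tunes kernel hyperparameters by maximizing the approximate marginal likelihood rather than by the EM-EP algorithm of this paper; the toy data are invented for the example.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

# Toy binary-classification data: two well-separated clusters in the plane.
X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0],
              [3.0, 3.0], [3.0, 4.0], [4.0, 3.0], [4.0, 4.0]])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])

# GP prior over a latent function via an RBF kernel. scikit-learn infers the
# posterior with a Laplace approximation (not EP) and fits the kernel
# length-scale by marginal-likelihood maximization (not EM-EP).
gpc = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0),
                                random_state=0)
gpc.fit(X, y)

# Class-membership probabilities come from squashing the latent function
# through a monotonic link, as described above.
probs = gpc.predict_proba(X)
preds = gpc.predict(X)
```

On data this cleanly separated, the predicted class probabilities are sharp and the training predictions match the labels; the fitted `length_scale` is what a GPC's hyperparameter learning adapts to the data.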