Generalized Hermite interpolation via matrix-valued conditionally positive definite functions
Mathematics of Computation
The nature of statistical learning theory
Machine Learning - Special issue on inductive transfer
Bayesian Classification With Gaussian Processes
IEEE Transactions on Pattern Analysis and Machine Intelligence
Learning with Kernels: Support Vector Machines, Regularization, Optimization, and Beyond
A Generalized Representer Theorem
COLT '01/EuroCOLT '01 Proceedings of the 14th Annual Conference on Computational Learning Theory and 5th European Conference on Computational Learning Theory
Kernel Methods for Pattern Analysis
Learning to learn with the informative vector machine
ICML '04 Proceedings of the twenty-first international conference on Machine learning
Some Properties of Regularized Kernel Methods
The Journal of Machine Learning Research
Learning Multiple Tasks with Kernel Methods
The Journal of Machine Learning Research
Advances in Neural Information Processing Systems 17: Proceedings of the 2004 Conference (Bradford Books)
Learning Gaussian processes from multiple tasks
ICML '05 Proceedings of the 22nd international conference on Machine learning
On Learning Vector-Valued Functions
Neural Computation
Gaussian Processes for Machine Learning (Adaptive Computation and Machine Learning)
Pattern Recognition and Machine Learning (Information Science and Statistics)
A Unifying View of Sparse Approximate Gaussian Process Regression
The Journal of Machine Learning Research
Multi-Task Learning for Classification with Dirichlet Process Priors
The Journal of Machine Learning Research
Characterizing the Function Space for Bayesian Kernel Models
The Journal of Machine Learning Research
Gaussian Process Dynamical Models for Human Motion
IEEE Transactions on Pattern Analysis and Machine Intelligence
IPSN '08 Proceedings of the 7th international conference on Information processing in sensor networks
An Algorithm for Transfer Learning in a Heterogeneous Environment
ECML PKDD '08 Proceedings of the 2008 European Conference on Machine Learning and Knowledge Discovery in Databases - Part I
The Journal of Machine Learning Research
Gaussian process modelling of latent chemical species
Bioinformatics
Convex multi-task feature learning
Machine Learning
Joint covariate selection and joint subspace selection for multiple classification problems
Statistics and Computing
IEEE Transactions on Knowledge and Data Engineering
Analysis of some methods for reduced rank Gaussian process regression
Switching and Learning in Feedback Systems
Transformations of Gaussian process priors
Proceedings of the First international conference on Deterministic and Statistical Methods in Machine Learning
Bayesian Multitask Classification With Gaussian Process Priors
IEEE Transactions on Neural Networks - Part 1
Learning output kernels for multi-task problems
Neurocomputing
Kernel methods are among the most popular techniques in machine learning. From a regularization perspective, they play a central role because they provide a natural choice for the hypothesis space and the regularization functional through the notion of reproducing kernel Hilbert spaces. From a probabilistic perspective, they are key in the context of Gaussian processes, where the kernel function is known as the covariance function. Traditionally, kernel methods have been used in supervised learning problems with scalar outputs, and a considerable amount of work has been devoted to designing and learning kernels in that setting. More recently, there has been increasing interest in methods that deal with multiple outputs, motivated in part by frameworks such as multitask learning. In this monograph, we review different methods to design or learn valid kernel functions for multiple outputs, paying particular attention to the connection between probabilistic and functional methods.
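One common way to build a valid kernel for multiple outputs, discussed across the works listed above, is the separable (intrinsic coregionalization) construction: a scalar kernel on the inputs is combined with a positive semi-definite coregionalization matrix over the outputs via a Kronecker product. The sketch below, assuming NumPy, is purely illustrative (the function names `rbf_kernel` and `icm_kernel` are our own, not from the monograph) and shows that the resulting multi-output Gram matrix remains symmetric positive semi-definite.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0):
    # Scalar squared-exponential kernel k(x, x') = exp(-(x - x')^2 / (2 l^2))
    # on one-dimensional inputs; returns an (n1, n2) Gram matrix.
    d = X1[:, None] - X2[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def icm_kernel(X1, X2, B, lengthscale=1.0):
    # Separable multi-output kernel: K((x, d), (x', d')) = B[d, d'] * k(x, x').
    # B is a D x D positive semi-definite coregionalization matrix encoding
    # correlations between the D outputs. Stacking outputs block-wise gives
    # the (n1 * D) x (n2 * D) Gram matrix as a Kronecker product.
    return np.kron(B, rbf_kernel(X1, X2, lengthscale))

# Illustrative usage: two correlated outputs observed on a shared input grid.
X = np.linspace(0.0, 1.0, 5)
B = np.array([[1.0, 0.8],
              [0.8, 1.0]])   # PSD: eigenvalues 0.2 and 1.8
K = icm_kernel(X, X, B)
```

Since `B` is positive semi-definite and the scalar Gram matrix is positive semi-definite, their Kronecker product is as well, so `K` can serve directly as the covariance matrix of a multi-output Gaussian process prior.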