Multi-output regression aims to learn a mapping from a multivariate input feature space to a multivariate output space. Despite its potential usefulness, the standard formulation of the least-squares support vector regression machine (LS-SVR) cannot handle the multi-output case. The usual workaround is to train multiple independent LS-SVR models, which disregards the underlying (potentially nonlinear) cross-relatedness among the different outputs. To address this problem, and inspired by multi-task learning methods, this study proposes a novel approach, multi-output LS-SVR (MLS-SVR), for the multi-output setting. A more efficient training algorithm is also given. Finally, extensive experimental results validate the effectiveness of the proposed approach.
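The abstract does not spell out the MLS-SVR formulation itself, but the independent-training baseline it criticizes can be sketched concretely. The following is a minimal, hypothetical illustration (function names and hyperparameters are assumptions, not the paper's): a single-output LS-SVR in dual variables, obtained by solving the usual LS-SVM linear system with an RBF kernel, fitted separately to each output column.

```python
import numpy as np

def rbf_kernel(A, B, sigma=1.0):
    """Gaussian (RBF) kernel matrix between row-sample matrices A and B."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvr_fit(X, y, gamma=1.0, sigma=1.0):
    """Fit one single-output LS-SVR in dual variables.

    Solves the standard LS-SVM linear system
        [ 0   1^T        ] [ b     ]   [ 0 ]
        [ 1   K + I/gamma] [ alpha ] = [ y ]
    where K is the RBF kernel matrix on the training inputs.
    """
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
    b, alpha = sol[0], sol[1:]
    return alpha, b

def lssvr_predict(X_train, alpha, b, X_new, sigma=1.0):
    """Predict with the dual solution: f(x) = sum_i alpha_i k(x, x_i) + b."""
    return rbf_kernel(X_new, X_train, sigma) @ alpha + b

def independent_multioutput_fit(X, Y, gamma=1.0, sigma=1.0):
    """Baseline: one independent LS-SVR per output column of Y."""
    return [lssvr_fit(X, Y[:, j], gamma, sigma) for j in range(Y.shape[1])]
```

Each output here gets its own dual variables and bias, so any correlation between outputs is ignored; the paper's MLS-SVR replaces this with a joint formulation that shares information across outputs.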