Multi-output regression aims at learning a mapping from an input feature space to a multivariate output space. Previous algorithms define their loss functions using a fixed global coordinate system of the output space, which is equivalent to assuming that the output space is the whole Euclidean space whose dimension equals the number of outputs. As a result, the underlying structure of the output space is completely ignored. In this paper, we instead treat the output space as a Riemannian submanifold, so that its geometric structure can be incorporated into the regression process. To this end, we propose a novel mechanism, called locally linear transformation (LLT), to define the loss functions on the output manifold. In this way, existing regression algorithms can be improved. In particular, we propose an algorithm under the support vector regression framework. Experimental results on synthetic and real-life data are satisfactory.
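The abstract does not spell out how a loss can be defined on the output manifold rather than in fixed global coordinates, so the following is a minimal, illustrative sketch of one plausible reading of that idea (not the authors' LLT mechanism): the residual between a predicted and a true output is re-expressed in a local tangent frame estimated by local PCA over neighbouring output samples. All function names, the local-PCA construction, the neighbourhood size k, and the intrinsic dimension d are assumptions introduced here for illustration.

```python
import numpy as np


def local_tangent_frame(Y, i, k=10, d=1):
    """Estimate an orthonormal tangent basis at output sample Y[i].

    Y : (n, D) array of training outputs assumed to lie near a d-dimensional
        submanifold of R^D.  The basis consists of the top-d principal
        directions of the k nearest neighbours of Y[i] (a common heuristic,
        assumed here, not taken from the paper).
    """
    dists = np.linalg.norm(Y - Y[i], axis=1)
    nbrs = Y[np.argsort(dists)[:k]]
    centered = nbrs - nbrs.mean(axis=0)
    # Right singular vectors give the principal directions of the local patch.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return vt[:d]  # (d, D) tangent basis


def euclidean_loss(y_pred, y_true):
    """Conventional squared loss in the fixed global coordinates."""
    return np.sum((y_pred - y_true) ** 2)


def tangent_loss(y_pred, y_true, basis):
    """Squared loss after a locally linear change of coordinates: the residual
    is projected onto the tangent frame at y_true, so components pointing off
    the manifold are discounted."""
    residual = y_pred - y_true
    return np.sum((basis @ residual) ** 2)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy outputs lying near a 1-D curve (a noisy circle) embedded in R^2.
    t = rng.uniform(0, 2 * np.pi, size=200)
    Y = np.c_[np.cos(t), np.sin(t)] + 0.01 * rng.normal(size=(200, 2))

    i = 0
    basis = local_tangent_frame(Y, i, k=15, d=1)
    y_pred = Y[i] + np.array([0.1, 0.1])  # hypothetical prediction error

    print("Euclidean loss:", euclidean_loss(y_pred, Y[i]))
    print("Tangent loss  :", tangent_loss(y_pred, Y[i], basis))
```

In this toy setting the tangent loss penalises only the part of the error that moves the prediction along the estimated output manifold; how such a locally defined loss is then plugged into a support vector regression objective is what the paper itself develops.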