This paper proposes a sparse modeling approach to solving ordinal regression problems with Gaussian processes (GPs). Designing a sparse GP model is important for reducing both training time and inference time. We first propose a variant of the Gaussian process ordinal regression (GPOR) approach, leave-one-out GPOR (LOO-GPOR), which performs model selection using the leave-one-out cross-validation (LOO-CV) technique. We then provide an approach to designing a sparse model for GPOR. The sparse GPOR model reduces computational time and storage requirements and provides faster inference. We compare the proposed approaches with the state-of-the-art GPOR approach on several benchmark data sets. Experimental results show that the proposed approaches are competitive.
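The core of the LOO-GPOR idea is selecting hyperparameters by leave-one-out cross-validation rather than marginal-likelihood maximization. The sketch below is a minimal illustration of that selection loop, not the paper's method: it uses plain GP regression with an RBF kernel as a stand-in for the GPOR ordinal likelihood, and all names, data, and the candidate grid are hypothetical.

```python
# Illustrative sketch only: LOO-CV hyperparameter selection with a
# GP-regression stand-in for GPOR (assumed setup, not the paper's model).
import numpy as np

def rbf_kernel(X1, X2, length_scale):
    # Squared-exponential kernel between two sets of points.
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / length_scale ** 2)

def loo_cv_score(X, y, length_scale, noise=1e-2):
    """Mean squared leave-one-out prediction error, computed fold by
    fold for clarity (a closed-form LOO expression also exists for GPs)."""
    n = len(y)
    errs = []
    for i in range(n):
        idx = np.r_[0:i, i + 1:n]                      # all points but i
        K = rbf_kernel(X[idx], X[idx], length_scale) + noise * np.eye(n - 1)
        k_star = rbf_kernel(X[i:i + 1], X[idx], length_scale)
        mu = k_star @ np.linalg.solve(K, y[idx])       # GP predictive mean
        errs.append((y[i] - mu.item()) ** 2)
    return float(np.mean(errs))

# Hypothetical data; pick the length scale with the lowest LOO error.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(30, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(30)
grid = [0.1, 0.5, 1.0, 2.0]
best = min(grid, key=lambda s: loo_cv_score(X, y, s))
```

In the paper's setting the predictive quantity would come from the GPOR likelihood over ordinal labels, but the model-selection loop, scoring each hyperparameter by held-out predictive error, has the same shape.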