Gaussian fields (GF) have recently received considerable attention for dimension reduction and semi-supervised classification. In this paper we show how the GF framework can be used for semi-supervised regression on high-dimensional data. We propose an active learning strategy based on entropy minimization and a maximum likelihood model selection method. Furthermore, we show how a recent generalization of the LLE algorithm for correspondence learning can be cast into the GF framework, which obviates the need to choose a representation dimensionality.
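To make the setting concrete, the core GF computation can be sketched as a harmonic-function solve on a similarity graph: edge weights are Gaussian in the pairwise distances, and the values at unlabeled points are obtained by solving a linear system in the graph Laplacian so that each prediction is a weighted average of its neighbors. This is a minimal illustration of the general framework, not the paper's full method (it omits the active learning and model selection components); the function name, the fully connected graph, and the fixed bandwidth `sigma` are assumptions for the sketch.

```python
import numpy as np

def gaussian_field_regression(X, y_labeled, labeled_idx, sigma=1.0):
    """Semi-supervised regression via a Gaussian field (harmonic solution).

    Sketch under simplifying assumptions: a fully connected graph with
    weights W_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)); unlabeled values
    f_u solve L_uu f_u = W_ul y_l, where L = D - W is the graph Laplacian.
    """
    n = X.shape[0]
    # Pairwise squared distances and Gaussian edge weights (no self-loops)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W  # combinatorial graph Laplacian

    unlabeled_idx = np.setdiff1d(np.arange(n), labeled_idx)
    L_uu = L[np.ix_(unlabeled_idx, unlabeled_idx)]
    W_ul = W[np.ix_(unlabeled_idx, labeled_idx)]
    f_u = np.linalg.solve(L_uu, W_ul @ y_labeled)

    # Labeled points keep their observed values; unlabeled get the solve
    f = np.empty(n)
    f[labeled_idx] = y_labeled
    f[unlabeled_idx] = f_u
    return f
```

On four colinear points with the endpoints labeled 0 and 3, the interior predictions interpolate monotonically between the two labels, which is the smoothness behavior the GF prior encodes.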