The regularization functional induced by the graph Laplacian of a random neighborhood graph built on the data is adaptive in two ways: first, it adapts to an underlying manifold structure, and second, to the density of the data-generating probability measure. In this paper we identify the limit of the regularizer and show uniform convergence over the space of Hölder functions. As an intermediate step we derive upper bounds on the covering numbers of Hölder functions on compact Riemannian manifolds; these bounds are of independent interest for the theoretical analysis of manifold-based learning methods.
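To make the object of study concrete, the following sketch computes an empirical graph-Laplacian regularizer on a sample: a fully connected neighborhood graph with Gaussian weights at bandwidth `h`, and the quadratic form summing weighted squared differences of a function over all edges. The weight kernel, the `1/(n^2 h^2)` normalization, and the function name `graph_laplacian_regularizer` are illustrative choices, not the exact constants or normalization used in the paper's analysis.

```python
import numpy as np

def graph_laplacian_regularizer(X, f, h):
    """Empirical regularizer  S_n(f) = (1 / (n^2 h^2)) * sum_{i,j} w_ij (f_i - f_j)^2
    on a fully connected neighborhood graph with Gaussian weights
    w_ij = exp(-||x_i - x_j||^2 / (4 h^2)).  Normalization is illustrative.

    X : (n, d) array of sample points
    f : (n,)   array of function values at the samples
    h : bandwidth of the neighborhood graph
    """
    n = X.shape[0]
    # Pairwise squared Euclidean distances.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # Gaussian edge weights; no self-loops.
    W = np.exp(-sq_dists / (4.0 * h ** 2))
    np.fill_diagonal(W, 0.0)
    # Quadratic form of the (unnormalized) graph Laplacian.
    diffs = f[:, None] - f[None, :]
    return np.sum(W * diffs ** 2) / (n ** 2 * h ** 2)
```

A constant function has zero regularizer (the graph Laplacian annihilates constants), while any non-constant function on a connected weighted graph yields a strictly positive value; the paper's results concern the behavior of such functionals as `n → ∞` and `h → 0`.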