We present data-dependent error bounds for transductive learning based on the transductive Rademacher complexity. For several specific algorithms, we bound this complexity using their "unlabeled-labeled" decomposition, a technique that applies to many current and practical graph-based algorithms. Finally, we derive a new PAC-Bayesian bound for mixtures of transductive algorithms from our Rademacher bounds.
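As a rough illustration (not code from the paper), the transductive Rademacher complexity of a finite hypothesis set can be estimated by Monte Carlo. Here each hypothesis is represented as a vector of soft labels over all m labeled and u unlabeled points, and the Rademacher variables take values +1/-1 each with probability p = mu/(m+u)^2 and 0 otherwise; the function name and hypothesis-matrix encoding are my own assumptions for this sketch.

```python
import numpy as np

def transductive_rademacher(H, m, u, n_samples=2000, seed=0):
    """Monte Carlo sketch of the transductive Rademacher complexity
    (1/m + 1/u) * E_sigma[ sup_{h in H} sigma . h ], where each sigma_i
    is +1 or -1 with probability p = m*u/(m+u)^2 each, and 0 otherwise.

    H: array of shape (num_hypotheses, m+u); row i holds hypothesis i's
       soft labels on all m+u points (a finite hypothesis set for illustration).
    """
    rng = np.random.default_rng(seed)
    n = m + u
    assert H.shape[1] == n, "each hypothesis must label all m+u points"
    p = m * u / n**2
    total = 0.0
    for _ in range(n_samples):
        r = rng.random(n)
        # Sample sigma_i in {+1, -1, 0} with probabilities p, p, 1-2p.
        sigma = np.where(r < p, 1.0, np.where(r < 2 * p, -1.0, 0.0))
        total += np.max(H @ sigma)  # sup over the finite hypothesis set
    return (1.0 / m + 1.0 / u) * total / n_samples
```

For a hypothesis set containing only the zero labeling the estimate is exactly zero, and adding more hypotheses can only increase it, matching the monotonicity one expects of a complexity measure.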