This paper gives PAC guarantees for "Bayesian" algorithms, i.e., algorithms that optimize risk-minimization expressions involving a prior probability and a likelihood for the training data. PAC-Bayesian algorithms are motivated by the desire to provide an informative prior encoding information about the expected experimental setting while still carrying PAC performance guarantees over all IID settings. The PAC-Bayesian theorems given here apply to an arbitrary prior measure on an arbitrary concept space. These theorems provide an alternative to the use of VC dimension in proving PAC bounds for parameterized concepts.
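For illustration, a representative PAC-Bayes bound of this flavor can be stated as follows. This is a commonly quoted form rather than a theorem transcribed from the paper itself; the exact constants and logarithmic factors vary across statements in the literature.

```latex
% For any prior P over a concept space, with probability at least
% 1 - \delta over an IID sample S of size m, simultaneously for
% every posterior Q:
\Pr_{S \sim D^m}\!\left[\;\forall Q:\;
  \mathbb{E}_{h \sim Q}\!\big[\mathrm{err}_D(h)\big]
  \;\le\;
  \mathbb{E}_{h \sim Q}\!\big[\widehat{\mathrm{err}}_S(h)\big]
  \;+\;
  \sqrt{\frac{\mathrm{KL}(Q \,\|\, P) + \ln\frac{m}{\delta}}{2(m-1)}}
\;\right] \;\ge\; 1 - \delta
```

Note how the complexity term is the KL divergence between the posterior $Q$ and the prior $P$, rather than a VC-dimension quantity: this is what lets the bound apply to an arbitrary prior measure on an arbitrary concept space.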