Communications of the ACM
Information Processing Letters
Quantifying inductive bias: AI learning algorithms and Valiant's learning framework
Artificial Intelligence
A general lower bound on the number of examples needed for learning
Information and Computation
Learnability and the Vapnik-Chervonenkis dimension
Journal of the ACM (JACM)
On the necessity of Occam algorithms
STOC '90 Proceedings of the twenty-second annual ACM symposium on Theory of computing
Computational learning theory: an introduction
Linear approximation of shortest superstrings
Journal of the ACM (JACM)
Predicting {0, 1}-functions on randomly drawn points
Information and Computation
An introduction to Kolmogorov complexity and its applications (2nd ed.)
Towards Representation Independence in PAC Learning
AII '89 Proceedings of the International Workshop on Analogical and Inductive Inference
Towards a DNA sequencing theory (learning a string)
SFCS '90 Proceedings of the 31st Annual Symposium on Foundations of Computer Science
We provide a new representation-independent formulation of Occam's razor theorem, based on Kolmogorov complexity. This new formulation allows us to:
- Obtain better sample complexity than both the length-based [4] and VC-based [3] versions of Occam's razor theorem in many applications.
- Achieve a sharper reverse of Occam's razor theorem than that of [5]. Specifically, we weaken the assumptions made in [5] and extend the reverse to superpolynomial running times.
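For context, the classical length-based Occam's razor theorem of Blumer et al. [4], against which the abstract's sample-complexity improvement is measured, can be stated roughly as follows (this is standard background, not the paper's new Kolmogorov-complexity-based bound). An (a, b)-Occam algorithm that, given m examples of a target concept of size n, outputs a consistent hypothesis of bit-length at most n^a m^b with b < 1 is a PAC learner with sample complexity on the order of

```latex
m \;=\; O\!\left(\frac{1}{\varepsilon}\,\log\frac{1}{\delta}
  \;+\; \left(\frac{n^{a}}{\varepsilon}\right)^{\!\frac{1}{1-b}}\right)
```

where $\varepsilon$ and $\delta$ are the usual PAC accuracy and confidence parameters; the representation-independent formulation replaces the syntactic hypothesis length in this bound with Kolmogorov complexity.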