Online learning algorithms such as Winnow have received much attention in machine learning. Their performance degrades only logarithmically with the input dimension, which makes them attractive for large feature spaces such as those arising from relational theories. However, online first-order learners face an intrinsic computational barrier: even in the finite, function-free case, the number of possible features grows exponentially with the number of first-order atoms generated from the vocabulary. To circumvent this issue, we exploit the paradigm of closure-based learning, which allows the learner to focus on the features that lie in the closure space generated from the examples that have led to a mistake. Based on this idea, we develop an online algorithm for learning theories formed by disjunctions of existentially quantified conjunctions of atoms. In this setting, we show that the number of mistakes depends only logarithmically on the number of features. Furthermore, the computational cost is essentially bounded by the size of the closure lattice.
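To make the mistake-driven, closure-based idea concrete, the following is a minimal sketch in Python of a Winnow-style learner that only materialises weights for features of examples on which it errs. The class name, parameter defaults, and the way features are passed in are illustrative assumptions; the paper's actual closure-lattice construction over first-order atoms is abstracted away behind the `features` argument and is not reproduced here.

```python
# Illustrative Winnow-style online learner with lazily materialised weights.
# Assumption: each example is presented as a set of feature identifiers
# (standing in for the closure-based features generated from the example).

class LazyWinnow:
    """Multiplicative-update (Winnow-like) online learner.

    Every feature conceptually starts with weight 1.0, but a weight is only
    stored once the feature appears in an example that caused a mistake,
    mirroring the mistake-driven focus described in the abstract.
    """

    def __init__(self, alpha=2.0, threshold=10.0):
        self.alpha = alpha          # promotion/demotion factor (assumed value)
        self.threshold = threshold  # decision threshold (assumed value)
        self.weights = {}           # feature -> weight, default 1.0

    def predict(self, features):
        score = sum(self.weights.get(f, 1.0) for f in features)
        return score >= self.threshold

    def update(self, features, label):
        """Process one example online; return True if a mistake occurred."""
        prediction = self.predict(features)
        if prediction == label:
            return False            # no mistake: no weights are touched
        factor = self.alpha if label else 1.0 / self.alpha
        for f in features:          # only the mistaken example's features
            self.weights[f] = self.weights.get(f, 1.0) * factor
        return True


# Tiny usage example on a toy stream of (feature set, label) pairs.
learner = LazyWinnow(threshold=3.0)
stream = [({"p(a,b)", "q(b)"}, True), ({"q(b)"}, False)]
mistakes = sum(learner.update(feats, label) for feats, label in stream)
```

The dictionary-backed weights keep the per-round cost proportional to the number of features actually touched by mistaken examples rather than to the full, exponentially large feature space, which is the practical point of restricting attention to the closure space.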