The present study aims to provide insight into the nature of incremental learning in the context of Gold's model of identification in the limit. Focusing on natural requirements such as consistency and conservativeness, incremental learning is analysed both for learning from positive examples only and for learning from both positive and negative examples. The results illustrate how different consistency and conservativeness demands affect the capabilities of incremental learners. They may serve as a first step towards characterising the structure of classes typically learnable incrementally, and thus towards developing uniform incremental learning methods.
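Informally, an incremental (iterative) learner keeps only its current hypothesis rather than the full history of examples; consistency requires the hypothesis to agree with the data processed so far, and conservativeness forbids a mind change unless the current hypothesis contradicts an example. A minimal sketch of such a learner for a toy indexed family (the family `L_i = {0, ..., i}` and all function names here are illustrative assumptions, not taken from the paper):

```python
# Sketch of a conservative, consistent *iterative* learner: each new
# hypothesis depends only on the previous hypothesis and the current
# example. Toy indexed family (an assumption for illustration):
#   L_i = {0, 1, ..., i}

def lang(i):
    """The i-th language of the toy family: {0, ..., i}."""
    return set(range(i + 1))

def update(h, x):
    """One incremental step.
    Conservative: keep h as long as x is consistent with lang(h).
    Consistent: the hypothesis always contains every example seen so
    far, even without storing past data, because taking the maximum
    is monotone."""
    if h is not None and x in lang(h):
        return h  # no mind change without contradicting evidence
    return x      # smallest index whose language contains x

def learn(text):
    """Run the learner on a finite text (sequence of positive examples)."""
    h = None
    for x in text:
        h = update(h, x)
    return h

# Any text for L_3 that eventually presents the element 3 converges
# to the correct index 3.
print(learn([0, 2, 1, 3, 2]))  # -> 3
```

On this family the learner identifies every `L_i` in the limit from positive data alone; the point of the sketch is that both demands are local checks on a single hypothesis, which is what makes the learner incremental.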