In this paper we propose a theoretical model for the analysis of classification methods, in which the teacher knows the classification algorithm and chooses examples in the best way possible. We apply this model to the nearest-neighbor (NN) learning algorithm and develop upper and lower bounds on the sample complexity for several different concept classes. For some concept classes, the sample complexity turns out to be exponential even under this best-case model, which implies that the concept class is inherently difficult for the NN algorithm. We identify several geometric properties that make learning certain concepts relatively easy. Finally, we discuss the relation of our work to helpful-teacher models, its application to decision tree learning algorithms, and some of its implications for current experimental work.
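To make the setting concrete, the following is a minimal sketch, written for this summary and not taken from the paper, of best-case teaching of a 1-nearest-neighbor learner on a one-dimensional threshold concept. The concept "x >= t", the helper names, and the two-example teaching strategy are illustrative assumptions; the point is only that a teacher who knows the learner is 1-NN can place the decision boundary exactly at the threshold with two well-chosen examples.

    # Illustrative sketch (an assumption, not the authors' construction):
    # a helpful teacher who knows the learner is 1-NN teaches the concept
    # "x >= t" on the real line with just two examples straddling t.

    def nn_classify(train, x):
        """1-NN: return the label of the training example closest to x."""
        _, label = min(train, key=lambda ex: abs(ex[0] - x))
        return label

    def teach_threshold(t, eps=1e-3):
        """Teacher for 'x >= t': one negative just below t and one positive
        just above t put the 1-NN decision boundary exactly at t."""
        return [(t - eps, 0), (t + eps, 1)]

    if __name__ == "__main__":
        t = 0.7
        train = teach_threshold(t)
        # With only two well-chosen examples, 1-NN agrees with the target.
        for x in [0.0, 0.69, 0.71, 1.0]:
            assert nn_classify(train, x) == int(x >= t)
        print("1-NN matches the threshold concept on all test points")

For this simple class two examples suffice; the paper's point is that for other concept classes no comparably small teaching set exists for the NN learner, and the best-case sample complexity can be exponential.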