The minimum-distance classifier summarizes each class with a prototype and then assigns a query to the class of its nearest prototype. The original classifier has three drawbacks: it cannot handle symbolic attributes, it cannot weight attributes, and it cannot learn more than a single prototype per class. The proposed solutions are to define a mean for symbolic attributes, to introduce an attribute-weighting metric, and to learn several prototypes per class. The resulting learning algorithm, SNMC, increases classification accuracy by 10% over the original minimum-distance classifier and achieves higher average generalization accuracy than both C4.5 and PEBLS on 20 domains from the UCI data repository.
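The baseline scheme described above can be sketched in a few lines. This is a minimal illustration of the classic minimum-distance classifier only (one mean prototype per class, Euclidean distance), not the SNMC algorithm itself; the class and method names are illustrative.

```python
import numpy as np

class MinimumDistanceClassifier:
    """Sketch of the classic minimum-distance classifier: each class
    is summarized by the mean of its training instances (a single
    prototype), and a query is assigned the class of the nearest
    prototype."""

    def fit(self, X, y):
        X, y = np.asarray(X, dtype=float), np.asarray(y)
        self.classes_ = np.unique(y)
        # One prototype per class: the per-class mean vector.
        self.prototypes_ = np.array(
            [X[y == c].mean(axis=0) for c in self.classes_]
        )
        return self

    def predict(self, X):
        X = np.asarray(X, dtype=float)
        # Euclidean distance from every query point to every prototype.
        d = np.linalg.norm(
            X[:, None, :] - self.prototypes_[None, :, :], axis=2
        )
        # Nearest prototype wins.
        return self.classes_[d.argmin(axis=1)]
```

The drawbacks noted above are visible here: `mean` is undefined for symbolic attributes (one would substitute a mode or frequency-based "mean"), all attributes contribute equally to the distance (no weighting), and a class whose instances form several clusters is poorly served by its single mean prototype.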