Crisp and fuzzy-logic rules are widely used for comprehensible representation of data, but rules based on similarity to prototypes are equally useful and much less known. Similarity-based methods are among the most accurate data mining approaches. A large group of such methods is based on instance selection and optimization, with the Learning Vector Quantization (LVQ) algorithm being a prominent example. The accuracy of LVQ depends strongly on proper initialization of the prototypes and on the optimization mechanism. This paper introduces prototype initialization based on context-dependent clustering and a modification of the LVQ cost function that uses additional information about the class-dependent distribution of training vectors. The approach is illustrated on several benchmark datasets, yielding simple and accurate models of data in the form of prototype-based rules.
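For context, the baseline LVQ1 update that such prototype-optimization methods build on can be sketched as follows. This is a minimal illustration of plain LVQ1, not the paper's modified cost function or its context-dependent initialization; all function and parameter names here are our own.

```python
import numpy as np

def lvq1_train(X, y, prototypes, proto_labels, lr=0.1, epochs=20):
    """Plain LVQ1: for each training vector, find the nearest prototype
    and move it toward the vector if the class labels match, away otherwise."""
    P = np.asarray(prototypes, dtype=float).copy()
    for epoch in range(epochs):
        rate = lr * (1.0 - epoch / epochs)          # linearly decaying learning rate
        for x, label in zip(X, y):
            d = np.linalg.norm(P - x, axis=1)       # distances to all prototypes
            j = int(np.argmin(d))                    # winner (nearest prototype)
            sign = 1.0 if proto_labels[j] == label else -1.0
            P[j] += sign * rate * (x - P[j])         # attract or repel the winner
    return P

def nearest_prototype_classify(X, prototypes, proto_labels):
    """Prototype-based rule: assign each vector the label of its nearest prototype."""
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    return np.asarray(proto_labels)[np.argmin(d, axis=1)]
```

The resulting prototypes act as a compact rule set ("if x is most similar to prototype P_j, assign class of P_j"), which is why their placement, and hence initialization, matters so much for accuracy.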