Communications of the ACM - Special issue on parallelism
Self-organization and associative memory: 3rd edition
Instance-Based Learning Algorithms
Machine Learning
Trading MIPS and memory for knowledge engineering
Communications of the ACM
A practical approach to feature selection
ML92 Proceedings of the ninth international workshop on Machine learning
C4.5: programs for machine learning
Artificial Intelligence Review - Special issue on lazy learning
Data mining: practical machine learning tools and techniques with Java implementations
A Batch Learning Vector Quantization Algorithm for Nearest Neighbour Classification
Neural Processing Letters
Neural Networks
Generating Accurate Rule Sets Without Global Optimization
ICML '98 Proceedings of the Fifteenth International Conference on Machine Learning
Evolutionary Design of Nearest Prototype Classifiers
Journal of Heuristics
Nearest prototype classification: clustering, genetic algorithms, or random search?
IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews
Soft nearest prototype classification
IEEE Transactions on Neural Networks
A new method for MR grayscale inhomogeneity correction
Artificial Intelligence Review
New spatial based MRI image de-noising algorithm
Artificial Intelligence Review
Hybrid random subsample classifier ensemble for high dimensional data sets
International Journal of Hybrid Intelligent Systems
Nearest prototype approaches are a common way to design classifiers. However, when data is noisy, the success of this kind of classifier depends on parameters that the designer must tune, such as the number of prototypes. In this work, we study the ENPC technique, based on the nearest prototype approach, on noisy datasets. Previous experimentation with this algorithm had shown that it requires no parameter tuning to obtain good solutions in problems where class boundaries are well defined and the data is not noisy. Here we show that the algorithm obtains solutions with high classification accuracy even when the data is noisy. A comparison with optimal (hand-made) solutions and with several other classification algorithms demonstrates the good performance of the ENPC algorithm, in both accuracy and number of prototypes, as the noise level increases. We performed experiments on four datasets, each with different characteristics.
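As a point of reference for the nearest prototype decision rule the abstract refers to (not the ENPC algorithm itself, whose evolutionary prototype-selection step is beyond this sketch), a classifier of this family labels each sample with the class of its closest prototype. A minimal sketch, assuming Euclidean distance and a given prototype set:

```python
import numpy as np

def nearest_prototype_predict(X, prototypes, labels):
    """Assign each sample the label of its nearest prototype (Euclidean distance)."""
    # Pairwise distances, shape (n_samples, n_prototypes)
    d = np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)
    # Index of the closest prototype for each sample
    return labels[np.argmin(d, axis=1)]

# Toy example: one prototype per class (hypothetical data, for illustration only)
prototypes = np.array([[0.0, 0.0], [1.0, 1.0]])
labels = np.array([0, 1])
X = np.array([[0.1, -0.2], [0.9, 1.1]])
print(nearest_prototype_predict(X, prototypes, labels))  # [0 1]
```

The number of prototypes per class is exactly the kind of parameter the abstract notes a designer would otherwise have to tune by hand.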