In pattern classification, much work has been devoted to designing good classifiers from different perspectives, and these methods achieve very good results in many domains. In general, however, they depend strongly on a few crucial design parameters. These parameters must be found by trial and error or by automatic methods, such as heuristic search or genetic algorithms, that significantly degrade the performance of the method. In nearest prototype approaches, for instance, the main parameters are the number of prototypes, the initial prototype set, and a smoothing parameter. This work introduces an Evolutionary approach to Nearest Prototype Classification (ENPC) that involves no such parameters, thus avoiding all the problems classical methods face in tuning and searching for appropriate values. The algorithm evolves a set of prototypes, each of which can execute several operators to increase its quality in a local sense, so that high classification accuracy emerges for the classifier as a whole. The approach has been tested on four classical domains, including artificial distributions such as spiral and uniformly distributed data sets, the Iris data set, and an application domain concerning diabetes. In all cases, the experiments show successful results, not only in classification accuracy but also in the number and distribution of the prototypes obtained.
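To make the classification rule concrete, the following is a minimal sketch of the nearest-prototype decision underlying such classifiers: each sample takes the label of its closest prototype under Euclidean distance. This illustrates only the generic decision rule; the ENPC-specific evolutionary operators that adjust the prototype set are not reproduced here, and all names and the toy data are illustrative.

```python
import numpy as np

def nearest_prototype_predict(X, prototypes, labels):
    """Assign each row of X the label of its nearest prototype
    (Euclidean distance). Generic 1-NN-over-prototypes rule."""
    # Pairwise squared distances: shape (n_samples, n_prototypes)
    d2 = ((X[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=2)
    # Index of the closest prototype for each sample
    return labels[np.argmin(d2, axis=1)]

# Toy usage: two prototypes, one per class
protos = np.array([[0.0, 0.0], [1.0, 1.0]])
proto_labels = np.array([0, 1])
X = np.array([[0.1, -0.1], [0.9, 1.2]])
print(nearest_prototype_predict(X, protos, proto_labels))  # → [0 1]
```

An evolutionary variant would wrap this rule in a loop that scores prototypes by local classification quality and applies operators (e.g. splitting, moving, or removing prototypes) to improve that score.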