Nearest prototype methods can be quite successful on many pattern classification problems. In these methods, a collection of prototypes must be found that accurately represents the input patterns; the classifier then assigns classes based on the nearest prototype in this collection. In this paper, we first use the standard particle swarm optimization (PSO) algorithm to find those prototypes. Second, we present a new algorithm, called adaptive Michigan PSO (AMPSO), that reduces the dimension of the search space and provides more flexibility than the former in this application. AMPSO takes a different approach to particle swarms: each particle in the swarm represents a single prototype in the solution. The swarm does not converge to a single solution; instead, each particle is a local classifier, and the whole swarm is taken as the solution to the problem. It uses modified PSO equations with both particle competition and cooperation and a dynamic neighborhood. As an additional feature, in AMPSO, the number of prototypes represented in the swarm adapts to the problem, increasing as needed both the number of prototypes and the classes of the prototypes that make up the solution. We compared the standard PSO and AMPSO on several benchmark problems from the University of California, Irvine, data sets and found that AMPSO always found a better solution than the standard PSO. We also found that it improved on the results of nearest-neighbor classifiers, and it is competitive with some of the algorithms most commonly used for classification.
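To make the classification step concrete, the following is a minimal sketch of a nearest-prototype classifier: given a set of labeled prototypes (however they were found, e.g. by PSO), an input is assigned the class of its closest prototype. The function name and data layout are illustrative, not the paper's AMPSO implementation.

```python
import math

def nearest_prototype_classify(x, prototypes):
    """Assign x the class of its nearest prototype.

    `prototypes` is a list of (vector, class_label) pairs. This is a
    generic sketch of the nearest-prototype rule, not AMPSO itself,
    which evolves the prototype set with a modified particle swarm.
    """
    best_label, best_dist = None, float("inf")
    for p, label in prototypes:
        d = math.dist(x, p)  # Euclidean distance between x and prototype p
        if d < best_dist:
            best_dist, best_label = d, label
    return best_label

# Example: two prototypes, one per class.
protos = [((0.0, 0.0), "a"), ((5.0, 5.0), "b")]
print(nearest_prototype_classify((1.0, 1.0), protos))  # -> a
```

In the Michigan approach described above, each particle would hold one such `(vector, class_label)` pair, and the swarm as a whole plays the role of `prototypes`.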