In this paper, we present the evolution of adaptive resonance theory (ART) neural network architectures (classifiers) using a multiobjective optimization approach. In particular, we propose a multiobjective evolutionary approach that simultaneously evolves the weights and the topology of three well-known ART architectures: fuzzy ARTMAP (FAM), ellipsoidal ARTMAP (EAM), and Gaussian ARTMAP (GAM). We refer to the resulting architectures as MO-GFAM, MO-GEAM, and MO-GGAM, and collectively as MO-GART. The major advantage of MO-GART is that it produces a set of solutions for the classification problem at hand that trade off accuracy on unseen data (generalization) against size (number of categories created). MO-GART is shown to be more elegant (it does not require user intervention to define the network parameters), more effective (better accuracy and smaller size), and more efficient (faster to produce the solution networks) than other ART neural network architectures that have appeared in the literature. Furthermore, MO-GART is shown to be competitive with other popular classifiers, such as classification and regression trees (CART) and support vector machines (SVMs).
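The trade-off set the abstract describes is the Pareto front over the two objectives, error on unseen data and network size, with a candidate kept only if no other candidate is at least as good on both objectives and strictly better on one. The sketch below illustrates that selection step in isolation; it is not the paper's implementation, and the population values are hypothetical.

```python
# Illustrative sketch (assumed, not MO-GART's actual code) of the
# Pareto-dominance filter behind the multiobjective selection: each
# candidate ART network is scored as (error, size), both minimized.

def dominates(a, b):
    """True if candidate a is no worse than b on both objectives
    and strictly better on at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(candidates):
    """Return the non-dominated subset of (error, size) candidates."""
    return [c for c in candidates
            if not any(dominates(other, c) for other in candidates if other != c)]

# Hypothetical population: (error rate on unseen data, number of categories)
population = [(0.08, 40), (0.10, 25), (0.08, 30), (0.15, 10), (0.12, 25)]
front = pareto_front(population)
# (0.08, 40) is dominated by (0.08, 30), and (0.12, 25) by (0.10, 25);
# the surviving candidates form the accuracy/size trade-off curve.
```

Rather than forcing one "best" network, this filter hands the user the whole front, which is what lets MO-GART return several networks of different merit for the same problem.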