Pattern classification seeks to minimize the error on unknown patterns. In many real-world applications, however, type I (false positive) and type II (false negative) errors must be handled separately, which is a difficult problem because reducing one of them usually makes the other grow. Moreover, one error type may be more important than the other, so a trade-off that minimizes the most important error type must be reached. Despite the importance of type II errors, most pattern classification methods take into account only the global classification error. In this paper we propose to optimize both error types in classification by means of a multiobjective algorithm in which each error type and the network size are objectives of the fitness function. A modified version of the GProp method (optimization and design of multilayer perceptrons) is used to simultaneously optimize the network size and the type I and type II errors.
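The three objectives described above can be sketched as follows. This is a minimal illustration, not the GProp implementation: the function names, the use of hidden-unit count as the size objective, and the Pareto-dominance comparison between candidates are assumptions made for the example.

```python
# Illustrative sketch (not the paper's code): evaluate the three objectives
# named in the abstract -- type I error, type II error, network size -- and
# compare two candidate networks by Pareto dominance, minimizing all three.

def objectives(y_true, y_pred, n_hidden):
    """Return (type I error rate, type II error rate, network size).

    Labels are 0 (negative class) and 1 (positive class); n_hidden is a
    stand-in measure of network size, assumed here for illustration.
    """
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    negatives = sum(1 for t in y_true if t == 0)
    positives = sum(1 for t in y_true if t == 1)
    type1 = fp / negatives if negatives else 0.0  # false-positive rate
    type2 = fn / positives if positives else 0.0  # false-negative rate
    return (type1, type2, n_hidden)

def dominates(a, b):
    """True if objective vector a Pareto-dominates b: no worse in every
    objective and strictly better in at least one (all minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Example: a perfect classifier of equal size dominates an imperfect one.
imperfect = objectives([0, 0, 1, 1], [0, 1, 1, 0], n_hidden=4)  # (0.5, 0.5, 4)
perfect = objectives([0, 0, 1, 1], [0, 0, 1, 1], n_hidden=4)    # (0.0, 0.0, 4)
```

A multiobjective evolutionary algorithm would use such dominance comparisons to rank candidate networks rather than collapsing the three objectives into a single scalar error, which is what lets it expose the trade-off between the two error types instead of fixing it in advance.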