The influence of diversity in an immune-based algorithm to train MLP networks
ICARIS'07: Proceedings of the 6th International Conference on Artificial Immune Systems
This work applies two immune-inspired algorithms, opt-aiNet and omni-aiNet, to train multi-layer perceptrons (MLPs) for the construction of classifier ensembles. The main goal is to investigate how the diversity of the set of solutions generated by each algorithm influences performance, and whether these solutions lead to improvements when combined in ensembles. omni-aiNet is a multi-objective optimization algorithm and thus explicitly maximizes the components' diversity while minimizing their output errors. The opt-aiNet algorithm, by contrast, was originally designed for single-objective optimization and focuses on minimizing the classifiers' output error; however, its implicit diversity maintenance mechanism stimulates the generation of MLPs with different weights, which may result in diverse classifiers. The performances of opt-aiNet and omni-aiNet are compared with each other and with that of a second-order gradient-based algorithm, MSCG. The results show how the different diversity maintenance mechanisms of each algorithm influence the performance gain obtained with the use of ensembles.
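The two ingredients the abstract describes can be sketched concretely: the multi-objective view treats each trained MLP as a point in (error, diversity) space and keeps only non-dominated candidates, while the ensemble combines member predictions, e.g. by majority vote. The sketch below is illustrative only, under assumed conventions not taken from the paper (diversity is negated so both objectives are minimized; the function names `dominates` and `majority_vote` are hypothetical, not the authors' API):

```python
import numpy as np

def dominates(a, b):
    """Pareto dominance for objective vectors to be minimized, e.g.
    (output_error, -diversity): a dominates b if a is no worse in every
    objective and strictly better in at least one."""
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return bool(np.all(a <= b) and np.any(a < b))

def majority_vote(predictions):
    """Combine member classifier outputs into an ensemble prediction.

    predictions: array of shape (n_members, n_samples) holding integer
    class labels; returns the per-sample majority label."""
    predictions = np.asarray(predictions)
    return np.array([np.bincount(col).argmax() for col in predictions.T])

# A candidate with lower error AND higher diversity dominates:
print(dominates((0.10, -0.8), (0.20, -0.5)))   # True
print(dominates((0.20, -0.5), (0.10, -0.8)))   # False

# Three members voting over three samples:
print(majority_vote([[0, 1, 1],
                     [0, 1, 0],
                     [1, 1, 0]]))              # [0 1 0]
```

In a multi-objective trainer such as omni-aiNet, a dominance test of this kind is what lets error minimization and diversity maximization be pursued simultaneously rather than collapsed into a single weighted score.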