This paper has three main goals: (i) to employ two classes of algorithms, bio-inspired and gradient-based, to train multi-layer perceptron (MLP) neural networks for pattern classification; (ii) to combine the trained neural networks into ensembles of classifiers; and (iii) to investigate the influence of diversity on the classification performance of individual classifiers and of ensembles. The optimization version of an artificial immune network, named opt-aiNet, particle swarm optimization (PSO) and an evolutionary algorithm (EA) are the bio-inspired methods used to train the MLP networks. In addition, standard backpropagation with momentum (BPM), the Davidon-Fletcher-Powell (DFP) quasi-Newton method and a modified scaled conjugate gradient method (SCGM) are the gradient-based algorithms used in this work. All training methods are compared in terms of classification accuracy and diversity of the solutions found. The results suggest that most bio-inspired algorithms lose diversity among solutions during the search, whereas immune-based methods, such as opt-aiNet, and multiple initializations of standard gradient-based algorithms yield diverse solutions that lead to good classification accuracy for the ensembles.
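To illustrate goal (ii), combining independently trained classifiers into an ensemble, the sketch below shows a plain majority-vote combination of class predictions. This is a minimal, generic illustration, not the paper's specific combination scheme; the member predictions are hypothetical integer class labels.

```python
import numpy as np

def majority_vote(predictions):
    """Combine class predictions from several classifiers by majority vote.

    predictions: array-like of shape (n_classifiers, n_samples) holding
    integer class labels. Returns one label per sample; ties are broken
    in favor of the lowest label (argmax convention).
    """
    predictions = np.asarray(predictions)
    n_classes = predictions.max() + 1
    # Count the votes each class receives for every sample (column).
    votes = np.apply_along_axis(np.bincount, 0, predictions,
                                minlength=n_classes)
    # Pick the most-voted class per sample.
    return votes.argmax(axis=0)

# Three hypothetical ensemble members that disagree on some samples.
member_preds = [
    [0, 1, 1, 2],
    [0, 1, 2, 2],
    [1, 1, 2, 0],
]
print(majority_vote(member_preds).tolist())  # -> [0, 1, 2, 2]
```

The intuition behind goal (iii) is visible even in this toy case: the vote can only correct a member's mistake when the members disagree, which is why diversity among the trained networks matters for ensemble accuracy.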