Neural network (NN) based modeling often requires trying multiple networks with different architectures and training parameters in order to achieve acceptable model accuracy. Typically, only one of the trained networks is selected as “best” and the rest are discarded. The authors propose using optimal linear combinations (OLCs) of the outputs of a set of trained NNs as an alternative to using a single network. Modeling accuracy is measured by mean squared error (MSE) with respect to the distribution of random inputs. Optimality is defined by minimizing the MSE, and the resulting combination is referred to as the MSE-OLC. The authors formulate the MSE-OLC problem for trained NNs and derive two closed-form expressions for the optimal combination weights. An example illustrating a significant improvement in model accuracy from using MSE-OLCs of the trained networks is included.
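Under the MSE criterion, the optimal combination weights amount to a least-squares fit of the target on the stacked network outputs (here with a constant term added), so they can be computed in closed form from the normal equations. A minimal sketch of this idea, using small random-feature models as stand-ins for trained NNs; the toy target function and all names are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression problem (illustrative choice).
x = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(np.pi * x[:, 0]) + 0.1 * rng.normal(size=200)

def make_net(width, seed):
    """A fixed random-feature model fit by least squares: a cheap
    stand-in for NNs trained with different architectures/seeds."""
    r = np.random.default_rng(seed)
    W = r.normal(size=(1, width))
    b = r.normal(size=width)
    H = np.tanh(x @ W + b)                         # hidden activations
    coef, *_ = np.linalg.lstsq(H, y, rcond=None)
    return lambda z: np.tanh(z @ W + b) @ coef

nets = [make_net(w, s) for w, s in [(3, 1), (5, 2), (8, 3)]]

# MSE-OLC: stack the network outputs (plus a constant column) and
# solve the normal equations for the combination weights.
F = np.column_stack([net(x) for net in nets] + [np.ones(len(x))])
w = np.linalg.solve(F.T @ F, F.T @ y)              # optimal weights
combined = F @ w

mse_single = min(np.mean((net(x) - y) ** 2) for net in nets)
mse_olc = np.mean((combined - y) ** 2)
# In-sample, the OLC can never do worse than the best single network,
# since each single network lies in the span of the combination.
assert mse_olc <= mse_single + 1e-9
```

Because each individual network corresponds to a feasible weight vector (weight 1 on that network, 0 elsewhere), the in-sample MSE of the combination is never worse than that of the best single network; the paper's contribution is the formulation and closed-form weights for trained networks under the input distribution.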