In this paper, a new classifier-fusion method is introduced that makes decisions based on the internal structure of the base classifiers. Among existing combination methods, several use the decision template as a tool for modeling the behavior of base classifiers when labeling data; this tool, however, captures their behavior only through their final outputs. The new method gives the decision template a special structure so that the internal behavior of a neural-network base classifier can be modeled in a form suitable for fusion: a decision template is built for each layer of the network, including all hidden layers. The decision-making process inside each base classifier therefore also becomes available to the fusion stage. The efficiency of the new method is evaluated on several well-known benchmark datasets to show how it improves the performance of classifier fusion.
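To make the idea concrete, the following is a minimal sketch (not the authors' exact algorithm) of per-layer decision templates for a single MLP base classifier. A classical decision template averages only the final output profile over each class's training samples; here the average is taken over every layer's activation vector, and a test sample is labeled by the class whose layer-wise templates are nearest. All function names, the tanh/softmax architecture, and the summed Euclidean distance are illustrative assumptions.

```python
import numpy as np

def layer_outputs(weights, biases, x):
    """Forward pass through a simple MLP, returning the activation
    vector of every layer (all hidden layers plus the softmax output)."""
    activations = []
    a = x
    for i, (W, b) in enumerate(zip(weights, biases)):
        z = a @ W + b
        if i < len(weights) - 1:
            a = np.tanh(z)           # hidden layer (assumed tanh units)
        else:
            e = np.exp(z - z.max())  # output layer: softmax profile
            a = e / e.sum()
        activations.append(a)
    return activations

def decision_templates(weights, biases, X, y, n_classes):
    """For each class, average the per-layer activations over that
    class's training samples -- one template per layer per class."""
    templates = []
    for c in range(n_classes):
        Xc = X[y == c]
        per_sample = [layer_outputs(weights, biases, x) for x in Xc]
        templates.append([np.mean([s[l] for s in per_sample], axis=0)
                          for l in range(len(weights))])
    return templates

def classify(weights, biases, templates, x):
    """Label x by the class whose layer-wise templates are closest,
    using summed Euclidean distance across all layers (an assumed
    similarity measure)."""
    acts = layer_outputs(weights, biases, x)
    dists = [sum(np.linalg.norm(a - t) for a, t in zip(acts, cls))
             for cls in templates]
    return int(np.argmin(dists))
```

In a fusion setting, each base network would contribute its own set of layer-wise templates, and the per-classifier distances would then be combined into a final decision.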