We study various ensemble methods for hybrid neural networks. The hybrid networks are composed of radial and projection units and are trained with a deterministic algorithm that completely defines the parameters of the network for a given data set. Thus, there is no random selection of the initial (and final) parameters as in other training algorithms. Network independence is instead achieved by using bootstrap and boosting methods, as well as random input sub-space sampling. The fusion methods are evaluated on several classification benchmark data sets. A novel MDL-based fusion method appears to reduce the variance of the classification scheme and is sometimes superior in its overall performance.
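The diversity mechanisms named above can be sketched in a minimal way: each ensemble member is trained on a bootstrap resample of the data and on a randomly chosen subset of the input features, and predictions are fused by majority vote. This is an illustrative sketch only; the base learner here is a trivial nearest-centroid classifier standing in for the paper's hybrid radial/projection networks, and the `train_ensemble`, `NearestCentroid`, and `predict` names are hypothetical, not from the paper.

```python
import random
from collections import Counter

class NearestCentroid:
    # Stand-in base learner; the paper's actual members are hybrid
    # radial/projection networks trained deterministically.
    def fit(self, X, y):
        sums, counts = {}, {}
        for xi, yi in zip(X, y):
            s = sums.setdefault(yi, [0.0] * len(xi))
            for j, v in enumerate(xi):
                s[j] += v
            counts[yi] = counts.get(yi, 0) + 1
        self.centroids = {c: [v / counts[c] for v in s] for c, s in sums.items()}
        return self

    def predict(self, x):
        # Assign the class whose centroid is nearest in squared distance.
        return min(self.centroids,
                   key=lambda c: sum((a - b) ** 2
                                     for a, b in zip(x, self.centroids[c])))

def train_ensemble(X, y, n_members=11, k=2, seed=0):
    """Build members on bootstrap resamples and random feature sub-spaces."""
    rng = random.Random(seed)
    members = []
    for _ in range(n_members):
        # Bootstrap: sample len(X) points with replacement.
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        # Random input sub-space: keep k of the original features.
        feats = sorted(rng.sample(range(len(X[0])), k))
        Xb = [[X[i][j] for j in feats] for i in idx]
        yb = [y[i] for i in idx]
        members.append((feats, NearestCentroid().fit(Xb, yb)))
    return members

def predict(members, x):
    # Fuse member outputs by simple majority vote.
    votes = Counter(m.predict([x[j] for j in feats]) for feats, m in members)
    return votes.most_common(1)[0][0]
```

Boosting and the MDL-based fusion discussed in the paper would replace the uniform vote with weighted combinations; the sketch above only illustrates how bootstrap and sub-space sampling inject diversity when the base training algorithm itself is deterministic.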