The mathematical foundations of learning machines
Reducing Communication for Distributed Learning in Neural Networks
ICANN '02 Proceedings of the International Conference on Artificial Neural Networks
Pattern Classification (2nd Edition)
Rectangular basis functions applied to imbalanced datasets
ICANN'07 Proceedings of the 17th International Conference on Artificial Neural Networks
Parallel perceptrons (PPs), a novel approach to committee machine training that requires minimal communication between the outputs and hidden units, allow the construction of efficient and stable nonlinear classifiers. In this work we explore how to improve their performance by allowing the output weights to take real values, computed by applying Fisher's linear discriminant analysis to the committee machine's perceptron outputs. We shall see that the final performance of the resulting classifiers is comparable to that of the more complex and costlier-to-train multilayer perceptrons.
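The core idea in the abstract, computing real-valued output weights by applying Fisher's linear discriminant to the hidden perceptrons' outputs, can be sketched in NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation: the toy data, the random (untrained) hidden perceptrons standing in for trained parallel perceptrons, and the small ridge term added for numerical stability are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (hypothetical; stands in for a real dataset).
n, d, h = 200, 5, 7
X = np.vstack([rng.normal(-1.0, 1.0, (n, d)),   # class 0
               rng.normal(+1.0, 1.0, (n, d))])  # class 1
y = np.r_[np.zeros(n), np.ones(n)]

# Hidden layer: h perceptrons with random weights, a stand-in
# for the trained parallel perceptrons of the paper.
W = rng.normal(size=(d, h))
H = np.sign(X @ W)  # +/-1 perceptron outputs

# Fisher's linear discriminant on the perceptron outputs:
# w = S_w^{-1} (m1 - m0), with S_w the within-class scatter.
m0, m1 = H[y == 0].mean(axis=0), H[y == 1].mean(axis=0)
Sw = np.cov(H[y == 0].T) + np.cov(H[y == 1].T)
w = np.linalg.solve(Sw + 1e-6 * np.eye(h), m1 - m0)  # ridge term assumed

# Classify by thresholding the projection at the midpoint of class means.
thresh = 0.5 * (m0 + m1) @ w
pred = (H @ w > thresh).astype(float)
acc = (pred == y).mean()
print(f"accuracy: {acc:.2f}")
```

The real-valued weight vector `w` replaces the fixed integer output weights of a plain parallel perceptron vote, which is the modification the abstract describes.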