We introduce a Forward Backward and Model Selection algorithm (FBMS) for constructing a hybrid regression network of radial and perceptron hidden units. The algorithm determines whether a radial or a perceptron unit is required in a given region of input space and, given an error target, also determines the number of hidden units. It then applies model selection criteria to prune unnecessary weights. The result is a final architecture that is often much smaller than an RBF network or an MLP. Results for various data sizes on the Pumadyn data indicate that the resulting architecture competes with, and often outperforms, the best known results for this data set.
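To make the forward stage of such a hybrid construction concrete, the sketch below greedily grows a regression network: at each step it fits both a candidate Gaussian radial unit and a candidate sigmoidal perceptron unit to the current residual and keeps whichever reduces the error more, stopping at an error target. This is a minimal illustration under stated assumptions, not the paper's implementation; the function names, the random-restart candidate search, and the remark about BIC-based backward pruning are all assumptions introduced here.

```python
import numpy as np


def rbf_unit(X, center, width):
    # Gaussian radial unit response
    return np.exp(-np.sum((X - center) ** 2, axis=1) / (2.0 * width ** 2))


def perceptron_unit(X, w, b):
    # sigmoidal projection (perceptron) unit response
    return np.tanh(X @ w + b)


def fit_candidate(X, residual, kind, rng, n_restarts=25):
    """Draw random parameters for a unit of the given kind and keep the
    restart whose least-squares fit to the current residual is best.
    (Illustrative strategy; the paper's fitting procedure may differ.)"""
    best = None
    for _ in range(n_restarts):
        if kind == "rbf":
            center = X[rng.integers(len(X))]
            width = rng.uniform(0.2, 2.0)
            h = rbf_unit(X, center, width)
        else:
            w, b = rng.normal(size=X.shape[1]), rng.normal()
            h = perceptron_unit(X, w, b)
        alpha = (h @ residual) / (h @ h + 1e-12)  # output weight for this unit
        err = np.mean((residual - alpha * h) ** 2)
        if best is None or err < best[0]:
            best = (err, h, alpha)
    return best


def forward_select(X, y, error_target=0.05, max_units=30, seed=0):
    """Greedily add whichever unit type (radial or perceptron) most reduces
    the residual error, until the error target or unit budget is reached.
    A backward pass would then delete units or weights whose removal
    improves a model-selection criterion such as BIC."""
    rng = np.random.default_rng(seed)
    pred = np.full(len(X), y.mean())
    kinds = []
    for _ in range(max_units):
        residual = y - pred
        if np.mean(residual ** 2) < error_target:
            break
        cands = {k: fit_candidate(X, residual, k, rng) for k in ("rbf", "perceptron")}
        kind = min(cands, key=lambda k: cands[k][0])  # pick the better unit type
        err, h, alpha = cands[kind]
        kinds.append(kind)
        pred = pred + alpha * h
    return kinds, pred


# Toy usage: a target with a localized bump (suits a radial unit) plus a
# global linear trend (suits a perceptron unit).
X = np.random.default_rng(1).uniform(-2, 2, size=(200, 2))
y = np.exp(-np.sum(X ** 2, axis=1)) + 0.5 * X[:, 0]
kinds, pred = forward_select(X, y)
print(kinds, float(np.mean((y - pred) ** 2)))
```

The per-region choice between unit types falls out of the greedy comparison: a localized residual is captured better by a radial unit, while a residual that varies along a projection direction is captured better by a perceptron unit.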