In this paper, a novel information-geometric variable selection criterion for multi-layer perceptron networks is described. It is based on projections of the Riemannian manifold defined by a multi-layer perceptron onto submanifolds defined by multi-layer perceptrons with reduced input dimension. We show how the divergence between models can be used as a criterion for an efficient search of the space of networks with different inputs. We then show how the posterior probabilities of the models can be evaluated to rank the projected models. Finally, we test our algorithm on synthetic and real data, and compare its performance with other methods reported in the literature.
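To make the idea concrete, the sketch below ranks the inputs of a small trained MLP by the divergence incurred when each input is removed. The Riemannian projection onto the reduced-input submanifold is approximated crudely here by clamping the removed feature to its sample mean, and the divergence is a mean Bernoulli KL between the full and reduced output distributions; the network weights and the `rank_inputs_by_divergence` helper are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def mlp_probs(X, W1, b1, W2, b2):
    """Forward pass of a one-hidden-layer MLP with sigmoid output."""
    h = np.tanh(X @ W1 + b1)
    return sigmoid(h @ W2 + b2)

def kl_bernoulli(p, q, eps=1e-9):
    """Mean KL divergence between two sets of Bernoulli output probabilities."""
    p = np.clip(p, eps, 1.0 - eps)
    q = np.clip(q, eps, 1.0 - eps)
    return np.mean(p * np.log(p / q) + (1 - p) * np.log((1 - p) / (1 - q)))

def rank_inputs_by_divergence(X, W1, b1, W2, b2):
    """Score each input by the divergence its removal induces.

    "Removal" is approximated by clamping the feature to its sample mean
    (a crude stand-in for the projection onto the reduced-input model
    family); inputs whose removal barely changes the output distribution
    are candidates for elimination.
    """
    full = mlp_probs(X, W1, b1, W2, b2)
    scores = []
    for j in range(X.shape[1]):
        Xr = X.copy()
        Xr[:, j] = X[:, j].mean()  # project away feature j
        scores.append(kl_bernoulli(full, mlp_probs(Xr, W1, b1, W2, b2)))
    return np.argsort(scores)[::-1]  # most important input first
```

With hand-crafted weights that make the output depend only on the first input, the ranking places that input first and assigns near-zero divergence to the irrelevant ones.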