Multilayer neural networks and Bayes decision theory
Neural Networks
Neural Computation
Bayesian decision theory on three-layer neural networks
Neurocomputing
Bayesian learning of neural networks adapted to changes of prior probabilities
ICANN'05 Proceedings of the 15th International Conference on Artificial Neural Networks: Formal Models and Their Applications - Volume Part II
Discriminant analysis by a neural network with Mahalanobis distance
ICANN'06 Proceedings of the 16th international conference on Artificial Neural Networks - Volume Part II
A new algorithm for learning Mahalanobis discriminant functions by a neural network
ICONIP'11 Proceedings of the 18th international conference on Neural Information Processing - Volume Part II
ICONIP'12 Proceedings of the 19th international conference on Neural Information Processing - Volume Part V
For neural networks, learning from dichotomous random samples is difficult; learning a Bayesian discriminant function is one such case. However, a one-hidden-layer neural network with fewer inner parameters can learn from such signals better than an ordinary one. We show that such neural networks can be used to approximate multi-category Bayesian discriminant functions when the state-conditional probability distributions are two-dimensional normal distributions. Results of a simple simulation are presented as examples.
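The idea in the abstract can be illustrated with a minimal sketch: a one-hidden-layer network trained by gradient descent on dichotomous 0/1 teacher signals drawn from two 2-D normal class-conditional distributions. The network size, learning rate, means, and covariances below are illustrative assumptions, not values from the paper; after training on the 0/1 signals, the network output approximates the posterior probability of class 1, and thresholding it at 0.5 approximates the Bayes decision rule.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two-category problem: each state-conditional density is a 2-D normal.
# Means/covariances are illustrative choices, not taken from the paper.
n = 500
mean0, mean1 = np.array([-1.5, 0.0]), np.array([1.5, 0.0])
X = np.vstack([rng.multivariate_normal(mean0, np.eye(2), n),
               rng.multivariate_normal(mean1, np.eye(2), n)])
# Dichotomous teacher signal: 0 or 1 according to the generating class.
t = np.concatenate([np.zeros(n), np.ones(n)])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# One-hidden-layer network with a small number of hidden units.
h = 4
W1 = rng.normal(0.0, 0.5, (2, h)); b1 = np.zeros(h)
W2 = rng.normal(0.0, 0.5, h);      b2 = 0.0

lr = 1.0
for _ in range(5000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)          # hidden activations, shape (N, h)
    y = sigmoid(H @ W2 + b2)          # network output in (0, 1)
    # Backward pass: gradient of (1/2N) * sum((y - t)^2).
    d_out = (y - t) * y * (1.0 - y) / len(t)
    gW2 = H.T @ d_out
    gb2 = d_out.sum()
    d_hid = np.outer(d_out, W2) * H * (1.0 - H)
    gW1 = X.T @ d_hid
    gb1 = d_hid.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Trained only on 0/1 signals, the output approximates P(class 1 | x);
# thresholding at 0.5 mimics the Bayes decision (here checked on the
# training sample, which suffices for a sketch).
pred = (sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(float)
accuracy = (pred == t).mean()
```

With these well-separated means the Bayes accuracy is about 93%, so a trained network should land well above chance; squared error against 0/1 targets is used here because it is the classic setting under which network outputs converge to posterior probabilities.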