Simulated annealing and Boltzmann machines: a stochastic approach to combinatorial optimization and neural computing
A practical Bayesian framework for backpropagation networks
Neural Computation
Bayesian Learning for Neural Networks
Neural networks employing Bayesian decision theory generally do not output a single hypothesis but a manifold of probability distributions; the Bayes posterior thus effectively yields a large number of classifiers. Here a novel method based on differential feedback is explored to merge these classifiers. The experimental results confirm an affine transformation among these classifiers. It is also shown that differentially fed Artificial Neural Networks (ANNs) learn in much the same way as Bayesian learning and are hence resistant to overfitting.