We introduce a large family of Boltzmann machines that can be trained by standard gradient descent. The networks can have one or more layers of hidden units, with tree-like connectivity. We show how to implement the supervised learning algorithm for these Boltzmann machines exactly, without resort to simulated or mean-field annealing. The stochastic averages that yield the gradients in weight space are computed by the technique of decimation. We present results on the problems of N-bit parity and the detection of hidden symmetries.
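For intuition about the decimation step mentioned above, a standard identity applies to units taking values +/-1 when the unit being summed out has no bias: eliminating a hidden unit that couples to exactly two neighbours with weights J1 and J2 leaves an effective weight J' satisfying tanh(J') = tanh(J1)*tanh(J2). The Python sketch below is only an illustrative example of that identity applied to a simple chain, checked against brute-force enumeration; it is not the paper's implementation, the function names are made up for the example, and the full rule used for training (with biases and supervised updates) is more involved.

```python
# Minimal sketch of the decimation identity for +/-1 units with no bias on
# the unit being summed out (illustrative only, not the paper's code).
import itertools
import math


def decimate(J1, J2):
    """Effective coupling left after summing out a unit joining two neighbours."""
    return math.atanh(math.tanh(J1) * math.tanh(J2))


def chain_correlation(weights):
    """<s_first * s_last> for a chain of +/-1 units, by repeated decimation."""
    J = weights[0]
    for Jk in weights[1:]:
        J = decimate(J, Jk)
    return math.tanh(J)


def brute_force_correlation(weights):
    """Same average by explicit enumeration, to check the decimation rule."""
    n = len(weights) + 1
    num = den = 0.0
    for s in itertools.product((-1, 1), repeat=n):
        w = math.exp(sum(J * s[i] * s[i + 1] for i, J in enumerate(weights)))
        num += s[0] * s[-1] * w
        den += w
    return num / den


if __name__ == "__main__":
    weights = [0.5, -0.3, 1.2, 0.7]
    print(chain_correlation(weights))        # via decimation
    print(brute_force_correlation(weights))  # exact enumeration; should agree
```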