We present a method for mapping a given Bayesian network to a Boltzmann machine architecture, in the sense that the updating process of the resulting Boltzmann machine provably converges to a state that can be mapped back to a maximum a posteriori (MAP) state of the probability distribution represented by the Bayesian network. The Boltzmann machine can be implemented efficiently on massively parallel hardware, since the resulting structure can be divided into two separate clusters in which all the nodes of one cluster can be updated simultaneously. The proposed mapping can therefore equip Bayesian network models with a massively parallel probabilistic reasoning module capable of finding MAP states in a computationally efficient manner. From the neural network point of view, the mapping from a Bayesian network to a Boltzmann machine can be seen as a method for automatically determining the structure and connection weights of a Boltzmann machine by incorporating high-level probabilistic information directly into the neural network architecture, without recourse to a time-consuming and unreliable learning process.
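To illustrate the cluster-parallel updating scheme the abstract describes, the sketch below implements a bipartite Boltzmann machine with simulated annealing: because every edge runs between the two clusters, all units of one cluster are conditionally independent given the other and can be resampled in a single parallel block step. This is only a minimal illustration under assumed, hand-picked weights and an assumed geometric cooling schedule; in the actual method, the structure and the weights `W`, `a`, `b` would be derived from the Bayesian network's conditional probabilities, which is the non-trivial part of the mapping and is not shown here.

```python
import math
import random

def energy(W, a, b, sA, sB):
    """Energy of a bipartite Boltzmann machine with binary units in {0, 1}:
    E = -(a . sA) - (b . sB) - sA^T W sB  (lower energy = more probable)."""
    e = -sum(ai * si for ai, si in zip(a, sA))
    e -= sum(bj * sj for bj, sj in zip(b, sB))
    e -= sum(W[i][j] * sA[i] * sB[j]
             for i in range(len(sA)) for j in range(len(sB)))
    return e

def anneal(W, a, b, sweeps=300, T0=5.0, Tmin=0.05, seed=0):
    """Simulated annealing with block updates: since the graph is bipartite,
    every unit in one cluster can be resampled simultaneously given the other."""
    rng = random.Random(seed)
    nA, nB = len(a), len(b)
    sA = [rng.randint(0, 1) for _ in range(nA)]
    sB = [rng.randint(0, 1) for _ in range(nB)]
    T = T0
    decay = (Tmin / T0) ** (1.0 / sweeps)  # geometric cooling (an assumption)
    for _ in range(sweeps):
        # Resample all of cluster A in parallel, conditioned on cluster B.
        sA = [1 if rng.random() < 1.0 / (1.0 + math.exp(
                  -(a[i] + sum(W[i][j] * sB[j] for j in range(nB))) / T)) else 0
              for i in range(nA)]
        # Then all of cluster B in parallel, conditioned on the new cluster A.
        sB = [1 if rng.random() < 1.0 / (1.0 + math.exp(
                  -(b[j] + sum(W[i][j] * sA[i] for i in range(nA))) / T)) else 0
              for j in range(nB)]
        T *= decay
    return sA, sB
```

As the temperature approaches `Tmin`, the block updates become nearly deterministic and the final state approximates a minimum-energy configuration, i.e. the state that would be mapped back to a MAP state of the original network.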