A new "herding" algorithm is proposed which directly converts observed moments into a sequence of pseudo-samples. The pseudo-samples respect the moment constraints and may be used to estimate (unobserved) quantities of interest. The procedure allows us to sidestep the usual approach of first learning a joint model (which is intractable) and then sampling from that model (which can easily get stuck in a local mode). Moreover, the algorithm is fully deterministic, avoiding random number generation) and does not need expensive operations such as exponentiation.