The Latent Maximum Entropy Principle
ACM Transactions on Knowledge Discovery from Data (TKDD)
We present a new statistical learning paradigm for Boltzmann machines based on the latent maximum entropy (LME) principle, an inference principle we have proposed. LME differs from both Jaynes' maximum entropy principle and standard maximum likelihood estimation. We demonstrate the LME principle by deriving new algorithms for Boltzmann machine parameter estimation, and we show how a robust and rapidly convergent new variant of the EM algorithm can be developed. Our experiments show that estimation based on LME generally yields better results than maximum likelihood estimation when inferring models from small amounts of data.
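The abstract does not spell out the LME estimation algorithms themselves, but the maximum-likelihood baseline it compares against is standard: for a Boltzmann machine, the likelihood gradient is the difference between data-driven and model-driven correlations. The sketch below illustrates that baseline on a tiny, fully visible Boltzmann machine, where the partition function can be computed exactly by enumerating all states (all function names and the example data are illustrative, not from the paper).

```python
import itertools

import numpy as np

def _all_states(n):
    # All 2^n binary configurations; feasible only for small n.
    return np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)

def log_likelihood(W, b, data):
    # Mean log-likelihood under energy E(s) = -0.5 s^T W s - b^T s,
    # with the partition function computed by exact enumeration.
    states = _all_states(len(b))
    energies = -0.5 * np.einsum('si,ij,sj->s', states, W, states) - states @ b
    log_z = np.logaddexp.reduce(-energies)
    data_energies = -0.5 * np.einsum('si,ij,sj->s', data, W, data) - data @ b
    return np.mean(-data_energies) - log_z

def ml_gradient(W, b, data):
    # Ascent direction: <s_i s_j>_data - <s_i s_j>_model (and likewise
    # for the biases), the classical Boltzmann machine learning rule.
    states = _all_states(len(b))
    energies = -0.5 * np.einsum('si,ij,sj->s', states, W, states) - states @ b
    p = np.exp(-energies - np.logaddexp.reduce(-energies))
    model_corr = (states * p[:, None]).T @ states
    data_corr = data.T @ data / len(data)
    gW = data_corr - model_corr
    np.fill_diagonal(gW, 0.0)  # no self-connections
    gb = data.mean(axis=0) - p @ states
    return gW, gb
```

A few steps of gradient ascent with these quantities increase the (concave) log-likelihood; the paper's point is that with little data, the LME-based estimates tend to generalize better than the maximum-likelihood fit this rule converges to.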