A learning machine that is a mixture of several distributions, for example a Gaussian mixture or a mixture of experts, has a wide range of applications. However, such a machine is a non-identifiable statistical model with many singularities in its parameter space, so its generalization performance has remained unknown. Recently, an algebraic geometrical method has been developed that makes it possible to analyze such learning machines mathematically. Based on this method, this paper rigorously proves that a mixture learning machine has a smaller Bayesian stochastic complexity than a regular statistical model. Since the generalization error of a learning machine equals the increase of the stochastic complexity, this result shows that the mixture model attains more precise prediction than regular statistical models when Bayesian estimation is applied in statistical inference.
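As a brief sketch of the quantitative relation stated in the abstract (the symbols F(n), G(n), \lambda, m, and d do not appear on this page and are introduced here only for illustration, following the standard conventions of singular learning theory):

% Illustrative sketch only; notation is not taken from the abstract itself.
% Asymptotic expansion of the expected Bayesian stochastic complexity for n samples,
% with learning coefficient \lambda and multiplicity m:
\[
  \mathbb{E}\bigl[F(n)\bigr] \;=\; \lambda \log n \;-\; (m-1)\log\log n \;+\; O(1).
\]
% The average generalization error equals the increase of the stochastic complexity:
\[
  \mathbb{E}\bigl[G(n)\bigr] \;=\; \mathbb{E}\bigl[F(n+1)\bigr] - \mathbb{E}\bigl[F(n)\bigr] \;\approx\; \frac{\lambda}{n}.
\]

For a regular statistical model with d parameters, \lambda = d/2 and m = 1, whereas for a singular mixture model \lambda is smaller than d/2; this is the sense in which the mixture machine has a smaller stochastic complexity and hence a smaller generalization error under Bayesian estimation.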